
Social media algorithms pushing, normalising extreme misogynistic content to young people, finds study



A joint study conducted by teams from University College London and the University of Kent has found that the algorithms driving social media platforms and their discovery feeds have a propensity to push and normalise highly misogynistic content.

A recent report by The Guardian sheds light on the alarming trend of social media algorithms rapidly propagating extreme misogynistic content, which is infiltrating school environments and normalising harmful attitudes towards women.

The report is based on a study conducted by teams from University College London and the University of Kent, which reveals a concerning surge in misogynistic content suggested by TikTok’s algorithm over a five-day monitoring period.

This content, often centred on anger and blame directed at women, saw a four-fold increase, indicating a worrying trend in the platform’s recommendation system.


While the focus of this research was TikTok, experts suggest that similar patterns likely extend to other social media platforms. They advocate for a nuanced approach to addressing the issue, promoting a “healthy digital diet” over blanket bans on phones or social media platforms, which are deemed ineffective.

The report comes amidst growing concerns over the impact of social media on young people, with recent studies indicating a generational divide in attitudes towards feminism. Additionally, calls for stricter regulations on smartphone use among minors have gained momentum following tragic incidents linked to online activities.

According to the findings, social media algorithms play a pivotal role in presenting harmful content as entertainment, thereby influencing young users’ perceptions and behaviours. The researchers stress that toxic ideologies, once confined to online spaces, are now permeating school environments and mainstream youth cultures.

Geoff Barton, from the Association of School and College Leaders, highlights the insidious nature of algorithmic processes, urging social media platforms to reassess their algorithms and strengthen safeguards to counter the proliferation of harmful content.

Andy Burrows, adviser to the Molly Rose Foundation, echoes these sentiments, emphasising the urgent need for regulatory intervention to curb the dissemination of harmful content targeting vulnerable teens.

Responding to these concerns, Prime Minister Rishi Sunak reaffirmed the government’s commitment to online safety, citing the recently passed Online Safety Act as a crucial step in holding social media companies accountable for protecting children from harmful content.

In light of the report’s findings, TikTok asserts its commitment to combating misogyny on its platform, noting its proactive content moderation measures. However, critics argue that the methodology employed in the report fails to capture the real impact of harmful content on users.

With the Online Safety Act set to empower regulators to tackle online harms, including misogyny, authorities emphasise the importance of addressing content that disproportionately affects women and girls, signalling a concerted effort to protect users’ safety and rights in the digital realm.

