“Safer Scrolling”: Social Media's Role In Normalizing Harmful Ideologies
Published in Buzz on May 6, 2024


Social media algorithms normalize misogyny, but deplatforming can help disrupt influencers and foster positive gender norms.

Is Social Media Normalizing Misogyny and Other Harmful Ideologies?

A recent study by University College London and the University of Kent reveals alarming trends in social media's impact on young people, particularly in amplifying misogynistic content. Focusing on TikTok, the study observed a four-fold increase in suggested misogynistic content over just five days, including angry, blame-laden videos targeting women. According to the researchers, this surge reflects a broader trend across social media platforms. The study, titled 'Safer Scrolling', argues that social media algorithms are instrumental in pushing harmful content to teenagers while disguising it as entertainment. This normalization of toxic and hateful ideologies is particularly dangerous for boys struggling with anxiety and other mental health issues.

The research methodology involved interviewing young content consumers and setting up TikTok accounts to monitor content suggestions over a week; these revealed a shift from the accounts' initially selected interests toward increasingly misogynistic themes. Discussions with school leaders further highlighted how such harmful tropes have become normalized in offline interactions. Geoff Barton, of the Association of School and College Leaders, emphasized the snowball effect of algorithms in propagating extreme content, urging vigilance against the impact of toxic masculinity messaging on young minds.

What is the research about? 

The research began with interviews with young people consuming and producing radical online content, which informed archetypes representing vulnerable teenage boys susceptible to radicalization. The study observed a stark shift in content recommendations on TikTok's 'For You' page: initially aligned with users' stated interests, the feed gradually changed until misogynistic content, including objectification and videos blaming women for men's problems, had surged by 56%. Dr. Kaitlyn Regehr from UCL highlighted how algorithms target vulnerabilities such as loneliness, gamifying harmful content as entertainment.

The absence of TikTok in India does not render this study irrelevant: similar patterns of algorithmic amplification of harmful content are observed across platforms like YouTube, Instagram, and Facebook, affecting users globally. Social media has evolved into an echo chamber, where users repeatedly encounter ideas and content that reinforce their existing beliefs and biases. This creates a vacuum-like environment in which individuals are exposed to a narrow range of perspectives, accelerating the amplification of harmful ideologies and online radicalization. The rise of influencers like Andrew Tate, alongside surveys indicating declining support for women's rights, paints a troubling picture. Despite efforts to ban controversial accounts, the slow removal of hateful content worsens the problem, and recommendation algorithms continue to amplify and distribute harmful ideologies. Coupled with adolescents' susceptibility to extremism and excessive screen time, this raises serious questions about the long-term effects of online radicalization and the responsibility of social media platforms in curbing its impact.
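The feedback loop described above, where a small engagement bias compounds into a heavily skewed feed, can be illustrated with a deliberately simplified sketch. Nothing here reflects TikTok's or any platform's actual recommender; the category names and engagement rates are invented for illustration, and the model uses expected values rather than randomness so the dynamic is easy to follow.

```python
CATEGORIES = ["fitness", "gaming", "extreme"]

# Hypothetical engagement rates: the user is only slightly more
# likely to watch "extreme" clips to the end (0.55 vs 0.45).
ENGAGE = {"fitness": 0.45, "gaming": 0.45, "extreme": 0.55}

def simulate(days=5, videos_per_day=100):
    """Toy engagement-weighted recommender (expected values, no randomness)."""
    weights = {c: 1.0 for c in CATEGORIES}  # recommender starts neutral
    extreme_share = []
    for _ in range(days):
        total = sum(weights.values())
        # Each category's share of the day's feed is proportional
        # to its current weight.
        shown = {c: videos_per_day * weights[c] / total for c in CATEGORIES}
        # Watched videos reinforce their category in proportion to
        # the engagement rate -- the feedback loop.
        for c in CATEGORIES:
            weights[c] += shown[c] * ENGAGE[c]
        extreme_share.append(shown["extreme"] / videos_per_day)
    return extreme_share

shares = simulate()
print([round(s, 3) for s in shares])  # the "extreme" share rises every day
```

Even with only a ten-point engagement gap, the share of "extreme" content in the feed grows monotonically day after day: the compounding, not the size of the initial bias, drives the skew. Real recommenders use far richer signals, but the qualitative dynamic the study observed is the same.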

The Impact on the Audience

Misogyny influencers attract audiences through a combination of push, pull, and personal factors. Let’s understand these better:

  1. Push factors arise from societal situations that make their content resonate, such as perceived disadvantages faced by men due to women's advancements. For instance, the perception that women are excelling in education and careers while men are left behind can fuel resentment, making misogynistic content appealing. 
  2. Pull factors encompass tactics used by influencers to enhance appeal, like visually appealing content and social media manipulation. They create emotional responses through extreme messages while fostering a community of like-minded individuals. 
  3. Personal factors determine vulnerability levels among young men. Those feeling pressured by traditional masculinity norms or social isolation may be more susceptible to such content. For example, boys who view masculinity as tied to dominance or sexual prowess might resonate with messages promoting traditional gender roles.

Misogyny influencers offer an idealized version of masculinity as a solution to societal challenges and personal insecurities. They critique progressive gender politics and advocate for traditional gender roles, presenting a celebratory view of masculinity while legitimizing male grievances. However, this portrayal of masculinity is often unrealistic and divisive, appealing more to rebellion than genuine aspiration. While some boys may find elements of this narrative attractive, others hold nuanced views about masculinity and may not fully endorse such extreme ideologies. It's crucial to understand that pushing back against misogyny influencers solely through criticism may reinforce their appeal. Instead, offering credible alternatives and engaging in meaningful dialogue with young men can be more effective in challenging harmful beliefs and fostering positive gender norms. 

What can be done about it?

Deplatforming(1) can effectively curb an individual's reach or remove them from public visibility, though eradicating all of their online content remains challenging. By cutting off revenue streams on social media, however, platforms can significantly disrupt how such individuals sustain themselves and propagate their ideologies. Does deplatforming completely eradicate misogynistic or gender-violent rhetoric? Not entirely. Nevertheless, it does disrupt the influence of some of its most vocal proponents. Although Tate's ban from mainstream platforms does not signify the end of his message's resonance, it underscores the power of public pressure in pushing platforms to take corrective action.

Footnotes:

(1): Deplatforming refers to the action of removing an individual or organization from a platform, typically a social media or online platform, thereby restricting their ability to reach and engage with an audience. This can involve suspending or banning accounts, removing content, or limiting visibility and access to certain features or services.

