TikTok and YouTube Shorts push misogynistic videos to young male viewers, study finds


It takes an average of 23 to 26 minutes of watching videos for TikTok and YouTube Shorts to start recommending toxic or misogynistic content to the accounts of young men, according to a new study.

The study from Dublin City University tracked the content recommended to 10 “sockpuppet” TikTok and YouTube Shorts accounts created by the researchers on new smartphones.

The accounts were set up to reflect the search interests of 16- and 18-year-old boys: some engaged with regular content, such as sports or video games, while others replicated users who deliberately seek out misogynistic content online.

The research found that it took roughly 23 minutes of video watching on both TikTok and YouTube Shorts for the algorithms to start recommending “toxic” content and 26 minutes to recommend “manfluencer” (male influencer) content across the different accounts.

Such recommendations could appear after as little as two minutes of viewing on YouTube Shorts and 10 minutes on TikTok for accounts that showed some interest in manfluencers – videos that are widely considered to promote alpha male and anti-feminist ideas.

“The findings of this report point to urgent and concerning issues for parents, teachers, policymakers, and society as a whole,” the report reads.

‘Monetisation of male insecurity’

The researchers watched nearly 29 hours of video across the 10 accounts to analyse the content of the recommended videos.

The vast majority of content being suggested after two to three hours of viewing, or roughly 400 videos, was problematic or toxic, according to the researchers.

Once an account watched a single recommended manfluencer video, similar content became far more likely to be recommended again.

The report identified three major themes in these manfluencer videos: crisis narratives claiming that masculinity and the “nuclear family” are under threat; motivational videos telling men that feeling emotions or depression is emasculating; and debunked gender science videos presenting concepts from evolutionary psychology that claim men and women are “hardwired” for different gender roles.

The study also suggests a link between manfluencer videos and right-wing conspiracy content: 13 per cent of all content recommended to these accounts on TikTok, and five per cent on YouTube, included such material.

“This monetisation of male insecurity not only serves to mainstream anti-feminist and anti-LGBTQ ideology, but may also function as a gateway to fringe Far-Right and other extreme worldviews,” the report reads.

One limitation of the study, the report notes, is the lack of transparency from social media companies about how their algorithms work.

That means the researchers lack critical information about how the platforms craft personalised content suggestions based on a user’s previous viewing history.

Study not ‘reflective’ of TikTok user experience

YouTube Shorts pushed the highest share of "toxic content", at about 61 per cent of its recommendations compared with 34 per cent of TikTok’s.

The platform also fed manfluencer-curious accounts more toxic content than accounts that searched for "generic" topics.

YouTube did not respond to a request for comment.

TikTok said in an emailed statement that the Dublin City University report does not reflect how its user base experiences videos on the platform. The statement also noted that the study’s sample size is extremely limited, both in the number of accounts used and in the amount of video watched.

The study, however, found that toxic content made up 34.7 per cent of what its test accounts saw on their TikTok feeds.

TikTok also says it does not allow hate speech or hateful discourse, such as misogyny and transphobia, on its platform and removes content that violates its community guidelines.

“If we become aware that any such actor may be on our platform, we will conduct a thorough review – including off-platform behaviour – which may result in an account ban,” TikTok’s community guidelines webpage reads.

The company says it does something similar with recommended videos in the “For You” feed, where the recommendation system substitutes less similar content into a person’s feed if it finds two videos are too alike.

Users can also curate what they see by indicating that a type of video does not interest them, by refreshing their feed, or by filtering out certain keywords.

TikTok currently faces two investigations under the EU's Digital Services Act, one dealing with “the protection of minors” and the other with the “addictive design” of the platform's new TikTok Lite app, which recently launched in France and Spain.

Last November, the European Commission asked YouTube to provide information on how it protects minors online under the new Digital Services Act.

Account-based content moderation needed

The report highlights some key recommendations for schools, parents, and social media companies.

It suggests that social media companies not only moderate content video by video but also regulate what accounts are able to post.

Ireland has a new media regulator, Coimisiún na Meán, which the report suggests social media companies work with to "highlight illegal, harmful and borderline content".

For schools, the report suggests prioritising the promotion of positive male role models in the classroom to “promote an educative rather than punitive response to boys’ behaviours”. It also suggests schools focus on teaching critical digital literacy skills.

Parents could also open discussions with their teenage sons to understand why they idealise certain influencers, and encourage them to “engage with relatable resources”.

TikTok said in its statement that it offers parental resources to help with those conversations.

© Euronews