Instagram Is The Most Important Platform For Pedophile Networks: Report

By Hera Rizwan

According to a report by Stanford University and The Wall Street Journal, published on Tuesday, Instagram is the main platform used by pedophile networks to promote and sell content showing child sexual abuse. The report, titled 'Cross-Platform Dynamics of Self-Generated CSAM', was published by the Stanford University Cyber Policy Centre.

The study notes that adult-generated Child Sexual Abuse Material (CSAM) does not account for all instances of online child sexual exploitation. It is Self-Generated Child Sexual Abuse Material (SG-CSAM), where an image or video appears to have been created by a minor, that often flies under the radar. Using specific hashtags and keywords commonly used in the community, the Policy Centre assessed the scope and scale of the practice, while examining how platforms are succeeding or failing at detecting and suppressing SG-CSAM.

The study analysed the scale of CSAM across online communication and social media platforms including Instagram, Twitter, TikTok, Snapchat, Telegram and Discord. Amongst these, Instagram has "a particularly severe problem with commercial SG-CSAM accounts, and many known CSAM keywords return results," the report read.

What are the key findings of the study?

- The study identified 405 accounts advertising the sale of self-generated CSAM on Instagram, and 128 such accounts on Twitter. Within the Instagram follower network, 58 accounts appeared to be probable content buyers who used their real names, many of whom were matched to Facebook, LinkedIn or TikTok profiles.

- A month after the identification of these accounts, only 31 of the Instagram seller accounts and 22 of the Twitter ones remained active. However, "in the intervening time, hundreds of new SG-CSAM accounts were created, recreated or activated on both platforms".

- The study notes that while some seller accounts may be impersonators redistributing content, scammers, or even victims of child exploitation, most underage sellers appear to be creating and marketing the content of their own accord.

- The monetary transactions take place through CashApp, PayPal, or through gift cards to companies and services such as Amazon, PlayStation Network or DoorDash.

- The majority of sellers mention their age in their profile bios, either explicitly or indirectly through symbols like emoji or simple equations. Most self-identified as being between the ages of 13 and 17.

- While sellers market their content on Instagram and Twitter, the actual content delivery appears to happen on file sharing services such as Dropbox or Mega, after negotiations over DM. "The DM conversations are redacted, screen captured, and subsequently posted to the main account profile as Stories to bolster the authenticity of the seller," the study said.

- The kinds of videos up for sale include self-harm videos with and without explicit nudity, advertisements for paid in-person sexual acts (some of which are then recorded and sold to other customers), and imagery of minors performing sexual acts with animals.

How has Instagram become the platform for pedophiles?

The Stanford report sheds light on how the Meta-owned photo and video sharing platform allows pedophilic content to circulate without being detected or suppressed. According to the report, "Instagram is currently the most important platform for these networks with features like recommendation algorithms and direct messaging that help connect buyers and sellers."

The report notes that while many CSAM keywords returned results, searches for some terms returned a warning that could be bypassed by clicking "see results anyway". After such a search, Instagram's user suggestion system also readily promotes other SG-CSAM accounts, thereby allowing account discovery without keyword searches.

Estimating that between 500 and 1,000 such Instagram accounts exist at any given time, the report found that they "often have one or no actual posts, but will frequently post stories with content menus, promotions or cross-site links".

Instagram falls within the ambit of Meta's policy rules prohibiting the sexualisation of children, the advertising of CSAM, sexual conversations with minors, and obtaining sexual material from minors, yet it clearly lags in enforcing these comprehensive rules. "Instagram's role as the key platform in our investigation is likely not due to a lack of policies, but ineffective enforcement," the report read.

© BOOM Live