Explainers

Instagram Is The Most Important Platform For Pedophile Networks: Report

A recent report from Stanford University reveals Instagram's alarming role in promoting and selling child sexual abuse content.

By - Hera Rizwan | 9 Jun 2023 4:26 PM IST

According to a report by Stanford University and The Wall Street Journal, published on Tuesday, Instagram is the main platform used by pedophile networks to promote and sell content showing child sexual abuse. The report, titled 'Cross-Platform Dynamics of Self-Generated CSAM', was published by the Stanford University Cyber Policy Centre.

The study notes that adult-generated Child Sexual Abuse Material (CSAM) does not encompass all instances of online child sexual exploitation. It is Self-Generated Child Sexual Abuse Material (SG-CSAM), where an image or video appears to have been created by the minor themselves, that often flies under the radar. Using specific hashtags and keywords commonly used in the community, the Policy Centre assessed the scope and scale of the practice, while examining how platforms are succeeding or failing at detecting and suppressing SG-CSAM.

The study analysed the scale of CSAM available on online communication and social media platforms such as Instagram, Twitter, TikTok, Snapchat, Telegram and Discord. Among these, Instagram has "a particularly severe problem with commercial SG-CSAM accounts, and many known CSAM keywords return results," the report read.

What are the key findings of the study?

- The study identified 405 accounts advertising the sale of self-generated CSAM on Instagram, and 128 such accounts on Twitter. 58 accounts within the Instagram follower network appeared to be probable content buyers who used their real names, many of whom were matched to Facebook, LinkedIn or TikTok profiles.

- A month after the identification of these accounts, only 31 of the Instagram seller accounts and 22 of the Twitter ones remained active. However, "in the intervening time, hundreds of new SG-CSAM accounts were created, recreated or activated on both platforms".

- The study notes that while it is possible that some seller accounts are impersonators redistributing content, scammers, or even victims of child exploitation, most underage sellers appear to be creating and marketing the content of their own accord.

- The monetary transactions take place through CashApp, PayPal, or gift cards for companies and services such as Amazon, PlayStation Network or DoorDash.

- The majority of sellers mention their age in their profile bios, either explicitly or indirectly, through symbols such as emoji or simple equations. Most self-identified as being between 13 and 17 years old.

- While sellers market their content on Instagram and Twitter, the actual content delivery appears to happen on file sharing services such as Dropbox or Mega, after negotiations over DM. "The DM conversations are redacted, screen captured, and subsequently posted to the main account profile as Stories to bolster the authenticity of the seller," the study said.

- The kinds of videos up for sale include self-harm videos with and without explicit nudity, advertisements for paid in-person sexual acts (some of which are then recorded and sold to other customers), and imagery of the minor performing sexual acts with animals.

How has Instagram become the platform for pedophiles?

The Stanford report sheds light on how the Meta-owned photo and video sharing platform allows pedophilic content to circulate without being detected or suppressed. According to the report, "Instagram is currently the most important platform for these networks with features like recommendation algorithms and direct messaging that help connect buyers and sellers."

The report notes that while many CSAM keywords returned results, searches for some terms returned a warning text that could be bypassed by clicking "see results anyway". After such a search, Instagram's user suggestion recommendation system also readily promotes other SG-CSAM accounts, thereby allowing for account discovery without keyword searches.

Referring to such Instagram accounts, which number between 500 and 1,000 at any given time, the report found that they "often have one or no actual posts, but will frequently post stories with content menus, promotions or cross-site links".

Instagram, which falls within the ambit of Meta's policy rules prohibiting the sexualisation of children, the advertising of CSAM, sexual conversations with minors and obtaining sexual material from minors, clearly lags in enforcing these comprehensive rules. "Instagram's role as the key platform in our investigation is likely not due to a lack of policies, but ineffective enforcement," the report read.






