Decode

How A Child Rape Survivor Became A Victim Of Disinformation On YouTube

Two years after a 4-year-old was allegedly raped, her family faced another nightmare when a video reached their WhatsApp.

By - Adrija Bose | 23 March 2022 9:19 AM IST

One morning in September 2021, Neelam* woke up to the constant buzzing of her husband's phone. In several office WhatsApp groups that her husband belonged to, a video was being shared: it showed the man accused of raping their 6-year-old daughter.

In the 9-minute-long video, the accused sits with his family, sharing 'their side' of the story and giving out details of the young girl: her name and where she lives.

The video had reached nearly 4,000 members in the seven WhatsApp groups of a government organisation, where Neelam's husband, the father of the 6-year-old, worked.

Neelam was terrified. "They gave out so many details, everyone knew they were talking about our daughter. They wanted to defame my family, and they managed to do it," she said.

Looking through the other YouTube videos posted on the same channel, Neelam found that a Kajal Jadon from Gwalior in Madhya Pradesh was the reason her world had come crashing down, once again.

The easy access to technology platforms that can reach thousands instantly, with no monitoring in place before a video or an image gets uploaded, reveals the fractured nature of the Internet. Time and again, it has been proved that the Internet can be a vulnerable space for children. Neelam's daughter's story shows how the Internet, in this case YouTube, can be used against child sexual assault victims, sometimes even without the use of graphic imagery.

Two years before this incident, Neelam had filed a complaint after her daughter was allegedly raped. While the case was still ongoing, the accused's family reached out to Jadon, a self-proclaimed "men's rights activist" who regularly records and posts videos on various platforms to "fight for men's rights".

"This wasn't the first time she was talking about my daughter, she had done a Facebook live video a year before. But when it reached our own WhatsApp groups is when we found out about it," Neelam said.

She said everyone in her husband's office found out their daughter had allegedly been raped through that video. She was sure they would believe that their family filed a case for "money", like the video claimed.

"They were saying that we used our daughter for money and filed a false rape case. Which mother would do that?" Neelam asked, adding that their lives following the incident had turned 'upside down'. "No amount of money can make up for what we have gone through," she said.

Neelam immediately filed a police complaint against Jadon, the self-styled crusader for men's rights.

The bigger worry for Neelam was that the video was still available across all social media platforms. She was desperate for it to be taken down. So, she reached out to Mumbai-based Aarambh India, an organisation that describes itself as "India's first online resource centre" on Child Sexual Abuse.

Established in 2014, the Aarambh India Initiative, run by the RATI Foundation, helps people become aware of and report cases of child sexual exploitation. In May 2016, in partnership with the Internet Watch Foundation, it launched a "hotline" for child sexual abuse images and videos on the Internet.

The "hotline", a reporting button, enables citizens to report child sexual abuse images and videos, anonymously. If the content is found to be illegal, Aarambh India works to take it down.

The video that had brought back nightmares for Neelam's family was on YouTube, distributed over WhatsApp. A year ago, Kajal Jadon did a Facebook live, discussing the details of Neelam's daughter's case. That video was uploaded on YouTube too.

"The Facebook Live was watched by 9,000 people. The accused identified the location of the survivor. It was defamatory in nature," said Uma Subramanian, Co-founder-director of Aarambh India.

They filled out a YouTube form to register a complaint and request that the videos be removed. A week later, YouTube sent them an email saying the two videos had been blocked from view on the country domain.

"The videos were taken down so quickly because the mother of the survivor had already filed a police complaint against the woman who had made the video," said Siddharth P, co-founder and director of Aarambh India. The case was registered under the POCSO (Protection of Children from Sexual Offences) Act because the video contained details that identified the child, he explained.

While the videos can still be viewed from countries outside India, they won't turn up in search, which, Siddharth said, was what the parents were most keen on.





The Men's Rights Activists


The group that had recorded and uploaded the two videos 'defaming' Neelam's daughter is called Jwala Shakti Sangathan.

Kajal Jadon, the founder of the group, has over 12,000 followers on Facebook and 2,500 subscribers on YouTube, where she regularly posts videos to "reveal the truth" about "women who trap men". Some of her videos are interviews with men accused of assault and harassment, in which they reveal details of ongoing cases and, at times, the identities of the alleged survivors. Others are what popular discourse has come to call "sting operations": videos shot without the knowledge of the people being filmed.

Also Read | Hustle And Hatred: The 'Influencer' Life Of An 11-Year-Old Indian Girl

But Jadon's biggest following is on WhatsApp. It is through these multiple groups that she gets 'invited' to reveal the accused's side of the story through her videos.

The organisation, Jadon said, has at least 10 lakh volunteers. "Men are not always wrong, they are wrongly accused," she told DECODE, explaining why she started the group four years ago.

As word spread about Jadon's work on WhatsApp groups through the families she had engaged with, she found mention on television shows as well. Recently, a TV news channel picked up one of her 'sting operations' purporting to prove a 'false rape allegation' and ran the video as a news story.

Talking about Neelam's daughter's case, Jadon said that the accused's family reached out to her and asked her to make a video that 'reveals' their innocence. The accused, Neelam said, has not been arrested so far; the case is still pending before the courts.

Elaborating on how she brings 'justice' to men, Jadon told DECODE that when a 'client' reaches out to her or any of the group's volunteers, the organisation first asks them to provide 'evidence' that proves their innocence. "If there is no evidence as such, I talk to them and to the victim's family to determine the truth," she said. Jadon then posts two videos: the first details the case; the second usually features an interview with the accused, where he gets to present his defence.

Jadon's YouTube channel has a range of videos. In one, she says cases of domestic violence and child sexual abuse are often based on 'lies'. In another, she proposes a new project for the government: Beta Bachao Andolan.

In a video uploaded in July 2020, she tells her 'live' audience that a person in Prime Minister Narendra Modi's Z-security forces has reached out to her alleging that he has been falsely accused in a POCSO case. "I will fight hard to fight the case. If people like Justice Gogoi and the PM's security are not being left out of false cases, who are we? We are ordinary, so we have to fight," she says in the video.

However, talking to DECODE months after she posted the video, she said that the security person did a razinama, a 'financial compromise', and that's why she didn't take up the case. "We are now dealing with a case of a person who is in the security forces of Uttar Pradesh Chief Minister Yogi Adityanath," she claimed.

Many of these cases that Jadon talks about in her videos are related to minors and are registered under the POCSO Act.

Fighting Online Child Sexual Abuse

Aarambh India's hotline to report child sexual abuse content receives tips on at least 100 URLs every month, which the team reviews before reaching out to the platforms hosting the content to get it blocked or deleted.

Once a link is reported, it is reviewed by the Internet Watch Foundation's expert team of analysts based in the UK. If the reported content is found to be criminal, the analysts determine where it was uploaded from and where it is being hosted. They then get in touch with the hosting company and the relevant law enforcement agencies to initiate the process of taking it off the Internet.

Neelam's is a unique case but not an isolated one.

Most cases that Aarambh India deals with involve child sexual abuse material. In this case, it was about protecting the child's identity. However, Siddharth says that "character assassination" is not uncommon in child sexual abuse cases. "Resistance from alleged accused is something that we witness and manage in most cases," he said.

The problem of child sexual abuse material on the Internet is systemic: it isn't confined to the dark web but hides in plain sight among content hosted and controlled by popular tech platforms.

Also Read | Indian Matchmaking Sites Are Full Of Fake Profiles Duping People Of Lakhs

The National Centre for Missing & Exploited Children (NCMEC) in the USA released figures for the reports of online child sexual abuse material (CSAM) it received in 2019: a sobering 1,987,430 reports related to content from India, the highest for any country.

In 2021, the NCMEC's CyberTipline received 29.3 million reports of suspected CSAM, an increase of 35% from 2020.

It didn't take very long for Neelam to get the YouTube videos about her daughter taken down, but she is disappointed that the account that made them hasn't been suspended. "They continue to defame many other survivors through their accounts," she said.

Siddharth said that it is easier to take down content when it is hosted by a larger platform. "If the content is hosted on a lesser known platform, we first have to decipher what the country host is. Sometimes, they are registered in Caramon Island in Philippines or some remote location that we don't have access to," he said, explaining the challenges of monitoring child sexual abuse content.

The NCMEC's 2021 CyberTipline report shows that of the 29.3 million reports of online child exploitation it received, 29.1 million came from Electronic Service Providers, which report instances of apparent CSAM that they become aware of on their systems. The breakdown shows that 22,118,952 reports concerned content on Facebook, 3,393,654 on Instagram, and 1,372,696 on WhatsApp.

While this particular report doesn't mention YouTube, the platform has been criticised multiple times for allowing content related to child sexual abuse. After a 2019 investigation by Wired and video blogger Matt Watson alleged that paedophiles were using YouTube's comments section to leave predatory messages and share links to CSAM, the video platform disabled comments on videos featuring young children. But that clearly hasn't put an end to the problem.

In the first quarter of 2022 alone, YouTube removed 1,182,403 videos for violating its child safety policies.

"No form of content that endangers minors is acceptable to us. We have explicit policies that prohibit this, and we take an extra cautious approach towards enforcement," a spokesperson for YouTube told DECODE.

"Additionally, we disabled comments and limited recommendations on hundreds of millions of videos containing minors in risky situations and restricted live features to disallow younger minors from live streaming without an adult. We've invested heavily in this area, and will continue to work with experts inside and outside of YouTube to provide minors and families the best protections possible," the spokesperson added.

Over the last few years, YouTube has tried a number of enforcement tactics. The platform, the second-largest search engine after Google, said it has made regular improvements to its machine learning classifiers to identify content that harms minors, and has introduced mechanisms to control commenting on videos and livestreams.

Among other features, YouTube's flagging tool lets anyone submit a video for careful review by its staff, and a separate process allows users to request the removal of content that shares their personal information without consent.

"We remove violative playlists when we're made aware of them and continue our work to identify where we can leverage our technologies to enforce this policy at scale," the spokesperson at YouTube further added.

What this means is that while big platforms are potentially hosting CSAM, much of the burden of reporting this content now falls on the hotlines that have emerged across the world.

In an NCMEC report in which 52 hotlines located in 48 countries participated, the administrators agreed that while some industry leaders proactively take steps to eliminate CSAM on their servers, other platforms are not aware of how their systems can be exploited or how they can take proactive steps to combat this exploitation.

Also Read | Are Indian Kids Really Learning 'Coding' Like Ed-Tech Companies Claim?

Most of the hotlines are heavily reliant on manual operations, in effect possibly tracking and reporting only a small fraction of the CSAM available on the Internet.

The participants also said that the international response to child sexual abuse online is fractured. "Their (hotlines) missions and global contributions vary based on each country's policies, and international collaboration is frequently hindered by these differences. As a result, there can be duplication of effort and slower investigations," the report noted.

While taking something off the Internet sounds like a challenging task, it is possible. But the impact one image or video can have on people's minds can be permanent. That is the fear Neelam carries even after the videos about her child were taken down.

"The videos were taken down from YouTube but so many people had already watched. People who knew us, our neighbours and office colleagues, everyone had seen it by then," she said.

Neelam is now fighting two cases: one over the alleged rape of her daughter and the other against the group that made the video. She said that all social media accounts of the woman who shot the videos should be deleted. "She destroyed our lives."

Jadon, however, disagrees. "I have been the support that men needed," she said.


*The name has been changed to protect her identity. 


Note: If you stumble across sexually explicit images or videos featuring children on the Internet, you can report it to Aarambh India by filling out an online form here. If you'd like to report something other than an online image or video of child sexual abuse, you can call the government child helpline at 1098. 
