
AI Face-Swap Apps Used To Create Videos Of Kolkata Doctor

The finding highlights a disturbing phenomenon in which photos of a sexual assault and murder victim in India have been edited using generative AI.

By Karen Rebelo | 30 Aug 2024 1:02 PM IST

Photos of a trainee doctor, who was raped and murdered at a Kolkata hospital earlier in August, have been edited using an artificial intelligence (AI) face-swap app to make viral YouTube Shorts and Facebook reels, an investigation by Decode found.

Several videos swapped the doctor’s face onto template clips of Indian women modelling, using the AI photo-editing app Photo Lab, which lets users swap faces in default templates with their own photos. It’s unclear who created the videos.

The finding highlights a disturbing phenomenon in which photos of a sexual assault and murder victim in India have been edited using generative AI.

“It’s extremely problematic. Because it’s one thing to use the image of the deceased victim for a larger discourse around ‘justice for (victim’s name)’,” Uma Subramanian, Co-Founder and Director of Rati Foundation, told Decode.

The foundation works on the ground and online with children and women affected by sexual violence.

“But we have come to the point where people are taking her image and creating videos that are absolutely not connected even remotely to anything related to justice or any meaningful discourse. People are doing it just because they can do so without any implications. It’s a gross violation of the privacy and dignity of the victim,” Subramanian added.

Decode reached out to the Indian Institute of Technology Jodhpur, whose in-house deepfake detection tool, Itisaar, confirmed that the clips were altered and fake.

These AI videos were also generating viral engagement for users who uploaded them.

A Facebook reel featuring one of the videos was played 2.2 million times and received 55,000 likes.

One YouTube channel based out of West Bengal, which regularly posts AI-generated videos but barely gets any views, had uploaded five Photo Lab videos using the victim’s photo.

One of the videos had 745,000 views while another had 206,000 views and counting at the time of publishing.

Rohini Lakshané, an independent researcher and technologist, told Decode that the primary motivation is often money. "This incident is in the news cycle. Channel owners know people are searching for these videos and want to ride the news wave," she explained.

Decode also found other images of the deceased Kolkata doctor that have been edited using AI photo editing apps.

The original photos were stolen by social media users from the victim’s Instagram account, which was still online at the time of writing. Instagram users have peddled conspiracy theories about her murder in the comment sections of many of the posts.

Photo Lab: From Airbrushing Photos To Airbrushing Reality

Decode was able to trace the videos back to Photo Lab after finding the same templates of short motion clips used by other users of the app. The clips, barely four seconds long, show a young girl in a black saree twirling, a girl walking down a street, a girl in a blue salwar climbing stairs, and so on. Their soft lighting and vibrant colours give them the look of a high-quality production, even if at times the face looks unreal.

The AI photo-editing app, built by San Francisco-based Linerock Investment Ltd, is popular among Indian internet users, who upload their images and edit them with the various filters the app offers. A number of Indian women, across age groups, have uploaded reels to Instagram created with Photo Lab. Similarly, tutorial videos in Indian languages on YouTube teach others how to use the app.


[Screenshot: YouTube search results for Photo Lab]


The AI Intersection

The AI-edited images and videos are not the only synthetic content shared by social media users in connection with the Kolkata rape and murder case.

Some of the videos, both non-AI and AI, are also being produced by social media users based in Pakistan and Bangladesh.

Decode also found a few AI-generated songs on YouTube seeking justice for the deceased doctor.

One such synthetic song is an angry Bengali rap likening the doctor to the goddess Durga, made by a YouTube channel operated out of Bangladesh. The thumbnail of the video shows an image of the doctor made with AI.

“I recently saw the news from Kolkata. I, from Bangladesh, want exemplary justice for this incident. This song is just a protest,” Najmus Sakib, Managing Director of Behind 5, told Decode.

“Leonardo AI was used for the song's thumbnail and Suno AI for the song itself. All content on this channel is created using AI, and the upload time is also determined by AI,” Sakib added.

The most common use of generative AI in the discourse about the Kolkata case has been AI-generated images used as thumbnails for YouTube videos by content creators.

Most of the AI depictions have been problematic, showing a female doctor in a torn or blood-stained white lab coat and sporting bruises.

The images almost seem like regurgitated AI versions of the troubling digital illustrations and representational images used by mainstream news outlets to report on sexual assault cases in India.

YouTubers have also wasted no time in creating videos made entirely of AI-generated images, with AI voiceovers narrating details of the incident, giving an AI spin to the true-crime genre.

Anonymity Of Survivors Elusive In Social Media Era

Since the crime at RG Kar Medical College on August 9, 2024, calls for justice have filled both online and offline spaces. However, the victim’s identity has been widely revealed, despite Indian laws prohibiting the disclosure of rape victims' identities to protect them and their families from stigma.

Disturbing photos of the victim’s body found at the crime scene, photos of her funeral, videos of her parents speaking to reporters, hashtags seeking justice using her name and images stolen from her Instagram account have been plastered all over social media.

The Calcutta High Court and India’s Supreme Court, which took suo motu cognizance of the case, had to intervene and have asked social media platforms to remove content revealing her identity.

Yet social media platforms such as X, YouTube, Instagram and Facebook have been slow to moderate and take down such content.

ALSO READ: Kolkata Doctor Rape-Murder: Viral Social Media Posts Defy Court Orders

“There is a larger reason as to why revealing the identity of the survivor or the family or the location is limited by the law and should be the norm,” Rati Foundation’s Uma Subramanian said.

However, Subramanian stressed the need for a nuanced discussion about the agency of adult survivors and, if deceased, their families.

Calls For Justice Veer Into Voyeurism

Decode had earlier reported how online searches combining the victim’s name with the keyword ‘porn’ or the search term ‘rape video’ had surged in the week after the crime.

Social media influencers and make-up artists also created reels in which they pretended to be the deceased doctor and ‘re-enacted’ the crime.

Several X users had complained about an Instagram account that had stolen and edited photos of the doctor. The bio of the fake account, created in the doctor’s name, used ‘rape victim’ in its description. The account has since been taken down.

Researcher Rohini Lakshané said she came across several posts on X sharing disturbing photos of the Kolkata doctor’s body at the crime scene, despite the Elon Musk-owned platform having a policy on deceased individuals.

Lakshané said a quest for engagement, credibility and validation within social circles could be other incentives for posting such content when money is not the main objective.

“It will also get them views and followers and a certain kind of social validation and credibility, which is another thing that I have seen with privacy violating content that’s been circulated non consensually which includes everything from non-consensual images, rape videos etc. where people who don’t seem to have any other motivation like revenge or money are also sharing it because it gets them reputation within that community,” Lakshané said.

“While you and I might find it cringe, there is a certain ‘brotherhood’ where it will get them social validation and credibility,” Lakshané added.

AI Has Weaponised Gendered Online Abuse

The use of artificial intelligence to edit the image of a sexual abuse victim is an ethical minefield.

The rise of ‘nudify’ apps, which enable anyone to synthetically ‘strip’ a person’s photo, has shown how generative AI is weaponised for gendered online abuse, mainly targeting women.

Fears that generative AI might be misused to create synthetic non-consensual imagery are not unfounded, especially in cases where sexual assault survivors might not receive the same outpouring of sympathy.

ALSO READ: X Is Full Of Deepfake Porn Videos Of Actresses; You May Be Targeted Next

“Of course it is going to be retraumatising and devastating for the victim, and in case the victim has passed away, then for their near ones. Once the genie is out of the bottle, once the video is viral on the Internet, it’s almost impossible for the ordinary victim-survivor to permanently get it removed from everywhere on the Internet,” Lakshané said.

“So now you don't just have to deal with the original content you also have to deal with clickbaits where it is not them, where it is somebody else or you deal with fake depictions using AI and so on. So for the victim it’s of course going to be retraumatising. Many of these victims consider suicide,” she added.

Rati Foundation’s Uma Subramanian said more proactive measures and a zero-tolerance approach to sexual abuse were needed.

“I feel the Kolkata rape is the worst form of abuse and it has happened yet again. But there are multiple instances of abuse online, AI generated and otherwise that come to light on an everyday basis,” Subramanian said.

“The wrestlers’ protest is an example that happened on-ground one year ago. The women wrestlers were also demanding safety isn't it? But we did not really push for the kind of changes that we are asking for today, did we?” she asked.

ALSO READ: Photo Of Wrestler Vinesh Phogat Smiling In Police Van Is Morphed

“The larger question is are we as a country going to pay attention to this only if a gruesome case like this happens? Or are we going to take proactive measures and say there has to be zero tolerance when it comes to women not feeling safe online or on-ground,” she added.

“We are not listening to survivors or victims on a regular basis,” she said. “Real change will come when we commit to doing so.”

Decode reached out to YouTube and Linerock Investment Ltd for comment. This article will be updated if they respond.


With inputs from Nivedita Niranjankumar
