BOOM’s analysis of all published fact-checks showed that AI-related misinformation peaked in 2024, marking a significant rise compared to previous years. From fabricated images and videos of celebrities promoting health products and financial services, to voice clones being shared as real voices of politicians, AI-driven misinformation was used to create false narratives throughout the year.
Between January 2 and December 31, 2024, BOOM published 1,291 fact-checks in English, Hindi and Bangla (this figure excludes Hindi and Bangla translations of English stories). Of these, 108 fact-checks, or 8.4%, involved AI-generated misinformation. This is an increase of more than 11% over 2023, which recorded 97 such fact-checks.
Overall, of the 1,291 fact-checks, the 2024 Lok Sabha elections were the main topic of mis/disinformation, followed by Islamophobia and the unrest in Bangladesh that erupted in July 2024, among others.
The Muslim community in India was the top target of mis/disinformation for the fourth consecutive year. Of the total fact-checks, 13.3% or 172 fact-checks contained false or misleading claims targeting the community, a marginal decrease from the 183 such fact-checks in 2023.
BOOM found that of the total published fact-checks, 40.7% or 526 fact-checks involved old or outdated claims recirculated as recent.
Over 45%, or 584 fact-checks, involved verified social media accounts spreading mis/disinformation. BOOM also compiled a list of verified accounts spewing false information. Among these, the X account of Jitendra Pratap Singh repeatedly peddled mis/disinformation throughout the year; BOOM debunked 17 false claims shared by Singh. Official national and state X handles of, and those associated with, the Bharatiya Janata Party (BJP) came next: the ruling party and its functionaries shared false claims featured in 15 fact-checks.
Other X accounts that spread mis/disinformation included those of BJP leader Amit Malviya (13), We the People (13), the official account of the Indian National Congress (11), Megh Updates (10), Baba Banaras (9), Amitabh Chaudhary (7), Rishi Bagree (7) and Kreately (6), among others.
AI-driven misinformation: a growing menace
Of the total fact-checks we published, 108 were AI-related. Of these, 39 involved AI-fabricated images, 41 voice clones, and 28 deepfake videos and audio clips.
In 2024, AI-driven fake investment scams, often using voice clones and deepfake videos, topped the list with 14 fact-checks. This was followed by misinformation related to the Lok Sabha elections and fake health scams, each with 12 fact-checks.
Fake Investment Scams
Voice clones and deepfakes of politicians such as Finance Minister Nirmala Sitharaman and former Prime Minister Manmohan Singh, and of businesspeople such as Reliance’s Mukesh Ambani, Adani Group’s Gautam Adani, Infosys’ Narayana Murthy and Google CEO Sundar Pichai, were spread throughout 2024.
For instance, a deepfake video of Nirmala Sitharaman announcing a partnership between the Indian government and a cryptocurrency trading platform named Quantum Trade was shared on Facebook. BOOM found that the video was created using voice cloning technology, and that there was no credible evidence of Sitharaman announcing any such partnership.
We debunked similar claims involving a deepfake of Narayana Murthy promoting a quantum AI platform, one of Sundar Pichai promoting a fictitious Google investment scheme, and other fake investment opportunities pushed through deepfakes and voice clones of Manmohan Singh, Gautam Adani and Sudha Murthy.
AI Misinformation Around the 2024 Lok Sabha Elections
Politicians like Rahul Gandhi, PM Modi and Trinamool Congress chief Mamata Banerjee, along with Bollywood actors such as Ranveer Singh and Aamir Khan, were targeted using deepfakes and voice clones. These AI-driven tools were used to create fabricated content criticising political leaders or their opponents, with the intent of influencing public opinion.
A viral AI-generated phone conversation between AAP Rajya Sabha MP Swati Maliwal and YouTuber Dhruv Rathee was shared on social media as a real conversation between the two. In the clip, Maliwal can be heard explaining to Rathee how she was assaulted in front of Delhi CM Arvind Kejriwal and his wife Sunita Kejriwal, and also requesting Rathee not to make a video on it.
With the help of two AI detection tools, BOOM found that the audio was generated using AI, and was not a genuine conversation between Maliwal and Rathee.
AI Misinformation Around Diabetes
From voice clones of Bollywood celebrities such as Shah Rukh Khan and Amitabh Bachchan promoting diabetes pills to a deepfake of news anchor Sudhir Chaudhary explaining the treatment for erectile dysfunction, we repeatedly came across fake endorsements of diabetes cures and arthritis treatments promoted by celebrities, TV news anchors and politicians.
In most cases, these were sponsored videos on Facebook showing celebrities or politicians endorsing a fake diabetes medicine and claiming it offered a cure. The videos were created by cropping original footage of the celebrities or politicians and overlaying it with AI voice clones.
AI Misinformation: Who was the most targeted public figure?
Virat Kohli was the most targeted public figure in AI misinformation in 2024, with six fact-checks debunking deepfake videos and edited images. These included an AI-generated image of him at the Ram Temple inauguration ceremony, endorsements for casino and betting apps, and a fabricated video of him demanding the death penalty in the RG Kar Medical College case.
Mukesh Ambani and news anchor Rajat Sharma were also among the notable figures targeted by AI misinformation.
Lok Sabha Elections 2024
India’s national elections were held in seven phases between April 19 and June 1, 2024. The internet was rife with mis/disinformation during this time. BOOM published 307 fact-checks related to the elections. Nearly 50 of these fact-checks were published well after the elections, as false claims continued to circulate around election-related topics.
Our analysis showed that 24 of these fact-checks involved false information aimed at Congress leader Rahul Gandhi, making him the top target of mis/disinformation during the election cycle. Of the 24 fact-checks involving Gandhi, 18 were smear campaigns against him.
PM Modi was the focus of 19 fact-checks. However, Modi was both a target and a source of misinformation, spreading false information in at least three instances, all with communal undertones. These included misquoting former PM Manmohan Singh, spreading misleading information about the Congress manifesto, and denying that he mentioned 'Muslims' while referring to them as 'infiltrators' and 'those with more children' in an election speech.
Read our comprehensive national election analysis of 258 fact-checks published on June 3, 2024.
Assembly Elections 2024
Maharashtra, Haryana, Jharkhand and Jammu and Kashmir went to polls during the latter half of 2024. BOOM published 40 fact-checks around the state Assembly elections.
Six of the 40 fact-checks involved false information aimed at Shiv Sena (UBT) chief Uddhav Thackeray, followed by claims targeting the party's Sanjay Raut (4) and the Congress party (4).
For instance, a cropped video falsely claimed Thackeray admitted to eating beef at a public rally and credited Muslim voters for his party's win in the Lok Sabha elections. However, BOOM found that the video was edited. In the original footage, Thackeray was criticising the BJP and referencing Union Minister Kiren Rijiju's 2015 statement on eating beef.
BOOM also analysed the type of each false claim and the purpose or intent with which it was shared. Our definitions are adopted from Claire Wardle’s classification of the different types and intentions of mis/disinformation.
The majority of misinformation that targeted Thackeray, Raut and the Congress party consisted of ‘false content’ (false connections and false contexts), followed by ‘manipulated content’ (photos or videos altered in a way that makes them appear realistic but changes the overall meaning of the original content).
Bangladesh unrest
In July 2024, the Bangladesh protests, which started as a peaceful student-led movement against the Supreme Court’s decision on public sector job quotas, escalated into deadly clashes between government forces and protestors, leading to around 1,500 deaths.
During this time, social media was rife with communal disinformation. BOOM published 84 fact-checks related to the unrest. Of these, 37 dealt with claims that were communal in nature.
Who was the top target and how were they targeted?
Bangladeshi Muslims were the most targeted community during the unrest with claims in 35 out of 84 fact-checks targeting them.
Some claims did not directly target Bangladeshi Muslims but were aimed solely at the protestors, accounting for 9 fact-checks.
The next most targeted was former Prime Minister Sheikh Hasina, with 6 fact-checks debunking false claims about her.
Misleading claims and old or unrelated violent videos and images were repurposed or manipulated to falsely depict attacks on Bangladeshi Hindus by Muslims.
Videos of assaults on women and students were falsely spread with communal undertones. For instance, a video showing a group of men and women assaulting another woman was shared with the claim that she was being attacked for being Hindu in Bangladesh. But we found that the video was from Brahmanbaria, Bangladesh, a district near the Tripura border, and the woman being attacked belonged to the Muslim community.
Further, our analysis showed that clips and photos from other countries, such as Myanmar or India, were falsely labeled as anti-Hindu violence in Bangladesh.
A viral graphic video allegedly showing dead children in a house was shared with claims that Hindus in Bangladesh were attacked and killed in their own homes, accompanied by the hashtag #SaveBangladeshiHindus. However, the video was originally taken in Myanmar's Rakhine State in June 2024 and was not from Bangladesh.
It is important to note that while there were genuine reports of minorities being attacked during the unrest, some right-leaning accounts in India exaggerated the situation by falsely claiming that most incidents of violence were targeted specifically at Hindus.
Claims directed at Sheikh Hasina included misrepresentations of her public appearances, such as an old photo of her boarding a metro after inaugurating Bangladesh's first metro service. This image was falsely circulated as showing her in an Indian metro after fleeing to India following her resignation as Bangladesh's prime minister.
Similarly, an image of Hasina sitting at a table and being welcomed with food was falsely claimed to be from India. However, the photo was taken in March 2019 during her visit to the Kumudini family in Mirzapur, Bangladesh.