The year 2024 was eventful, marked by significant developments, including the Lok Sabha elections in India, the Maharashtra and Haryana assembly elections, the fall of Sheikh Hasina’s government in Bangladesh, and the US presidential election which saw the return of Donald Trump to power.
BOOM debunked several pieces of misinformation surrounding these events, including 309 fact-checks related to the Indian general election, 99 tackling AI-driven misinformation, and 69 focused on the Bangladesh crisis.
Below are the top 10 events that had a significant impact and were surrounded by widespread misinformation.
1) 2024 Lok Sabha election
The Lok Sabha elections were conducted in seven phases between April 19 and June 1 this year, with the results announced on June 4. Although the Bharatiya Janata Party (BJP) fell short of securing a majority on its own, the National Democratic Alliance government was formed with support from Nitish Kumar’s Janata Dal United and Chandrababu Naidu’s Telugu Desam Party.
This year, BOOM has debunked approximately 42 instances of misinformation related to Prime Minister Narendra Modi and 56 related to Leader of Opposition Rahul Gandhi.
Several old and unrelated videos were shared with misleading claims during the Lok Sabha campaign, including a doctored video of Home Minister Amit Shah. The video was shared falsely claiming he promised to end reservations for Scheduled Castes (SCs), Scheduled Tribes (STs), and Other Backward Classes (OBCs) if the BJP returned to power.
BOOM found that in the original video, Shah had promised to scrap Muslim reservation in Telangana during the 2023 assembly election campaign.
2) AI-related misinformation with a communal spin
BOOM also debunked several instances of misinformation this year involving AI-generated content, including images, voice clones, and deepfakes, used to deceive and scam people.
A Decode investigation found Meta AI's text-to-image feature is being weaponised to create problematic anti-Muslim imagery in India.
Several videos of Bollywood celebrities, including Aamir Khan and Ranveer Singh, were also shared with fake AI voice clones to falsely claim they were criticising PM Modi in the run-up to the Lok Sabha elections. Multiple fake ads using AI voice clones of prominent personalities have also been used to perpetrate scams on Meta's platforms.
3) Bangladesh crisis
Protests against Sheikh Hasina’s government in Bangladesh began as a response to a controversial quota system that reserved 30% of government jobs for families of war veterans. These protests soon escalated into widespread demonstrations against the Hasina government, resulting in around 400 deaths, including several student protesters. This was followed by Hasina fleeing the country on August 5, 2024.
Political unrest took hold, with reports of attacks on members of Hasina’s Awami League party and its allies emerging, along with incidents of violence targeting police officers and minority groups.
Several videos from this period of violence, along with some older, unrelated clips, went viral with false claims. For example, a video of protesters attacking the Jatrabari police station in Dhaka went viral with the false claim that a mob had set Hindu homes on fire. Similarly, another video showing a restaurant burning in Bangladesh’s Satkhira district was shared with the misleading claim that it showed an attack on a Hindu temple.
After the fall of the Hasina government, an interim government was appointed under the leadership of Nobel Prize-winning economist Muhammad Yunus. According to data released by the interim government at the end of August, more than 1,000 people were killed in the violence that began in the country in July.
4) Misinformation around 2024 assembly elections
After the Lok Sabha elections this year, assembly elections were held in Maharashtra, Haryana, Jharkhand, and the Union Territory of Jammu and Kashmir. BOOM debunked several pieces of misinformation around the Maharashtra and Haryana elections.
A fake opinion poll went viral with the false claim that it showed the Congress party in the lead. Additionally, an unrelated video from a Milad-un-Nabi celebration in Latur, Maharashtra, was shared with the false claim that it showed a Congress rally in Haryana.
A day before voting in the Maharashtra assembly election, the BJP posted at least three fake AI-generated audio clips on its official X handle. These clips, shared late on November 19, claimed to be recorded conversations involving opposition Maha Vikas Aghadi (MVA) leaders Supriya Sule (NCP) and Nana Patole (Congress), IPS officer Amitabh Gupta, and Gaurav Mehta, an employee of an audit firm.
BOOM analysed the clips and found that three of the four audio samples tested were AI-generated. The fourth, which showed little evidence of manipulation, was too short to be tested accurately.
This was the first instance of AI voice clones shared by a political party handle to spread disinformation before an important state election in India.
5) RG Kar medical college and hospital rape case
On August 9, 2024, a 31-year-old trainee doctor was raped and murdered while on night shift duty at Kolkata's RG Kar Medical College and Hospital. The victim, a second-year MD student in the Chest Medicine department, was found dead in the hospital's seminar hall.
This tragic crime ignited nationwide protests from the medical community and civil society groups, demanding improved safety for women in India. The incident also fuelled a wave of false and misleading claims, along with conspiracy theories, on social media.
BOOM debunked around 28 pieces of misinformation that went viral around this case. Late on the night of August 14, 2024, ‘Reclaim the Night’ protests against the incident were held in various parts of West Bengal, including Kolkata.
A false claim then went viral on social media that a student from Bardhaman University had been brutally murdered while returning from one of these protests. Bardhaman Police confirmed to BOOM that the claim was false and that no such complaint had been filed.
6) US election 2024
BOOM debunked several pieces of misinformation around the US election. A photo of a man and a woman photographed with United States Vice President Kamala Harris was shared on social media with the false claim that both of Harris's parents are of Indian origin and that she misleads people about her ethnicity by pretending to be Black. BOOM found that the two people in the viral photo were not her parents, but attendees at a fundraising gala hosted by an NGO in 2016.
7) Manipur violence
In 2024, violence continued in Manipur, with numerous instances of clashes reported throughout the year. Misinformation also went viral amid the ongoing conflict. BOOM debunked a disturbing video, originally showing a girl being molested by a group of men in Andhra Pradesh, that resurfaced online with a false communal claim that it showed a Hindu man attempting to rape a Christian Kuki girl in Manipur.
BOOM also fact-checked a video featuring AI-generated visuals, falsely claiming to show massive rallies in Manipur advocating for peace and an end to the ethnic clashes.
8) Farmers' protests 2024
In February 2024, thousands of protesting farmers from neighbouring states, including Punjab, Haryana, and Uttar Pradesh, marched towards Delhi with their tractors, demanding assured prices for their crops and the implementation of a legally guaranteed Minimum Support Price (MSP) by the central government.
During this time, BOOM debunked several pieces of misinformation surrounding the protests. One example was an unrelated video of a modified tractor, shared falsely claiming to show farmers using it en route to Delhi to breach police barricades.
9) Ayodhya Ram Temple
In January 2024, misinformation surrounding the consecration ceremony of the Ram Mandir in Ayodhya went viral on social media. BOOM debunked several pieces of misinformation related to the event during that month.
One viral claim falsely alleged that the new Ram Janmabhoomi temple in Ayodhya was being constructed 3 kilometres away from the site of the Babri Masjid. BOOM fact-checked the claim by speaking to reporters who had visited the temple complex and confirmed that the temple is built directly above the ruins.
10) Gangster Lawrence Bishnoi
Gangster Lawrence Bishnoi's name surfaced in the media after Nationalist Congress Party (Ajit Pawar faction) leader Baba Siddique was shot and killed in Bandra’s Nirmal Nagar, Mumbai, in October 2024. The suspects arrested in the case were reportedly linked to Bishnoi’s gang.
Bishnoi has been jailed since August 2023 in the high-security ward of Sabarmati Central Jail in Ahmedabad, Gujarat.
A video of a man speaking to the media while in police custody was widely shared on social media with the false claim that it showed the shooter of Baba Siddique addressing a press conference after his arrest.
Bishnoi had also made headlines in 2018 when he threatened to kill Bollywood actor Salman Khan over the blackbuck poaching case. Amidst rumours of fresh death threats to Khan, an old video of actor Vivek Oberoi praising women from the Bishnoi community resurfaced online as recent.