BOOM Research

Pallywood, AI, Graphic Violence Breed Fake News On Israel-Palestine: BOOM Study

Between October 7 and December 22, BOOM fact-checked 100 false claims around the war. This is our analysis of these claims.

By Archis Chowdhury | Nidhi Jacob

27 Dec 2023 12:44 PM IST

The current spate of violence between Israel and Palestine precipitated one of the biggest fake news cycles in recent times, often marked by disturbing images and videos of graphic violence. BOOM fact-checked 100 false claims between October 7 and December 22, 2023, of which 24 were made with unrelated graphic videos and images, showing anything from scenes of the beheading of children to the execution of prisoners, and falsely linking them to the violence.

Remarkably, verified users on X (formerly Twitter), mostly from India, consistently led the charge in spreading false and misleading claims around the conflict, chasing engagement while severely distorting global perception of the crisis. 65 per cent of all false and misleading claims we fact-checked were propagated by 104 verified X users, of which 42% were Indian accounts.

Additionally, we found that the predominant theme among false or misleading claims favouring Israel was false allegations of Palestinians faking their deaths or injuries, a trend termed the 'Pallywood' campaign: a sinister portmanteau of Palestine and Hollywood, aimed at proving through images that Palestinians faked their injuries, suffering and even deaths.

On the other hand, false or misleading claims favouring Palestine aimed to sensationalise the conflict, using images from other conflict-struck regions to depict the conditions of Palestinians in Gaza.

Israel and the region of Palestine have historically been in conflict, but it took a fresh turn on October 7, when the Islamist militant group Hamas launched an offensive on Israel, killing nearly 1,140 people and taking over 200 hostages. In retaliation, the Israeli government carried out thousands of airstrikes on Gaza, killing over 20,000 Palestinians, predominantly civilians, including 8,000 children, making this one of the most devastating conflicts in the region to date.

The simultaneous mis- and disinformation campaign appeared to match the intensity of the actual war, flooding social media with old footage and images, including shocking and violent content. From misleading and false claims around 'fake casualties' to fabricated video game footage depicting the war, BOOM fact-checked a total of 100 false or misleading claims.

Our analysis of these claims sheds light on the narratives being constructed, and on the methods of manipulation of perception.

Here are the seven key takeaways from BOOM's study:

1. Sensationalist claims: We found widespread use of sensational language to provoke strong emotion. 27% of the claims we studied were found to exclusively sensationalise the war using false/misleading information.

2. Misinformation on both sides: While false/misleading claims supporting Israel were dominant, accounting for 44%, those supporting Palestine followed closely at 38%.

3. Verified handles lead the charge: Verified accounts on X (formerly Twitter) were found to spearhead the mis/disinformation campaign around the war. 64% of all the claims involved at least one verified account. Furthermore, 13% of these claims were shared by official handles linked to a government or authority.


4. Recycling old content: 56% of all the claims we studied were made using old, out-of-context videos falsely linked to the violence.

5. Medium of deception: Videos were the dominant medium used for sharing false/misleading information, accounting for 84% of all the claims we studied. Only 13% were accompanied by images, while a mere 3% were text-only.

6. Video games, AI & deepfakes: 8% of the claims falsely shared video game footage (mostly from combat simulators) as real. Furthermore, 4% of the claims used images entirely fabricated through artificial intelligence, while 2% used deepfakes (videos manipulated using artificial intelligence).

7. Use of graphic violence: 24 per cent of the 100 claims we fact-checked used images and videos showing graphic violence, including scenes of the beheading of children and the execution of prisoners.

Did the claims favour Israel or Palestine?

For the purpose of the study, BOOM identified two different sides of the conflict that featured heavily in the claims and captions we fact-checked. A thorough analysis confirmed that most of these claims favoured either Palestine or Israel.

We sorted these claims based on the side they favoured, and found that 44 of them favoured Israel, while 38 favoured Palestine. We found one claim to support militant group Hamas directly, while 17 claims were found to be neutral.

This reveals that while claims favouring Israel were more prevalent, mis/disinformation favouring Palestine was not far behind.


Additionally, we found one claim that was shared by social media posts under two different narratives, one favouring Israel and the other favouring Palestine.

A video showing football fans celebrating with fireworks in Algiers, Algeria, was falsely shared as footage of airstrikes in Gaza. One user shared the false claim while supporting Israel's retaliation for the October 7 offensive by Hamas, while another shared the same old video to call out Israel for its indiscriminate attacks on Gazans.

BOOM also found that 17 of these false/misleading claims took a neutral stance, not appearing to favour one side over the other.

Breaking Down Pro-Palestine and Pro-Israel Claims

We further broke down the claims according to the side they supported, and analysed the themes and topics of the claims.

Our analysis reveals that the most prevalent theme or topic of claims favouring Palestine was 'sensationalism' - claims that attempted to sensationalise the conflict. Another prominent theme for claims favouring Palestine was 'International', where individuals, government officials and other entities from around the world were falsely linked to the conflict.

On the other hand, the most prevalent topic for claims favouring Israel was 'Faking deaths/injuries', where the claims aimed to cast doubt on the condition and injuries of Palestinians. Some even falsely claimed that Palestinians were "acting dead" for videos and lying about the number of casualties.




We further generated word clouds of the captions of the claims favouring each side. We removed closely related words with a high probability of recurrence, such as Palestine, Israel, Hamas, and Gaza, in order to better understand how these entities were addressed.
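For readers curious how such a word cloud can be produced, here is a minimal Python sketch using the open-source wordcloud library; the sample captions and the excluded entity names below are illustrative assumptions, not BOOM's actual data or pipeline.

```python
# Minimal sketch: build a word cloud from claim captions while
# excluding high-frequency entity names (illustrative data only).
from wordcloud import WordCloud, STOPWORDS

# Hypothetical captions collected for claims favouring one side
captions = [
    "Shocking video shows bombing of children in Gaza",
    "Terrorists fake baby casualties for the camera",
]

# Drop the entity names that would otherwise dominate the cloud
excluded = STOPWORDS | {"palestine", "israel", "hamas", "gaza"}

cloud = WordCloud(stopwords=excluded, background_color="white",
                  width=800, height=400)
cloud.generate(" ".join(captions).lower())
cloud.to_file("claims_wordcloud.png")
```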

Words like 'terrorists', 'islamic', and 'baby' recurred in claims favouring Israel, while 'children', 'bombing' and 'war' recurred in claims favouring Palestine.


This further reveals the prevalence of false claims regarding 'children' throughout the course of the ongoing conflict, targeting both sides.

Recurring patterns of mis/disinformation around the war

"Pallywood": Staging Deaths/Injuries

BOOM fact-checked 15 claims alleging that the ongoing violence was being staged. Of these, 14 targeted Palestinians, falsely claiming that they were faking their deaths and injuries.

Some of these posts showed unrelated behind-the-scenes footage from film shoots and falsely connected it to the ongoing violence, while others shared real images or videos of dead bodies or injured individuals and falsely portrayed them as staged.

On the other hand, one claim falsely portrayed behind-the-scenes footage from a film shoot as Israelis faking the deaths of children killed in Hamas' attack. Interestingly, the same footage was also shared with the claim that it showed Palestinians faking the deaths of children killed in Israeli attacks.

We fact-checked two different claims that shared real videos showing actual dead bodies of young children and infants, falsely portraying them as plastic dolls used by Palestinians to fake the deaths of children.


For example, BOOM debunked two videos showing a Palestinian woman grieving over her five-month-old baby, Muhammad Hani Al-Zahar, who lost his life in a Gaza airstrike. The videos were shared with the claim that it was not a baby but a plastic doll. Several verified X handles, including the Israeli news outlet Jerusalem Post, amplified the debunked claim.

Another video showed the dead body of a four-year-old child from Gaza named Omar Bilal Al-Banna, who died on October 12, 2023, after his neighbourhood, Al-Zaytoun, was struck by Israeli airstrikes. Many Indian verified handles, and official handles linked to the Israeli government, shared the video falsely claiming it showed a doll and not a real baby.

Recycling Old Videos

Of the 100 false claims we debunked, more than half (56%) involved repurposed old videos depicting events unrelated to the Israel-Hamas war. For example, BOOM had fact-checked a disturbing video of a group of Syrian rebels decapitating a young boy near Aleppo, Syria, in 2016. The video resurfaced with claims falsely linking it to the Israel-Hamas war.

Likewise, an old video showing a traditional ritual from Indonesia was falsely shared as showing Palestinians killed in the ongoing Israel-Hamas war. The brief 11-second video showed several individuals lying on the ground, draped in white cloth bearing mystical symbols. BOOM found that the video originally showed a traditional Balinese practice called Calonarang, held in Indonesia in October 2022.

Video Game Footage

We verified at least eight clips of video game footage that were circulated as real-world events. The fabricated footage was mainly taken from ARMA 3, a tactical shooter simulation game, and was passed off as actual instances from the war.

Our fact-checks included videos showing helicopters being shot down by missiles, falsely shared as Hamas shooting down Israeli helicopters. Another video game clip was peddled as Israel's counterattack against Hamas.


Artificial Intelligence-Generated Content

Of the 100 claims, we debunked four that used AI-generated images and two that used deepfake videos around the war.

A viral image of a father walking with five children against a backdrop of bomb-destroyed buildings was widely circulated with the false claim that it was from Gaza. BOOM found that the image was AI-generated after running it through AI detector tools such as AIOrNot.com and Hive Moderator. Similarly, BOOM fact-checked another AI-generated image showing fans of the Spanish football club Atletico Madrid supporting Palestine. The image was created by a fan page of the club.


Manipulated & misleading videos featuring officials from various countries

The targets of false information included heads of government and other authorities from various countries, including Emir of Qatar Tamim bin Hamad Al Thani, Jordanian Queen Rania Abdullah and US President Joe Biden.

A video of Al Thani purportedly threatening to cut off Qatar's gas supply to the world if Israel did not stop the bombardment of Gaza was shared widely. BOOM found the claim to be false: the video was over six years old and did not show Al Thani speaking about Qatar's gas supply at all. Instead, he had said, "The Palestinian cause is a cause of people who faced expulsion from their land and displaced from their country."

Likewise, a viral deepfake video of the Jordanian Queen condemning Hamas and extending support to Israel was created by an Israeli Instagram user. However, BOOM found that in the original video, Queen Rania had slammed the West for failing to condemn the high civilian death toll in Gaza caused by Israeli airstrikes.

Use of Graphic Violence

A noticeable feature of the disinformation campaign around the war was the extensive use of graphic videos and images showing explicit violence.

24% of the claims we analysed used such imagery for shock value, and were widely shared on social media.


Top targets of mis/disinformation around the war

Palestinians (27%) were the top targets of false news around the war, followed by Hamas (21%) and Israelis (16%), with Muslims and journalists each accounting for 2%.


Verified X Handles Spearhead the Disinfo Campaign

Soon after the commencement of the ongoing wave of violence on October 7, 2023, BOOM observed that verified accounts on X (formerly Twitter) took the lead in disseminating disinformation around the topic. 

BOOM had flagged a number of such accounts that had repeatedly shared disinformation around the war, right from the get-go.

Our study of the 100 false claims on this war reveals that 64% of all the claims were shared by at least one verified user on X, previously known as Twitter.

The platform has been under fire since late 2022, after the controversial takeover by billionaire Elon Musk and an equally controversial change in platform rules around the 'verified blue tick', which became available through a monthly subscription instead of the identity verification process that the legacy verified tick entailed.

Soon after the 'blue tick' became available for a monthly subscription, BOOM found many questionable accounts with a history of trolling and of spreading disinformation and hate content signing up for Twitter Blue.

The same trend has been observed throughout the course of the ongoing conflict in the Middle East, with blue tick users on X emerging as the most prominent superspreaders of disinformation around the conflict.

Our findings are consistent with those of NewsGuard, which found that these 'blue tick' users on X produced 74 per cent of the unsubstantiated claims related to the conflict on the micro-blogging platform.

Editor's Note: A previous version of the report contained a chart that was erroneously represented as the percentage of verified and unverified users on X who shared disinformation on this topic. The graph actually shows the percentage of false/misleading claims (fact checked by BOOM) on this topic that were found to be shared by at least one verified user on X. The error is regretted.
