From Mali to Australia, low-cost YouTube videos that mimic serious news channels have become a key tool for spreading disinformation and monetising clickbait in multiple languages.
Unlike so-called "deepfakes", which use sophisticated technologies to subtly manipulate audiovisual content and make it seem as realistic as possible, these formats are cheaply produced and tend to follow the same simple formula.
They open with a jingle and flashy graphics modelled on real current affairs bulletins, before displaying a quick succession of clips and photos.
A robotic voice describes the events allegedly seen in the images, often accompanied by garbled subtitles and tacky animations.
Churned out at an industrial pace on YouTube, the videos are then shared in Facebook groups and pages with tens of thousands of followers.
While the creators of the videos are hard to trace, experts say their goals may range from sowing confusion and stoking political tension to gaining clicks and making money.
"What it's trying to do is distract you with images so that you don't pay too much attention to the audio that's playing," said Shyam Sundar, founder of the Media Effects Research Lab at Pennsylvania State University.
"People are not going to strictly scrutinise the content of that audio monologue so they're less likely to call out any misinformation or question any information".
"This is a deadly combination... bombarded with so many pieces of information simultaneously, your brain is trying to cope with all the stimuli and audio on top."
'Spam and scams'
One recent example alleged that Russian paramilitary group Wagner had inaugurated a base in Mali, which is battling an Islamist insurgency and intercommunal violence.
The claim surfaced amid reports that Bamako was considering hiring Wagner mercenaries after France announced it would reduce its military presence in the Sahel.
The move fanned international fears of Russia's growing influence in the region.
The seven-minute video in French has been viewed more than 37,000 times since it was first published in November on a YouTube channel called "Africa24 Infos".
But the story was false: AFP Fact Check found that the pictures were either taken out of context or outright doctored.
Another video in English falsely claims to show Australia destroying Chinese fishing boats.
In reality, the computerised voice recited an article from a maritime news site about Australia intercepting illegal fishing vessels from Indonesia, swapping "Indonesian fishing boats" for "Chinese fishing boats".
The clip, posted on the "Today News Post" channel on October 25, has so far drawn over 25,000 views.
Many of the YouTube channels, found across Africa and beyond, mix disinformation with real news to blur the lines and seem more credible.
The videos are "easy to make and diffuse en masse", said disinformation expert Sebastian Dieguez of the University of Fribourg in Switzerland.
Production costs are low: people only need a script and voice synthesis software. There are even free programmes that automatically generate YouTube videos.
"Similar to spam messages and scams, they try to cast the widest net possible," Dieguez told AFP Fact Check.
He likened the phenomenon to the popularity of US conspiracy movement QAnon, which in recent years has crept from the fringes of social media into the mainstream thanks to widely shared posts on platforms like Facebook and Instagram.
"It's a new conspiracy style -- a cryptic message which relies on users interpreting it in an active, participatory way... you're the hero in this."
Who's behind it?
Beyond any political agenda, the clips also serve to make money.
"If it's controversial content, it's going to get clicked upon and shared," media expert Sundar said.
"This is how the fake news epidemic started in 2016: a bunch of Macedonian teenagers figured out this formula. For them, the incentive was economic."
Five years ago, Macedonia became an unlikely epicentre for producing and disseminating mass misinformation to support the US election campaign of Donald Trump.
For now, the producers behind the low-cost videos remain largely unknown. Experts say sources could range from governments and Russian trolls to people simply looking to cash in on clickbait.
"These YouTube videos disseminated through Facebook accounts remind me of earlier Russian trolls' techniques I uncovered in 2014 and 2015," Finnish investigative journalist Jessikka Aro told AFP Fact Check.
"You cannot attribute all material or accounts directly to the 'troll factory', or the Kremlin defence ministry, or the Russian military service. And that's the whole point!"