Hindu nationalist groups, including supporters of the Bharatiya Janata Party, have allegedly misused video hosting giant YouTube to target Muslims and women across India using conspiracy theories and threatening videos, found a recent report by the NYU Stern Center for Business and Human Rights.
The report, titled 'A Platform "Weaponized": How YouTube Spreads Harmful Content – And What Can Be Done About It', looked at how the Google-owned platform has been consistently abused to spread political and health-related disinformation, along with content that incites violence.
It also provides a series of recommendations to YouTube to make the platform a safer place for its viewers, and to reduce its potential for abuse.
While a major portion of the report discusses issues pertaining to the United States, it includes a considerable section on how the platform has been weaponised in India - the company's biggest market globally, with about 450 million users.
A Dichotomy Of Better Access And Harmful Content
YouTube has been a boon for independent journalists, entertainers, musicians, and many others around the world by removing the barriers to entry that define the television industry. But the ease of putting videos online at will has also led to a surge in harmful and dangerous content on the platform.
To exemplify this dual nature of the platform, the report cites YouTube's role in giving ordinary Russians accurate information on the war in Ukraine, even as it had, for years, played a crucial hand in spreading the Kremlin's disinformation about Ukraine.
"Since Russia launched its invasion of Ukraine in February 2022, YouTube has offered ordinary Russians factual information about the war, even as the Kremlin has blocked or restricted other Western-based social media platforms and pressured foreign journalists in the country to silence themselves. But for years before the brutal incursion, YouTube served as a megaphone for Vladimir Putin's disinformation about Ukraine and its relations with the West."
The scenario is not so different in India.
Speaking to the NYU researchers, BOOM's deputy editor Karen Rebelo noted that YouTube, like other social media platforms, has "democratized the news publishing landscape, giving space to independent journalists and small and local organizations with limited resources."
"But it has also diluted the quality of journalism and allowed the rise of propaganda to masquerade as news. One now-common tactic occurs during major news events, natural calamities, terrorist attacks, and military conflicts, where old and unrelated videos are shared out-of-context on YouTube," she added.
Spread Hate, Get Paid
The report repeatedly mentioned how the platform has provided a voice to those who spread hate and misinformation, especially through right-leaning content. In South Korea, it was abused by organised misogynists, in Brazil by far-right ideologues, and in Myanmar by supporters of the oppressive military regime.
In India, YouTube became a powerful tool for right-leaning Hindu nationalists to target Muslims and women, the report noted.
"The most troubling abuse of YouTube in India involves the targeting of Muslims by backers of the ruling Bharatiya Janata Party and other right-leaning Hindu nationalist groups."
The report specifically brought up how YouTube was used to fuel violence against Muslim merchants around the country, by propagating the conspiracy theory - dubbed 'Corona Jihad' - that Muslims in the country were purposefully spreading the coronavirus by spitting on people's food, as a form of biological warfare.
This conspiracy theory was also furthered by many mainstream news channels. The NYU report highlighted how right-leaning media outlet News Nation, along with others like ABP News and Zee News, have used their YouTube channels with millions of subscribers to fuel this conspiracy theory.
In May 2020, BOOM analysed 178 COVID-related fact-checks done in the first few months of the pandemic, and found that a bulk of them involved false allegations that Muslims were carrying out 'Corona Jihad'.
Also Read | Fake News In The Time Of Coronavirus: A BOOM Study
The report also highlighted the spread of Islamophobic misogyny through YouTube in India, with a surge in videos containing online attacks on Muslim women.
It touched upon the issue of 'Sulli Deals' - an app hosted on GitHub that was used to harass Muslim women by taking information from their social media profiles and putting them up for 'sale' in mock bidding wars. This form of harassment gained traction after a live video on the YouTube channel 'The Liberal Doge', run by far-right influencer Ritesh Jha, went viral on the platform.
While YouTube has, on one hand, taken action against the likes of Jha by taking down their channels and banning them from the platform, it has, on the other, provided lucrative opportunities for others to spread similar content through monetisation.
"Moreover, YouTube's extensive program for sharing advertising revenue with popular creators means that purveyors of misinformation can make a living while amplifying the grievances and resentments that foment partisan hatred, particularly on the political right."
This points to a business model that depends on such content in the pursuit of profit - an issue endemic to the social media industry.
The Link Between Outrage, Engagement And Revenue
That those spreading disinformation, along with hateful and dangerous content, are making money off YouTube points to how creator earnings depend on engagement - which often peaks during moments of outrage.
The NYU report, along with many past studies on social media engagement, has highlighted how companies like YouTube and Facebook depend on advertising revenue for a major chunk of their profits, with advertisers demanding maximum user attention in return.
To demonstrate the level of attention their users give, platforms like YouTube have prioritised 'engagement' - a metric that reflects user attention through watch time, likes, shares and comments.
"But here's the problem: Content that tends to heighten engagement often does so because it provokes emotions like anger, fear, and resentment.1 So, when software engineers design algorithms to rank and recommend content, the automated systems favor posts that stir these negative emotions."
It is this very business model that makes it profitable to publish content that stokes hatred, fear and constant outrage - such content keeps users glued to their screens, which is good for advertising revenue, a portion of which lands in the pockets of the most popular creators.
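To make this mechanism concrete, here is a minimal, purely illustrative sketch of engagement-weighted ranking in Python. The fields, weights and example videos are all invented for illustration - neither the report nor YouTube discloses the actual signals used or how they are weighted against one another.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    watch_hours: float  # total watch time in hours
    likes: int
    shares: int
    comments: int

def engagement_score(v: Video) -> float:
    """Combine attention signals into a single ranking score.

    The weights below are hypothetical; real platforms do not
    publish how they weigh these signals relative to one another.
    """
    return (1.0 * v.watch_hours
            + 0.5 * v.likes
            + 2.0 * v.shares
            + 1.5 * v.comments)

# Two hypothetical videos: the provocative one attracts far more
# shares and comments, so a ranking built purely on engagement
# surfaces it first, regardless of accuracy or harm.
videos = [
    Video("measured explainer", watch_hours=120.0, likes=40, shares=5, comments=10),
    Video("outrage clip", watch_hours=150.0, likes=90, shares=60, comments=200),
]

for v in sorted(videos, key=engagement_score, reverse=True):
    print(f"{v.title}: {engagement_score(v):.1f}")
```

Because outrage-driven videos tend to rack up shares and comments, any ranking that optimises only for such signals will push them to the top - precisely the dynamic the report describes.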
Recommendations For Improvement
While YouTube has taken steps to crack down on such content, these have been largely limited to the English language. In India, videos spreading disinformation, hate speech and other dangerous content in regional languages have often gone unchecked.
Furthermore, the report notes that YouTube as a platform has been far less studied than Facebook and Twitter.
"There are several reasons that less is known about YouTube than Facebook or Twitter. First, it is more difficult and expensive to analyze a large volume of videos than it is to search for words or phrases in a text data set of Facebook or Twitter posts. Budget-conscious academic researchers weigh the feasibility of competing projects; dissecting video costs a lot more in human hours and computer time."
The report makes the following recommendations to make the platform safer for the public:
1. Disclose more information about how the platform works: A place to start is explaining the criteria algorithms use to rank, recommend, and remove content—as well as how the criteria are weighted relative to one another.
2. Facilitate greater access to data that researchers need to study YouTube: The platform should ease its resistance to providing social scientists with information for empirical studies, including random samples of videos.
3. Expand and improve human review of potential harmful content: YouTube's parent company, Google, says that it has more than 20,000 people around the world working on content moderation, but it declines to specify how many do hands-on review of YouTube videos. Whatever that number is, it needs to grow, and outsourced moderators should be brought in-house.
4. Invest more in relationships with civil society and news organizations: In light of their contribution to the collapse of the advertising-based business model of many U.S. news-gathering organizations, the platforms should step up current efforts to ensure the viability of the journalism business, especially at the local level.