Explainers

83% Indians Fell Prey To AI Voice Scams: McAfee Report

A McAfee report suggests that 69% of Indians are unable to distinguish between a genuine human voice and an AI-generated voice.

By - Hera Rizwan | 3 May 2023 6:08 PM IST

India has the highest number of people who have fallen victim to AI-generated voice scams, a McAfee study titled 'Beware the Artificial Impostor' has found. About 83% of Indians have lost money to such scams, the report said. Additionally, 69% of Indians are not confident that they could tell a cloned AI voice used for scamming apart from a real human voice.

Amid the rise in cybercrimes exploiting AI, attackers are now using AI-based voice technology to defraud people. The recent study by global computer security software company McAfee found that fraudsters are using artificial intelligence to mimic the voices of anxious family members, and a large number of Indians are falling victim to such frauds.

How does AI voice cloning work?

Cloning someone's voice is now a potent weapon in the hands of cybercriminals. According to the McAfee study, 53% of adults share their speech data online at least once a week through social media, voice notes, and more. Forty-nine percent of people do so up to 10 times a week. The practice is most common in India, with 86% of people making their voices available online at least once a week, followed by the U.K. at 56% and the U.S. at 52%.

It might seem like a harmless activity meant to ease communication, but these voices leave behind a digital footprint that cybercriminals can misuse to target people. A small snippet of voice can be used to create a believable AI clone, which can then be manipulated for fraudulent purposes, the study said.

Scammers invent scenarios to manipulate people into believing that someone close to them is in dire need of money. According to the study, "some scenarios are more likely to be successful than others". Scenarios that work well include a car breakdown or accident, being a theft victim, a lost wallet, and needing help while travelling abroad.

What are the key takeaways from the study?

- The study was conducted by McAfee with 7,054 people from seven countries, including India, between April 13 and April 19, 2023. According to the study, 47% of Indian adults have experienced an AI voice scam or know someone who has been a victim of one.

- According to McAfee, AI voice-cloning tools are capable of replicating a person's voice with up to 95% accuracy. As per the study, 69% of Indians are not confident that they could distinguish a cloned voice from the real one.

- More than a third of the individuals who participated in the study lost over $1,000, while 7% were duped of between $5,000 and $15,000. The number was highest in the U.S., where more than one in 10 victims lost between $5,000 and $15,000. The cost of falling for an AI voice scam is also significant in India, with 83% of Indians losing money and 48% of them losing up to Rs 50,000.
