Google Clamps Down On Deepfake Porn Ads In Policy Update
Google's latest policy update prohibits advertising services that assist users in generating “sexually explicit content,” which it defines as “text, image, audio, or video of graphic sexual acts intended to arouse”.
Such services may involve either modifying an individual's existing image or generating an entirely new one.
Effective May 30, the update prohibits the promotion of synthetic content altered or created to feature sexual explicitness or nudity. This includes websites and apps providing guidance on crafting deepfake porn.
While Google has long banned the promotion of sexually explicit content by advertisers, certain apps facilitating the production of deepfake pornography have circumvented this restriction.
They do so by marketing themselves as non-sexual services in Google ads or in Google Play store listings.
In 2023, Google removed over 1.8 billion ads for violating its policies on sexual content, according to the company’s annual Ads Safety Report.