Google, OpenAI, and Allies Team Up to Tackle CSAM Using AI Technology
Google, OpenAI, Discord and Roblox have launched a new nonprofit to improve child safety online. The initiative is called Robust Open Online Safety Tools (ROOST).
ROOST aims to make core safety technologies more accessible. It will provide free, open-source AI tools to detect, review, and report child sexual abuse material (CSAM).
Though details on the CSAM detection tools remain limited, the project will use large language models and ‘unify’ existing safety measures.
The move comes amid a major regulatory battle in the U.S. over child safety on social media, with companies increasingly turning to self-regulation.
The National Center for Missing and Exploited Children (NCMEC) reported a 12% rise in suspected child exploitation from 2022 to 2023.
By 2020, over half of U.S. children used Roblox, which has faced criticism for not preventing child sexual exploitation. In 2022, a lawsuit accused Roblox and Discord of allowing unsupervised adult messaging with children.