OpenAI Announces Its New Tool That Can Detect AI-Generated Images
OpenAI introduced a novel tool for identifying whether an image originates from its DALL-E AI image generator, along with enhanced watermarking techniques to better highlight the content it produces.
In its blog post, the company said that it is actively working on new provenance techniques to trace content origins and validate whether it was created by AI.
Among these advancements are a new image detection classifier, which uses AI to identify AI-generated photos, and a tamper-resistant watermark that can discreetly tag content such as audio with imperceptible signals.
The classifier assesses the probability that a picture was generated by DALL-E 3. OpenAI asserts that it remains effective even when images are cropped, compressed, or altered in saturation.
While it detects images made with DALL-E 3 with around 98% accuracy, its ability to identify content from other AI models is less robust, flagging only 5 to 10% of images from other generators such as Midjourney.
OpenAI says it needs feedback from users to test the tool's effectiveness. For now, access is limited to researchers and nonprofit journalism groups.