‘Claudia’ offers nude photos for pay. Experts say she’s an AI fake.
Will users feel ripped off as image-generating AI tools fuel a new wave of porn and scams?
The photo shows the face of a young woman with long dark hair and a soft smile who says she is “feeling pretty today.” And on Reddit — where Claudia, as she’s named, offers to sell nude photos to anyone who privately messages her — she is quite popular: “Holy crap you are beautiful,” one commenter said.
But Claudia is fake — a bundle of surprisingly convincing photos made by artificial-intelligence image tools, possibly deployed to pull in cash from unsuspecting buyers, according to two synthetic-media researchers.
AI image generators like Midjourney and Stable Diffusion have advanced rapidly and gained global attention in recent weeks for their inventive art and impressively realistic fakes of former presidents and popes.
But Claudia’s case hints at the technology’s more explicit side: By allowing anyone to create images of fake people that look uncannily real, the tools are reshaping how porn is made and consumed.
For years, new technology has been pioneered through porn, and AI image tools have not broken from that pattern. Thousands of accounts are now registered on discussion boards and in chatrooms devoted to the creation and refinement of synthetic people, the majority of whom resemble girls and women — a rapid shift that could upend a multibillion-dollar industry, undermine demand for real-world models and actors, and fuel deeper concerns about female objectification and exploitation.
A systems administrator at a hospital in the Midwest — who, like the other AI-porn creators and viewers interviewed for this story, spoke on the condition of anonymity — said he has been using Stable Diffusion tools to create fetish photos of adult women in diapers, and that advances in image quality have made it so their fakeness doesn’t matter.
“The average person who’s looking at this stuff, I don’t think they care,” he said. “I don’t expect the person I’m looking at online to be the person they say they are. I’m not going to meet this person in real life. … At the end of the day, if they’re not real, who really cares?”
The Claudia account didn’t respond to requests for comment, making it impossible to confirm how the photos were made — or how much money the months-old ruse has brought in.
But the researchers said the photos carried several clear hallmarks of a fake, including strange background details and a neck mole that went missing between poses. “Actually rather easy to create,” one AI programmer said.
The researchers identified several online profiles of women they believe are fake avatars, based on the telltale artifacts that some AI image generators leave behind. On Instagram, Reddit, Twitter and OnlyFans, the accounts shared images of women in varying stages of undress — and told viewers they should pay or subscribe if they wanted to see more.
The suspected fake accounts did not respond to questions. And because most AI-generated images are not watermarked or fingerprinted in any way at the time of creation, it can be challenging for viewers to confirm whether the people they depict are real.