AI Image Generator Startup’s Security Flaw Exposes Massive Trove of Nude Images

A startup specializing in AI image generation has inadvertently exposed more than 1 million images and videos created by its systems, leaving them accessible to anyone online, according to new research reviewed by WIRED. The researcher who uncovered the open cache of data says the “vast majority” of the images involved nudity and depicted “adult content,” with some appearing to show children, or children’s faces superimposed onto AI-generated nude adult bodies.
Security researcher Jeremiah Fowler says multiple websites—including MagicEdit and DreamPal—appear to share the same unsecured database. Fowler discovered the security flaw in October and says that around 10,000 new images were being added to the database every day at the time. Pointing to potential misuse of the image-generation and editing tools, the images included “unaltered” photos of real people who may have been nonconsensually “nudified” or had their faces swapped onto other naked bodies.
“The primary concern is innocent individuals, particularly minors, having their images exploited without their consent to create sexual content,” says Fowler, who frequently investigates exposed databases and shared his findings on the ExpressVPN blog. He says this is the third misconfigured AI-image-generation database he has found exposed online this year, all of which appeared to contain nonconsensual explicit imagery, including imagery of minors.
Fowler’s discoveries come amid rising misuse of AI-image-generation tools to create explicit imagery of real people. A sprawling network of “nudify” services, used by millions of people and generating substantial revenue annually, employs AI to “strip” clothing from photos of individuals—predominantly women. Images stolen from social media can be altered in a few clicks, fueling severe abuse and harassment of women. Meanwhile, instances of criminals using AI to produce child sexual abuse material, spanning a range of indecent images of minors, have doubled in the past year.
“We take these issues very seriously,” says a representative for a startup called DreamX, which operates MagicEdit and DreamPal. The spokesperson asserts that a related influencer marketing firm known as SocialBook operates “under a separate legal entity and is not involved” in the running of the other sites. “Although these entities share some historical links through founders and legacy assets, they function independently with distinct product lines,” the spokesperson adds.
“SocialBook is not associated with the database you mentioned, does not utilize this storage, and has never been involved in its operation or management,” a SocialBook spokesperson tells WIRED. “The images cited were neither generated, processed, nor stored by SocialBook’s systems. SocialBook operates independently and has no connection to the infrastructure described.”
