{"id":40161,"date":"2025-12-05T11:51:38","date_gmt":"2025-12-05T11:51:38","guid":{"rendered":"https:\/\/agooka.com\/news\/business\/huge-trove-of-nude-images-leaked-by-ai-image-generator-startups-exposed-database\/"},"modified":"2025-12-05T11:51:38","modified_gmt":"2025-12-05T11:51:38","slug":"huge-trove-of-nude-images-leaked-by-ai-image-generator-startups-exposed-database","status":"publish","type":"post","link":"https:\/\/agooka.com\/news\/business\/huge-trove-of-nude-images-leaked-by-ai-image-generator-startups-exposed-database\/","title":{"rendered":"Huge Trove of Nude Images Leaked by AI Image Generator Startup\u2019s Exposed Database"},"content":{"rendered":"<p>Save StorySave this storySave StorySave this story<\/p>\n<p>An AI image generator startup left more than 1 million images and videos created with its systems exposed and accessible to anyone online, according to new research reviewed by WIRED. The \u201coverwhelming majority\u201d of the images involved nudity and were \u201cdepicted adult content,\u201d according to the researcher who uncovered the exposed trove of data, with some appearing to depict children or the faces of children swapped onto the AI-generated bodies of nude adults.<\/p>\n<p>Multiple websites\u2014including MagicEdit and DreamPal\u2014all appeared to be using the same unsecured database, says security researcher Jeremiah Fowler, who discovered the security flaw in October. At the time, Fowler says, around 10,000 new images were being added to the database every day. 
Indicating how people may have been using the image-generation and editing tools, these images included \u201cunaltered\u201d photos of real people who may have been nonconsensually \u201cnudified,\u201d or had their faces swapped onto other, naked bodies.<\/p>\n<p>\u201cThe real issue is just innocent people, and especially underage people, having their images used without their consent to make sexual content,\u201d says Fowler, a prolific hunter of exposed databases, who published the findings on the ExpressVPN blog. Fowler says it is the third misconfigured AI-image-generation database he has found accessible online this year\u2014with all of them appearing to contain nonconsensual explicit imagery, including imagery of young people and children.<\/p>\n<p>Fowler\u2019s findings come as AI-image-generation tools continue to be used to maliciously create explicit imagery of people. An enormous ecosystem of \u201cnudify\u201d services, which are used by millions of people and make millions of dollars per year, uses AI to \u201cstrip\u201d the clothes off of people\u2014almost entirely women\u2014in photos. Photos stolen from social media can be edited in just a couple of clicks, leading to the harrowing abuse and harassment of women. Meanwhile, reports of criminals using AI to create child sexual abuse material, which covers a range of indecent images involving children, have doubled over the past year.<\/p>\n<p>\u201cWe take these concerns extremely seriously,\u201d says a spokesperson for a startup called DreamX, which operates MagicEdit and DreamPal. The spokesperson says that an influencer marketing firm linked to the database, called SocialBook, is run \u201cby a separate legal entity and is not involved\u201d in the operation of other sites. 
\u201cThese entities share some historical relationships through founders and legacy assets, but they operate independently with separate product lines,\u201d the spokesperson says.<\/p>\n<p>\u201cSocialBook is not connected to the database you referenced, does not use this storage, and was not involved in its operation or management at any time,\u201d a SocialBook spokesperson tells WIRED. \u201cThe images referenced were not generated, processed, or stored by SocialBook\u2019s systems. SocialBook operates independently and has no role in the infrastructure described.\u201d<\/p>\n<p>In his report, Fowler writes that the database indicated it was linked to SocialBook and included images with a SocialBook watermark. Multiple pages on the SocialBook website that previously mentioned MagicEdit or DreamPal now return error pages. \u201cThe bucket in question contained a mix of legacy assets, primarily from MagicEdit and DreamPal. SocialBook does not use this bucket for its operational infrastructure,\u201d the DreamX spokesperson says.<\/p>\n<p>\u201cOur priority is the safety of users and the public, adherence to all legal requirements, and complete transparency throughout this process,\u201d the DreamX spokesperson adds. \u201cWe do not condone, support, or tolerate the creation or distribution of child sexual abuse material (\u2018CSAM\u2019) under any circumstances.\u201d<\/p>\n<p>After Fowler got in touch with the AI-image-generator firm, the spokesperson says, it closed access to the exposed database and launched an \u201cinternal investigation with external legal counsel.\u201d It also \u201csuspended access to our products pending the investigation\u2019s outcome,\u201d the spokesperson says. The MagicEdit and DreamPal websites and mobile applications were accessible until WIRED got in touch with those who run them.<\/p>\n<p>At the time of writing, the DreamPal website is unavailable, returning a 502 error. 
\u201cWe are temporarily suspending certain features of the product,\u201d a message on the homepage of the MagicEdit website says. \u201cDuring this period, the service may be unavailable.\u201d Another associated website displays the same message. Both MagicEdit and DreamPal were listed as being owned by the developer BoostInsider on Apple\u2019s iOS App Store. MagicEdit, DreamPal, and two other AI apps listed by BoostInsider are no longer available on the App Store.<\/p>\n<p>The DreamX spokesperson says BoostInsider is a \u201cdefunct entity,\u201d and the company \u201ctemporarily removed\u201d the apps as \u201cpart of a broader restructuring of our product lines and infrastructure\u201d and it is \u201cstrengthening our content-moderation framework.\u201d<\/p>\n<p>The apps do not seem to appear on Google\u2019s Play Store. However, when a BoostInsider account asked on Google\u2019s support pages earlier this year why two of its apps, including MagicEdit, had been suspended, a Google community \u201cexpert\u201d account replied that the apps included \u201csexually explicit content\u201d or nudity. A Google spokesperson confirmed that the apps had been suspended due to policy violations. An Apple spokesperson said the apps have been removed from the App Store.<\/p>\n<p>The exposed database Fowler discovered contained 1,099,985 records, the researcher says, with \u201cnearly all\u201d of them being pornographic in nature. Fowler says he takes a number of screenshots to verify the exposure and report it to its owners but does not capture illicit or potentially illegal content and doesn\u2019t download the exposed data he discovers. \u201cIt was all images and videos,\u201d Fowler says, noting the absence of any other file types. 
\u201cThe exposed database held numerous files that appeared to be explicit, AI-generated depictions of underage individuals and, potentially, children,\u201d Fowler\u2019s report says.<\/p>\n<p>Fowler reported the exposed database to the US National Center for Missing and Exploited Children, a nonprofit that works with tech companies, law enforcement, and families on child-protection issues. A spokesperson for the center says it reviews all information its CyberTipline receives but does not disclose information about \u201cspecific tips received.\u201d<\/p>\n<p>Overall, some images in the database appeared to be entirely AI-generated, including anime-style imagery, while others were \u201chyperrealistic\u201d and appeared to be based on real people, the researcher says. It is unclear how long the data was left exposed on the open internet. The DreamX spokesperson says \u201cno operational systems were compromised.\u201d<\/p>\n<p>The MagicEdit website, while it was online, did not appear to explicitly say it could be used to create explicit images of adults. However, Fowler writes in his report that its rating on Apple\u2019s App Store was listed as 18+. Its homepage also featured an AI-generated image of a woman in a dress that changed into a bikini. The website listed multiple \u201cAI tools\u201d people could use\u2014ranging from \u201ctext to video\u201d and video background removers, to a \u201cmagic eraser,\u201d face swapping, and expanding an image with AI\u2014with some features locked behind a \u201cpro\u201d mode requiring payment.<\/p>\n<p>MagicEdit also listed an \u201cAI Clothes\u201d tool. Many of the \u201cstyles\u201d of image-generation tools listed on its website showed sexualized images of women and often involved depicting them with fewer clothes on\u2014sometimes wearing bikinis or underwear\u2014once AI had been applied. 
\u201cWatch this outfit go from everyday casual to sexy in seconds,\u201d a post on MagicEdit\u2019s now-removed Instagram account said.<\/p>\n<p>\u201cThey\u2019ve done a great way of subtly promoting sexualized content,\u201d Fowler says, noting that AI tools that depict nudity can easily be \u201cweaponized\u201d for blackmail, harassment, and other malicious purposes. \u201cThese companies really have to do more than just a generic pop-up: \u2018By clicking this, you agree that you have consent to upload this picture.\u2019 You can\u2019t let people police themselves, because they won\u2019t. They have to have some form of moderation that even goes beyond AI.\u201d<\/p>\n<p>\u201cMagicEdit does not promote or encourage explicit sexual content, and we enforce moderation, filtering, and safeguarding mechanisms to prevent misuse,\u201d the DreamX spokesperson says. \u201cFrom a technical standpoint, we implemented multiple safeguards\u2014well before receiving any external inquiry\u2014including prompt regulation, input filtering, and mandatory review of all user prompts through OpenAI\u2019s Moderation API,\u201d the spokesperson adds. \u201cIf a prompt violates safety standards, the system blocks the request automatically.\u201d<\/p>\n<p>\u201cThis is the continuation of an existing problem when it comes to this apathy that startups feel toward trust and safety and the protection of children,\u201d says Adam Dodge, the founder of EndTAB (Ending Technology-Enabled Abuse), which provides training to schools and organizations to help tackle tech abuse.<\/p>\n<p>Meanwhile, the DreamPal website\u2014which described itself as an \u201cAI roleplay chat\u201d\u2014was more explicit in its adult nature. 
Its web pages said people could \u201ccreate your dream AI girlfriend.\u201d Some links on the site, likely designed for SEO purposes, referenced \u201cAI Sexing Chat,\u201d \u201cTalk Dirty AI,\u201d and \u201cAI Big Tits.\u201d An FAQ at the bottom of the DreamPal website said: \u201cWe\u2019ve removed any NSFW AI chat filters that could hold you back from expressing your most intimate fantasies.\u201d<\/p>\n<p>\u201cEverything we\u2019re seeing was entirely foreseeable,\u201d Dodge says. \u201cThe underlying drive is the sexualization and control of the bodies of women and girls,\u201d he says. \u201cThis is not a new societal problem, but we\u2019re getting a glimpse into what that problem looks like when it is supercharged by AI.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>An AI image generator startup left more than 1 million images and videos created with its systems exposed and accessible to anyone online, according to new research reviewed by WIRED. 
The \u201coverwhelming majority\u201d of the images involved nudity and \u201cdepicted adult content,\u201d according to the researcher who [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":40162,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[36],"tags":[],"class_list":{"0":"post-40161","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-business"},"_links":{"self":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/40161","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/comments?post=40161"}],"version-history":[{"count":0,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/40161\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media\/40162"}],"wp:attachment":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media?parent=40161"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/categories?post=40161"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/tags?post=40161"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}