Apple yanks multiple AI nude-generating apps from the iPhone's App Store

(Image credit: Future | Alex Walker-Todd)

Apple has removed three apps from the iPhone's App Store after it was discovered that they could be used to create nonconsensual nude images using AI image generation. The move comes as Apple is heavily rumored to be working on new generative AI features of its own, likely to debut in iOS 18 at WWDC in June.

The apps were first spotted earlier this week, and it appears that Apple only removed them after they were covered online. In fact, the report detailing the news also says that Apple wasn't able to find the apps in question itself and needed help identifying them before they could be removed.

It's unlikely that any of the generative AI features Apple is rumored to be working on will do anything like what these apps were doing, but the situation still makes for an interesting conundrum. How will Apple market its features, especially in a world where the public's trust in AI appears to be on the wane?

Removed

404 Media reports that it found the apps after spotting them in Meta's Ad Library, a tool that archives the ads running on Meta's platforms. Two of the ads it found were for web-based services, but three were for apps that could be downloaded from the App Store. The report says that Meta removed the ads once it was made aware of them. However, 404 Media says that Apple "did not initially respond to a request for comment on that story, but reached out to me after it was published asking for more information." A day later, Apple confirmed that it had removed three apps from the App Store.

The report also notes that the removal happened "only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself."

Apps like those removed by Apple use generative AI to "undress" people, manipulating an existing photograph to make the subject appear nude. The report notes that these apps, and the images they create, have already found their way into schools across the country. Some students said they found the apps they used on TikTok, but other social networks have also been running ads for similar apps, 404 Media's report notes.

As is so often the case with new technology, the world is currently grappling with the influx of new AI tools and their capabilities. Those capabilities can sometimes be amazing, but at other times they can be used to do harm, as is clearly the case with these apps. Apple will no doubt be keen to ensure that similar apps don't find their way into the App Store again, although questions will surely be raised about how they were allowed into the store in the first place.


Oliver Haslam
Contributor

Oliver Haslam has written about Apple and the wider technology business for more than a decade, with bylines on How-To Geek, PC Mag, iDownloadBlog, and many more. He has also been published in print for Macworld, including cover stories. At iMore, Oliver is involved in daily news coverage and, not being short of opinions, has been known to 'explain' those thoughts in more detail, too. Having grown up using PCs and spending far too much money on graphics cards and flashy RAM, Oliver switched to the Mac with a G5 iMac and hasn't looked back. Since then he's seen the growth of the smartphone world, backed by the iPhone, and watched new product categories come and go. His current expertise includes iOS, macOS, streaming services, and pretty much anything that has a battery or plugs into a wall. Oliver also covers mobile gaming for iMore, with Apple Arcade a particular focus. He's been gaming since the Atari 2600 days and still struggles to comprehend the fact that he can play console-quality titles on his pocket computer.