
The Tech Transparency Project (TTP) has followed up on its January report that revealed dozens of “nudify” apps on the App Store with a new investigation focused on how Apple’s own search and ad systems may be helping users find them. Here are the details.
According to the new report, both the App Store and Google Play Store “are helping users to find apps that create deepfake nude images of women,” sometimes through promoted search results and autocomplete search suggestions.
In the report, TTP says Apple and Google are still failing to keep nudify apps out of their app stores, some of which are listed as suitable for minors. The group found that nearly 40% of the top 10 apps returned for searches such as “nudify,” “undress,” and “deepnude” could “render women nude or scantily clad.”
Additionally, some searches surfaced sponsored results for these apps. From the report:
“(…) the first result from an App Store search for “deepfake” was an ad for FaceSwap Video by DuoFace. The app allows users to swap anyone’s face from a still image onto a video. To test the app, TTP uploaded an image of a woman in a white sweater standing on a sidewalk and a video of a topless woman. After first showing a short ad, the app generated a video showing the clothed woman’s face on the nude woman’s body.”
And
“Another App Store search for the term “face swap” yielded an ad for an app called AI Face Swap. The app offers preset face swap templates and allows users to swap faces on images they upload themselves. TTP uploaded a photo of a woman in a blue sweater standing in a living room and an image of a topless woman, and the app swapped their faces with no restrictions.”
Interestingly, in addition to contacting Apple and Google about these findings, TTP also contacted the developers of several of these apps. In at least one instance, the app developer confirmed they were using Grok for image generation, but claimed they “had no idea it was capable of producing such extreme content.” The developer pledged to tighten moderation settings for image generation.
Returning to the report, TTP noted that typing “AI NS,” the start of a search for “AI NSFW,” prompted the App Store to suggest “image to video ai nsfw.” That search, in turn, returned several nudify apps in the top ten results.
While Apple declined to comment on TTP’s request, it did respond to the report by removing most of the apps TTP identified.
To read TTP’s full report, follow this link.