
Dozens of AI nudify apps that create deepfake images found on Google Play and Apple App Store

A new report has revealed that Apple and Google continue to host dozens of apps capable of digitally removing clothes from women using AI. Despite strict policies banning sexualised and non-consensual content, the apps remain operational, raising serious concerns about online safety and consent.

Image Source: Tech Transparency Project/X.com
Written By: Saumya Nigam @snigam04
New Delhi:

Apple and Google are under fire yet again after a new report found that both companies are still hosting apps capable of using AI to digitally undress women. Publicly, both tech giants say they ban any app that creates sexualised or explicit images, but the investigation shows dozens of these apps are still easy to find on their platforms.

How did the report come to light?

The Tech Transparency Project, a nonprofit watchdog that keeps tabs on big tech, dug into this issue. Their report comes at a time when people are already worried about AI tools being used to exploit women and kids online.

55 apps on the Play Store and 47 apps on the App Store can strip clothing: Danger!

Here’s what The Tech Transparency Project found out: Researchers turned up 55 apps on the Google Play Store and 47 on Apple’s App Store that can strip clothing from women in photos or videos.

Some apps go all the way and generate nude images; others just swap clothes for bikinis or underwear. And you do not have to hunt for them.

A simple search for words like “nudify” or “undress” brings them right up.
It is hard to see how these slipped through the companies’ supposedly strict review processes.

These undressing apps are strangely popular on Google and Apple stores!

These apps are not just lurking in corners; they are shockingly popular.

The report stated that these apps have racked up over 705 million downloads worldwide and pulled in about USD 117 million in revenue.

Since Apple and Google both take a cut from in-app purchases, they are making money off apps that create non-consensual sexual images, which is the exact kind of content they claim to ban.

Google’s developer rules actually say apps that undress people or show sexual nudity are not allowed.

Apple’s guidelines ban anything sexually explicit or disturbing.

But the watchdog group found that plenty of these apps made it through review anyway and stayed up for long stretches, breaching the commitments both companies have made.

Researchers used AI-generated images

In their tests, the researchers used AI-generated images (not real people), but even the free versions of these apps could turn fully clothed women into sexualised or nude pictures in seconds.

What makes this worse?

Some of these apps are rated as safe for kids as young as nine or marked “suitable for all ages”.

In a lot of cases, users could strip away clothes with just one tap or a simple text command, and there were no warnings or restrictions. The apps didn’t ask for proof of consent or block requests, either.

Apple and Google's response to the report

After the report, Apple told CNBC it had taken down 28 apps and warned developers about others. Google said it suspended or removed more than 30 apps after a review.

Even so, the Tech Transparency Project says plenty of similar apps are still out there on both stores.

More stories break about AI tools being used to undress women

Pressure is mounting on big tech as more stories break about AI tools being used to undress women and even children online.

Regulators in the US and Europe are now looking harder at how these companies enforce their own rules. Ultimately, the report concludes that Apple and Google are not consistent in applying their own policies, leaving people open to harassment, abuse, and humiliation.

 
