Use of AI Apps to Undress Women in Photos on the Rise

December 8, 2023

Apps that create deepfake nudes — digitally altered images that make a person appear to have no clothing on — typically produce these images without the subject's consent. The apps have been around for many years, and their targets have primarily been celebrities and other famous personalities.

According to the social network analysis company Graphika, “The creation and dissemination of synthetic non-consensual intimate imagery (NCII) has moved from a custom service available on niche internet forums to an automated and scaled online business that leverages a myriad of resources to monetize and market its services.”

Graphika stated that the people behind synthetic NCII — known as "undressing" images — manipulate photos and video footage of real people. These creations are enabled by easy-to-use AI tools within apps that can "undress" people in photographs without their approval. The apps generate a depiction of what the subject's naked body might look like and superimpose it onto an image, or seamlessly swap a face into a pornographic video.

Graphika’s report claims, “A group of 34 synthetic NCII providers identified by Graphika received over 24 million unique visitors to their websites in September, according to data provided by web traffic analysis firm Similarweb. Additionally, the volume of referral link spam for these services has increased by more than 2,000% on platforms including Reddit and X since the beginning of 2023.”

The company believes that the increasing accessibility of these online photo manipulation tools may lead to online harm for innocent victims. These include the uploading of non-consensual nude images, harassment, sextortion, and even the generation of child sexual abuse material.

According to The Washington Post, the FBI warned in June of an increase in "sexual extortion from scammers demanding payment or photos in exchange for not distributing sexual images." The FBI told the Post that as of September 2023, over 26,800 people had been victims of sextortion campaigns, a rise of 149% since 2019. However, it has not yet been determined whether the images involved were real photos or AI-generated pictures.

Time Magazine reported that there is currently no federal law banning the production of deepfake pornography, although the U.S. government does outlaw the creation of such images of minors. Last month, a North Carolina child psychiatrist was sentenced to 40 years in prison for using the apps to undress photos of his patients — the first prosecution of its kind.
