
The Million-Dollar Business of AI 'Nudify' Websites: A Deep Dive

Discover how AI 'nudify' websites are making millions by exploiting US technologies and compromising privacy.


TL;DR

Millions of people are accessing harmful AI “nudify” websites, which are generating substantial revenue by leveraging technology from US companies. These sites raise significant ethical and privacy concerns, highlighting the need for stricter regulations and public awareness.

The Rise of AI ‘Nudify’ Websites

Understanding the Phenomenon

AI “nudify” websites use generative models to create fake nude images of real people, usually without their consent. The resulting images are a form of deepfake, and the practice has become increasingly prevalent and harmful. Motivations range from curiosity and sexual gratification to deliberately stigmatizing or embarrassing the subject, and to commercial gain through the sale of the images on pornographic websites.

Historical Context

The market for fake nude imagery long predates AI. In the 1990s and 2000s, magazines such as Celebrity Skin published paparazzi shots and illicitly obtained nude photos, demonstrating a demand for this kind of content. Websites hosting fake nude or pornographic photos of celebrities, often referred to as “celebrity fakes,” proliferated on Usenet and other platforms. This in turn prompted campaigns for legal action against the creators of these images, as well as websites dedicated to verifying the authenticity of nude photos [1].

Technological Advancements

In recent years, deepfake technology has transformed how fake nude images are produced. Deepfakes use artificial neural networks to superimpose one person’s face onto another person’s body in images or videos. The technique gained popularity in the late 2010s, raising concerns about its use in fake news and revenge porn [1].

DeepNude Application

In June 2019, the DeepNude application was released. It used a generative adversarial network (GAN) to remove clothing from images of women and was offered in both free and paid versions. Its creators withdrew it shortly after launch, citing ethical concerns, but copies of the app continue to circulate online [1].

Deepfake Telegram Bot

In July 2019, a deepfake bot service was launched on Telegram, allowing users to submit photos and receive manipulated nude images within minutes. The service was connected to seven Telegram channels, with its main channel counting over 45,000 members. By July 2020, an estimated 24,000 manipulated images had been shared across these channels [1].

Ethical and Legal Implications

The creation and distribution of fake nude images raise serious ethical and legal issues. These practices often target prominent figures such as businesspeople or politicians, causing significant harm to their reputations and personal lives. The profits these services generate underscore the need for stricter regulation and legal action to protect individuals’ privacy and dignity.

Conclusion

The proliferation of AI “nudify” websites and deepfake technology poses a significant threat to privacy and ethical standards. As these technologies continue to evolve, it is crucial for societies to address these issues through legislation, education, and public awareness campaigns.

References

  1. Fake nude photography. (2023, July 14). Wikipedia. Retrieved July 14, 2025.

This post is licensed under CC BY 4.0 by the author.