
AI-Powered “Nudify” Platforms: A Deep Dive into the Alarming Surge and Their Impact

Recent findings from Graphika, a company specializing in social network analysis, have revealed a troubling trend: the use of artificial intelligence (AI) to digitally undress people in photos, primarily targeting women, is growing rapidly. In September alone, more than 24 million users engaged with these platforms, known as “Nudify” or “Undressing” services, raising serious concerns about privacy and safety.

These platforms use powerful AI algorithms to replace clothing in photos with fabricated nudity, worsening gender-based digital harassment. Altering a photo without the subject’s consent not only causes serious psychological and reputational harm but also raises grave ethical and legal issues. Driven by the platforms’ aggressive marketing on social networks, the number of advertising links posted on sites such as Reddit has increased by 2,400% since the beginning of the year.

The proliferation of Nudify applications has brought several serious issues to light, including invasions of privacy, violations of personal autonomy, and the perpetuation of harmful stereotypes and the objectification of women. These tools enable the creation of intimate images without the individual’s consent, which can lead to increased incidents of sexual harassment and assault. Beyond privacy concerns, the technology also enables deepfakes and synthetic media that pose significant risks to people’s safety online and contribute to the spread of disinformation.

Defending against this escalating risk requires focused efforts on multiple fronts. Advertisements for Nudify applications should be identified and removed from social media sites, and governments should be urged to pass legislation banning such apps. In addition, research institutions and technology companies should develop tools and methods to detect and prevent the creation of AI-generated nude images.

Apps like DeepSukebe, which promise to “unveil the truth hidden beneath your clothes,” have been particularly problematic because they allow users to create fabricated nude photos of subjects without their consent, becoming tools for harassment and exploitation. Despite the ethical objections, demand for such tools is clear, as evidenced by the significant monthly search volume for related terms.

According to the study Graphika published in December 2023, more than 24 million unique visitors accessed a set of 34 undressing websites and applications in September, giving a sense of the problem’s scale. Although companies like TikTok and Meta Platforms Inc. have taken steps to address the issue, there is an immediate and urgent need for more comprehensive, industry-wide initiatives to counter the spread of AI-generated deepfake pornography.

Image source: Shutterstock
