TikTok user rache.lzh5 appears before her nearly 50,000 followers close to tears. She doesn't know how to describe what has happened to her in the past 48 hours, she says. Two days ago, an anonymous account sent her nude photos of herself. The bizarre thing: she had taken the original photos herself, fully clothed.
The video has since been viewed more than a million times. Alongside expressions of solidarity, the comments include other women reporting similar experiences. Many have fallen victim to deepnudes, a particularly insidious form of deepfake.
Deepfakes have already been made of famous figures such as Will Smith, Olaf Scholz, Donald Trump and the Pope: recordings manipulated with artificial intelligence. They have been circulating regularly on the Internet for several years. Many can be quickly exposed as fakes, and some are even quite funny. But some are deceptively real and can have serious consequences.
In March 2022, for example, a video of an alleged surrender appeal by Ukrainian President Zelenskyy circulated on Facebook. A fake, the company later announced. Such forgeries can wreak havoc in other areas as well.
Europol, among others, has warned in the past about criminals using deepfakes, citing extortion, fraud and document forgery in particular. The agency also warned about deepnudes in this context.
The deepnude technology to which the TikTok user fell victim is far from new. In 2019, a developer known as "Alberto" released a tool called Deepnude. The application promised to generate nude photos from images of clothed women.
The application triggered a fierce backlash online. Just a few days after its release, "Alberto" took the software offline again. It had become clear to him that the potential for abuse was too great, he said on Twitter at the time.
The insight came too late, however: the technology had already spread across the Internet, and similar programs soon followed. Since then, deepnude creations have been traded busily, especially in Telegram groups.
With the recent proliferation of AI programs, deepnude technology is now gaining popularity outside of Telegram as well. Random sampling by the BBC showed that the software's results are not particularly realistic, but that is likely little comfort to those affected: only the creator and the person depicted know that the photos are fake; third parties may not.
Once a photo circulates on the Internet, it is almost impossible to remove. Such programs can therefore harm women worldwide. Those affected could lose their jobs, and with them their livelihoods, as a result of the photos.
“Basically, these deepfakes are either being used to fulfill a sick fantasy of a scorned lover, friend or pervert,” political analyst Nina Jankowicz told BuzzFeed News, “or they are being used as potential blackmail material.”
The distribution of deepnudes can also be criminally relevant. Under Section 63 of the German Data Protection Act, disseminating data requiring protection is punishable by up to one year's imprisonment. It is unclear, however, whether this also applies to data that has been altered by artificial intelligence.
(t-online/dsc/bal)
Source: Blick
