Categories: World

Deceptively genuine fakes: FBI warns against deepfake blackmail videos

It may not even have happened in reality: AI techniques also make it possible to create erotic content that never existed.

The capabilities of generative artificial intelligence (AI) range from impressive to alarming. Generative AI can produce text, images and video from simple prompts. AI-generated images can appear so lifelike that the FBI is now warning about deepfakes that criminals use to blackmail their victims. No one is immune to such techniques.

“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” the agency wrote in a warning issued to the American public on Monday. The threat is not limited to people in the United States.

Deepfakes are increasingly common AI-generated video or audio recordings that simulate events that never took place. Thanks to AI platforms such as Midjourney and OpenAI's DALL-E, it is becoming ever more difficult to identify such deepfakes as fake.

No more science fiction

Accordingly, reports of online blackmail, so-called “sextortion scams,” which the FBI says primarily target minors, are on the rise. And deepfakes are not limited to explicit content. In May, a deepfake of Tesla and Twitter CEO Elon Musk (51) went viral: the video, shared on social media, spliced together footage of Musk from earlier interviews, edited to fit the scam.

The FBI warns against paying any ransom, as there is no guarantee that the criminals will not release the deepfake anyway. The agency also advises exercising extreme caution when sharing personal information and content on the internet.

“Artificial intelligence is no longer a far-fetched sci-fi movie idea,” the U.S. Federal Trade Commission (FTC) warned in March. “We live with it, here and now. A scammer can use artificial intelligence to clone a loved one's voice.” Criminals need only a short audio clip of a family member's voice to make a fabricated recording, such as a message asking for money, sound real. (kes)

Source: Blick
