Deceptively real fakes: FBI warns of deepfake blackmail videos

It may not even have happened in reality: AI techniques also make it possible to create erotic content that never existed.

The capabilities of generative artificial intelligence (AI) range from impressive to alarming. The term covers any text, image or video produced by such systems. AI-generated images can look so lifelike that the FBI is now warning about deepfakes that criminals use to blackmail their victims. No one is immune to these techniques.

“The FBI continues to receive reports of victims, including underage children and non-consenting adults, whose photos or videos have been turned into explicit content,” the agency wrote in a warning to the American public on Monday — a warning that concerns not only people in the United States.

Deepfakes are AI-generated video or audio content that simulates events that never happened, and they are becoming increasingly common. Thanks to AI platforms such as Midjourney, DALL-E and OpenAI’s tools, it is becoming ever harder to identify such deepfakes as fake.

No more science fiction

Accordingly, reports of online blackmail, so-called “sextortion scams,” which the FBI says primarily target minors, are on the rise. Deepfakes know no limits: in May, a deepfake of Tesla and Twitter CEO Elon Musk (51) went viral. The video, shared on social media, used footage of Musk from earlier interviews, edited to fit the scam.

The FBI warns against paying a ransom, as payment is no guarantee that the criminals will not release the deepfake anyway. The agency also advises extreme caution when sharing personal information and content online.

“Artificial intelligence is no longer a far-fetched sci-fi movie idea,” the U.S. Federal Trade Commission (FTC) warned in March. “We live with it, here and now. A scammer can use artificial intelligence to clone a loved one’s voice.” Criminals need only a short audio clip of a family member’s voice to make a fake recording, such as a message asking for money, sound convincing. (kes)

Source: Blick

