“It’s a tragic horror story that shook the entire nation,” says a childlike, high-pitched yet unnatural voice. A girl’s head bobs slightly up and down to create the illusion that a real person is speaking. Then the voice continues: “My name is Becky Watts. I was born in Bristol, England, in 1998 as the eldest of five siblings. My childhood, however, was not idyllic; it was marked by neglect and abuse from the very people who were supposed to love and protect me. When I was 16, my life turned into a real nightmare. My stepbrother Nathan Matthews and his partner Shauna Hoare began their campaign of terror: they stole my belongings and locked me in my room for hours.”
Hundreds of videos on TikTok start in this style: children generated by an AI recount their own murder or kidnapping. In the example above, the story is largely true; Becky Watts was murdered by her stepbrother in 2015. Smaller details, however, deviate from the truth or are embellished. For example, there is no record of her being the eldest of five children.
At the end of the videos, the children thank viewers for listening and ask them to follow them or the TikTok channel. The videos are usually set to epic or dramatic music.
[Embedded TikTok video by @mystorymatters99, set to “Emotional cinematic sad violin and piano”]
Hundreds of videos of children who were victims of violent crimes can be found on TikTok under the hashtag #animatedhistory. There are also videos of historical figures, such as Tupac, JFK, or the Queen, telling their life stories as AI characters.
Paul Bleakley, assistant professor of criminal justice at the University of New Haven, commented on the true-crime videos for Rolling Stone.
According to Bleakley, such videos can be difficult for victims and their relatives to process. “There is a chance that victims will be re-traumatized,” warns Bleakley. “Imagine being a parent or relative of one of the children in these AI videos. You go online and suddenly see an AI-generated image modeled after your deceased child, accompanied by a strange high-pitched voice describing what happened to them. It can be very disturbing.”
Bleakley also noted that the AI videos don’t have to stop at talking portraits: it is also conceivable that true-crime fans could use artificial intelligence to recreate entire crime scenes.
The videos are not entirely unproblematic from a legal perspective either. According to Martin Steiger, a lawyer and expert on law in the digital space, deepfakes of people who are still alive can be pursued under civil law: living individuals enjoy protection of their personality rights. That protection, however, lapses at death. If the maker of a deepfake cannot be identified, the distributors, in this case TikTok, can also be held liable.
Defaming a deceased person, however, can still constitute a criminal offense, and the next of kin can then file a criminal complaint. If there is no violation of the deceased’s honor, though, the next of kin are powerless.
Steiger also confirmed that there are no special rules on deepfakes in Switzerland. General criminal provisions nevertheless apply, for example those against depictions of prohibited violence or hard pornography.
Source: Watson
I’m Ella Sammie, an author specializing in the technology sector. I have been writing for 24 Instant News since 2020, and am passionate about staying up to date with the latest developments in this ever-changing industry.