Since the end of the ceasefire, Israel has again been attacking suspected Hamas homes and tunnels in the Gaza Strip. What was little known until now is how Israel chooses its targets. The army and intelligence services listen in on the terrorists’ radio communications, and informants report suspicious movements. But another instrument is far more important: the “Gospel”.
This is a system with the Hebrew name “Habsora”, “gospel” in English, which suggests possible targets. It is driven by artificial intelligence and works at breathtaking speed, as the British Guardian found in an extensive investigation together with the magazine “+972” and the newspaper “Local Call”. In early November, the Israeli army announced that it had identified more than 12,000 attack targets in the Gaza Strip. The Gospel probably played an important role in this: where previously around 50 targets in the Gaza Strip could be identified per year, thanks to AI there are now 100 targets in a single day.
The Israel Defense Forces (IDF) says it has attacked hundreds of targets in the Gaza Strip since the ceasefire ended on Friday. The Gospel may have helped again. According to the Guardian, the IDF explained that the system creates recommendations “through the rapid and automatic extraction of information,” “with the aim of achieving full agreement between the machine’s recommendation and the identification performed by an individual.”
In recent years, the AI is said to have built up a database of 30,000 to 40,000 people who are considered possible opponents of Israel.
Aviv Kochavi, who headed the IDF until January, praised the AI systems even before the Hamas attack. The Gospel is “a machine that produces vast amounts of data more efficiently than any human and turns it into attack targets.” When it was first activated two years ago, it could spit out up to 100 targets per day, about half of which were then attacked.
“The IDF in 2023 is not only different from the IDF in ’82 or ’73, but also from ten years ago. Each brigade now has a sophisticated intelligence apparatus reminiscent of the movie ‘The Matrix’ that provides information in real time,” he said. Of all the technological revolutions, “artificial intelligence will probably be the most radical, for better or for worse,” Kochavi said.
Where the AI gets its information from is secret. According to the Guardian, experts assume that drone footage, intercepted conversations, electronic communications and other data sources form the basis. From these, patterns are derived about how and where suspects are likely to be located.
AI-generated targets must be selected with great care, Israeli media report. The newspaper “Yedioth Ahronoth” wrote that this is meant to prevent civilian casualties. A senior Israeli military source told the Guardian that the military uses a “very precise” measure of the number of civilians leaving a building shortly before an attack. “We use an algorithm to determine how many civilians are left. It shows us green, yellow, red, like a traffic light.”
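How such a traffic-light indicator might work is not public. The following minimal Python sketch only illustrates the kind of logic described in the quote above; the function name and the threshold values are invented assumptions, not details of the actual system.

```python
# Purely illustrative sketch of the "traffic light" idea described above.
# The thresholds, names and the notion of simple cut-offs are assumptions
# made for illustration only; the real system has not been disclosed.

def traffic_light(estimated_civilians_remaining: int) -> str:
    """Map an estimated number of civilians still inside a building
    to a green/yellow/red label, as the quoted source describes."""
    GREEN_MAX = 0   # assumption: no civilians believed to remain
    YELLOW_MAX = 5  # assumption: only a small number believed to remain

    if estimated_civilians_remaining <= GREEN_MAX:
        return "green"
    if estimated_civilians_remaining <= YELLOW_MAX:
        return "yellow"
    return "red"

# Example: an estimate of 12 remaining civilians would be flagged "red".
print(traffic_light(12))
```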
However, the high number of civilian casualties reported from Gaza casts doubt on such precision. Even taking into account that the figures come from the Hamas-controlled health authority, they are still very high, as even Prime Minister Benjamin Netanyahu has admitted.
“Look at the physical landscape of the Gaza Strip,” Richard Moyes, a researcher who heads Article 36, a group working to reduce harm from weapons, told the Guardian. “We are seeing a large part of an urban area being leveled with high explosives, so the claim of precise and limited use of force is not supported by the facts.”
According to the IDF, 15,000 targets were defined in the first 35 days of the war, and attacks followed accordingly. The Guardian and the Israeli journalists at +972 and Local Call report that once a target has been selected, the army already knows how many civilian casualties an attack is expected to cause.
Sources familiar with such AI-based systems say the tools have significantly accelerated the process of generating targets.
“We prepare the targets automatically and work according to a checklist,” a source who previously worked in the target division told “+972” and “Local Call”. “It really is like being in a factory. We work quickly and there is no time to delve deep into the target. We are judged by how many targets we manage to generate.”
Richard Moyes therefore warns against the use of AI systems. “The danger is,” he said, “that people who rely on these systems will become cogs in a mechanized process and lose the ability to meaningfully consider the risk of civilian harm.”
Source: Watson