112 points (88.4% liked) · submitted on 30 Nov 2023 by stopthatgirl7@kbin.social to c/news@lemmy.world

Permissive airstrikes on non-military targets and the use of an AI system have enabled the Israeli army to carry out its deadliest war on Gaza.

top 8 comments
[-] A_A@lemmy.world 10 points 7 months ago* (last edited 7 months ago)

Gaza is now an extermination camp.
Why are we so slow to catch on to this? I've been saying it for three weeks now.

We made the same mistake in 1945: we could not believe the Germans would do such atrocious things.
Why are we so naive?

[-] DarkGamer@kbin.social 6 points 7 months ago* (last edited 7 months ago)

Fascinating article, thanks!

another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. ...
A human eye “will go over the targets before each attack, but it need not spend a lot of time on them.” Since Israel estimates that there are approximately 30,000 Hamas members in Gaza, and they are all marked for death, the number of potential targets is enormous.
In 2019, the Israeli army created a new center aimed at using AI to accelerate target generation. “The Targets Administrative Division is a unit that includes hundreds of officers and soldiers, and is based on AI capabilities,” said former IDF Chief of Staff Aviv Kochavi in an in-depth interview with Ynet earlier this year.
“This is a machine that, with the help of AI, processes a lot of data better and faster than any human, and translates it into targets for attack,” Kochavi went on. “The result was that in Operation Guardian of the Walls [in 2021], from the moment this machine was activated, it generated 100 new targets every day. You see, in the past there were times in Gaza when we would create 50 targets per year. And here the machine produced 100 targets in one day.”
“We prepare the targets automatically and work according to a checklist,” one of the sources who worked in the new Targets Administrative Division told +972 and Local Call. “It really is like a factory. We work quickly and there is no time to delve deep into the target. The view is that we are judged according to how many targets we manage to generate.”
A senior military official in charge of the target bank told the Jerusalem Post earlier this year that, thanks to the army’s AI systems, for the first time the military can generate new targets at a faster rate than it attacks. Another source said the drive to automatically generate large numbers of targets is a realization of the Dahiya Doctrine.
Automated systems like Habsora have thus greatly facilitated the work of Israeli intelligence officers in making decisions during military operations, including calculating potential casualties. Five different sources confirmed that the number of civilians who may be killed in attacks on private residences is known in advance to Israeli intelligence, and appears clearly in the target file under the category of “collateral damage.”
According to these sources, there are degrees of collateral damage, according to which the army determines whether it is possible to attack a target inside a private residence. “When the general directive becomes ‘Collateral Damage 5,’ that means we are authorized to strike all targets that will kill five or less civilians — we can act on all target files that are five or less,” said one of the sources.
“In the past, we did not regularly mark the homes of junior Hamas members for bombing,” said a security official who participated in attacking targets during previous operations. “In my time, if the house I was working on was marked Collateral Damage 5, it would not always be approved [for attack].” Such approval, he said, would only be received if a senior Hamas commander was known to be living in the home.
“To my understanding, today they can mark all the houses of [any Hamas military operative regardless of rank],” the source continued. “That is a lot of houses. Hamas members who don’t really matter for anything live in homes across Gaza. So they mark the home and bomb the house and kill everyone there.”

This is the first I've heard of this being implemented. Are any other militaries using AI to generate targets?

I certainly hope that, unlike many forms of AI, they are able to see what criteria led to targets being selected, because oftentimes this happens in a black box. Without that visibility, oversight and debugging become difficult if not impossible. Is the point to ensure that no human can be blamed if it goes wrong? This article certainly seems to be making the case that whatever human verification there is is insufficient and that the standards for acceptable civilian casualties are lax.

It would be nice if some of their sources would go on the record, if these accusations regarding target selection are true; I'd like to see the IDF respond to them and clarify its standards for selecting targets and what it considers acceptable collateral damage. Though there are probably serious consequences to whistleblowing during wartime, so I'm not holding my breath.

[-] TWeaK@lemm.ee 2 points 7 months ago

I think many other militaries have been developing such systems, but they haven't actively been deploying them, primarily because they're not at war. The only one that might have is Russia, but there hasn't been any coverage of it using systems like that.

[-] LeafyPasserine@kbin.social 1 points 7 months ago

Well, we'll find out about other militaries soon enough. Stock prices for weapons manufacturers have been booming. The US and EU want a convenient weapons testing ground and a canal to gas fields out of this.

The biggest loser is always innocent civilians at home and abroad.

[-] WidowsFavoriteSon@lemmy.world 4 points 7 months ago
[-] TheMightyKracken 10 points 7 months ago

It seems to be decent from a cursory glance. Were you being sarcastic? No offense, it's hard to tell sometimes.

[-] KoboldCoterie@pawb.social 14 points 7 months ago

Media Bias/Fact Check ranks it as highly factual and highly credible, and with a left-center bias. Seems legit to me.

[-] DarkGamer@kbin.social 9 points 7 months ago* (last edited 7 months ago)

More info about them: it seems to be an Israeli magazine intended for international audiences, formed as a collaboration between several bloggers opposed to the occupation of Palestinian territories, many of whom are activists to that effect.

Edit: fixed link
