The incredibly brave and resourceful Israeli journalist Yuval Abraham revealed Wednesday in a hard-hitting piece of investigative journalism that the Israeli military has used two artificial intelligence programs, “Lavender” and “Where’s Daddy,” to target some 37,000 alleged members of the military wings of Hamas and Islamic Jihad.

The programs used GPS to discover when a Hamas member had gone home, since it was easiest to hit him there, which ensured that his wife and children would also be killed. If he lived in an apartment building, as most did, then all the civilians in the neighboring apartments could be killed as well: children, women, and non-combatant men.

Science fiction writer Martha Wells has authored a series of novels and short stories about a “Murderbot,” an artificial intelligence in the body of an armored warrior. Her Murderbot, despite being lethal, is a good guy, and in noir style frees himself from the control of his corporate overlords to protect his friends.

The Israeli army, in contrast, is acting much more robotically.

Lavender is just a program and doesn’t have a body attached, but uses Israeli fighter jet pilots as an extension of itself.

The AI programs identified Hamas militants according to vague specifications. Lavender is known to have a roughly 10% error rate, and in other cases the supposed militant might have only loose connections to the Qassam Brigades paramilitary or to Islamic Jihad. There was, Abraham writes, almost no human supervision over the workings of the algorithm.

At a 10% error rate, Lavender could have identified 3,700 men in Gaza as Hamas guerrillas when they weren’t. Since as many as 20 civilians could be killed in each strike on each of these innocents, that works out to 74,000 bystanders on top of the 3,700 misidentified men themselves, a total of 77,700 noncombatants blown arbitrarily away by an inaccurate machine.

One of Abraham’s sources inside the Israeli army, an intelligence officer identified as A., told +972 and Local Call: “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

I hope the International Court of Justice, which is considering whether Israel is committing a genocide, is reading +972 Mag.

The AI program included extremely loose rules of engagement on civilian casualties. It was set to permit 10 to 20 civilians to be killed as part of a strike on a low-level Hamas member, and up to 100 civilians to be killed to get at a senior member. These rules of engagement are unprecedented even for the brutal Israeli army.

The “Where’s Daddy” program then tracked the men so identified and flagged the moment they arrived at their family homes.

October 7 was not carried out by 37,000 Hamas paramilitary fighters. Most of them did not know about it beforehand; it was planned and executed by a tiny, tight clique. The civilian wing of Hamas was the elected government of Gaza, and its security forces provided law and order (refugee camps are most often lawless). It may be that Lavender and “Where’s Daddy” swept ordinary police into the definition of low-level Hamas fighters, which would explain a lot.

This new video game way of war violates the Rules of Engagement of the U.S. military and all the precepts of International Humanitarian Law. The Marine Corps Rules of Engagement require, among other things, that troops “minimize collateral damage.”

None of the Israeli “soldiers” operating Lavender were in danger from the civilians they killed. They made no effort to “minimize collateral damage.” In fact, they built very substantial collateral damage into their standard operating procedure.

If the Israeli military killed an average of 20 civilians each time they struck one of the 37,000 alleged militants, that would be 740,000 deaths, nearly three-quarters of a million. Of babies, toddlers, pregnant mothers, unarmed women, unarmed teenagers, etc., etc. That would be about a third of the total Gaza population.

That is certainly a genocide, however you wish to define the term.

And there is no way that Joe Biden and Antony Blinken haven’t known all this all along. It is on them.
