Israel uses the Lavender artificial intelligence system in Gaza: 20 seconds of life or death

The Israeli military relies on AI to search for Hamas fighters. It acts almost autonomously, accepting civilian casualties and suspecting tens of thousands.

Satellite image shows destruction of area around Al-Shifa hospital in Gaza

Total destruction and civilian casualties are accepted. Photo: Maxar Technologies/Reuters

A recent investigation illustrates the dystopia of the war in the Gaza Strip. According to the online Israeli-Palestinian outlet +972 Magazine, the Israeli military uses an artificial intelligence program called “Lavender” to find Hamas targets. In total, the AI has labeled at least 37,000 people as “terrorists” and approved them for killing.

According to information from whistleblowers, civilian casualties are knowingly accepted. For low-ranking Hamas fighters, 15 to 20 civilian deaths were deemed tolerable; for high-ranking commanders, even more than 100 civilian deaths were accepted.

Equally shocking is the fact that apartment blocks where families live are deliberately bombed because, according to the military, it is easier to locate targets when they are at home with their families.

“Lavender” acts almost completely autonomously. One source reports that for a person the AI classifies as a suspect, about 20 seconds are spent verifying the classification. During this time, the reviewer simply checks whether the target is a man.

Other findings are strikingly reminiscent of other conflicts, such as the American “war on terror” in Afghanistan and elsewhere. In 2012 it became known that Washington considered any “male of military age” in the area of a drone strike to be an “enemy combatant” per se, unless proven otherwise. “Lavender” does not stop at Palestinian minors either.

In fact, mass killings using AI are a direct result of the “push-button killing” established in the wake of the counterterrorism wars of the past two decades. Any understanding of ethics, morality and the rule of law is lost in this type of war.

Moreover, this approach is not successful: it is unreliable and generates new terrorists. The disastrous outcome of the war in Afghanistan makes this clear as well: many of the Taliban leaders targeted in recent years by supposedly precise drone operations now rule in Kabul and are stronger than ever.
