
Ethical implications of “The Gospel”, Israel’s AI warfare operation in Gaza

The Israel Defense Forces (IDF) are using artificial intelligence in their military operations against Hamas in the Gaza Strip, centred on an AI target-creation platform called “the Gospel”. This platform has significantly accelerated the IDF’s ability to identify and strike targets. The IDF claims these strikes are precise and minimize civilian casualties, but the Guardian’s report raises concerns about the reliability of those claims and the potential risks to civilians.

The report details the IDF’s process of target selection, emphasizing the role of AI in rapidly generating targets for attack and how this has increased the number of targets identified and attacked compared with previous conflicts. Experts remain skeptical of Israel’s claims of precision and reduced civilian harm. The IDF has identified over 12,000 targets in Gaza, with a large increase in the number of targets generated per day due to the AI system.

The use of AI in warfare raises ethical and legal concerns, particularly over the potential for increased civilian casualties and the risk of over-reliance on automated systems in life-and-death decisions. Israel’s approach may also attract international attention, as other countries observe and learn from its use of AI in military operations.

The Guardian reports:

Israel’s military has made no secret of the intensity of its bombardment of the Gaza Strip. In the early days of the offensive, the head of its air force spoke of relentless, “around the clock” airstrikes. His forces, he said, were only striking military targets, but he added: “We are not being surgical.”

There has, however, been relatively little attention paid to the methods used by the Israel Defense Forces (IDF) to select targets in Gaza, and to the role artificial intelligence has played in their bombing campaign.

As Israel resumes its offensive after a seven-day ceasefire, there are mounting concerns about the IDF’s targeting approach in a war against Hamas that, according to the health ministry in Hamas-run Gaza, has so far killed more than 15,000 people in the territory.

The IDF has long burnished its reputation for technical prowess and has previously made bold but unverifiable claims about harnessing new technology. After the 11-day war in Gaza in May 2021, officials said Israel had fought its “first AI war” using machine learning and advanced computing.

The latest Israel-Hamas war has provided an unprecedented opportunity for the IDF to use such tools in a much wider theatre of operations and, in particular, to deploy an AI target-creation platform called “the Gospel”, which has significantly accelerated a lethal production line of targets that officials have compared to a “factory”.

The Guardian can reveal new details about the Gospel and its central role in Israel’s war in Gaza, using interviews with intelligence sources and little-noticed statements made by the IDF and retired officials.

Read the full article.
