The Israeli military has reportedly used artificial intelligence (AI) to identify bombing targets in Gaza, according to an investigation by +972 Magazine and Local Call.
Six Israeli intelligence officers involved in the alleged program said that human review of the suggested targets was minimal.
The AI tool, dubbed "Lavender," was said to have a 10% error rate.
The Israel Defense Forces (IDF), when questioned about the report, did not deny the tool's existence but denied using AI to identify suspected terrorists.
The IDF emphasized that its information systems are tools for analysts and that it works to minimize harm to civilians.
However, according to +972 Magazine, human personnel often served merely as a "rubber stamp" for the machine's decisions, spending only about 20 seconds on each target before authorizing a strike.
This investigation coincides with heightened international scrutiny of Israel's military operations, particularly following airstrikes that killed foreign aid workers in Gaza.
The Gaza Ministry of Health has reported at least 32,916 deaths during Israel's siege and bombardment, which have led to a severe humanitarian crisis.
The IDF said it does not use AI to identify terrorists but instead relies on a database that cross-references intelligence sources.
Human officers are responsible for verifying targets in accordance with international law.
The magazine also reported that the Israeli army frequently targeted individuals in their homes, often resulting in civilian casualties.
The IDF claims to choose munitions carefully to minimize collateral damage and reduce harm to civilians.
Israeli officials argue that heavy munitions are necessary to combat Hamas, which they hold responsible for numerous casualties and hostage-taking incidents in Israel.