
An Israeli officer has described using an AI tool during the 2021 Gaza offensive.


Amid recent revelations that the Israeli military used an artificial-intelligence tool named Lavender to select bombing targets in Gaza, attention has turned to a year-old video circulating on social media, in which an official discusses the use of machine learning for target identification.

Although the Israel Defense Forces (IDF) have denied that AI was used to identify suspected terrorists, an official from Israel’s cyber intelligence agency described employing machine-learning techniques during the 2021 offensive in Gaza, The Guardian reported.

Describing one such tool, the official, identified as ‘Colonel Yoav’, explained: “Let’s say we have some terrorists that form a group, and we know only some of them… By practicing our data science magic powder, we are able to find the rest of them.”

The video was recorded at a conference held at Tel Aviv University in February 2023, where attendees were told not to photograph or record the official’s presentation.

A member of Unit 8200, the official explained how machine learning was used to locate Hamas squad missile commanders and anti-tank missile terrorists in Gaza during the IDF’s military operation in May 2021.

“We take the original sub-group, we calculate their close circles, we then calculate relevant features, and at last we rank the results and determine the threshold,” The Guardian cited the intel official.
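The four steps the official lists amount to a standard link-analysis pattern: expand outward from a known seed group, score the new contacts, rank them, and cut at a threshold. Purely as an illustration of that pattern, here is a minimal sketch in Python; the contact graph, names, and the single scoring feature (number of ties to seed members) are all invented for this example and do not reflect the actual system.

```python
# Illustrative sketch only: seed group -> close circles -> feature
# scoring -> ranking -> threshold. All data and names are hypothetical.

from collections import defaultdict

def expand_and_rank(contacts, seeds, threshold):
    """Rank non-seed candidates by how many seed members they contact."""
    scores = defaultdict(int)
    for member in seeds:
        for neighbor in contacts.get(member, ()):   # the "close circle"
            if neighbor not in seeds:
                scores[neighbor] += 1               # toy feature: ties to seeds
    # Rank by score, then apply the cutoff threshold.
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    return [(name, s) for name, s in ranked if s >= threshold]

# Toy contact graph: each key maps to the people it is in contact with.
contacts = {
    "A": ["B", "X", "Y"],
    "B": ["A", "X"],
    "X": ["A", "B"],
    "Y": ["A"],
}

# Starting from the known sub-group {A, B}, X is linked to both seeds
# and clears the threshold; Y, linked to only one, does not.
print(expand_and_rank(contacts, seeds={"A", "B"}, threshold=2))  # [('X', 2)]
```

In a real system the single counting feature would be replaced by many learned features, but the expand/score/rank/threshold skeleton matches the steps the official describes.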

While feedback from intelligence officers helped enrich and refine the algorithm, the official emphasized that “people of flesh and blood” ultimately make the decisions. “These tools are meant to help break the human barrier,” he added.

The intel officer highlighted the unit’s achievement in identifying over 200 new targets. He underscored the benefits of the AI tool, stating, “Suddenly you can react during battle with applied data-science-driven solutions.”

The description provided by the colonel aligns with recent disclosures made by six Israeli intelligence officials to +972 Magazine and a Hebrew-language media outlet. These officials revealed the use of an AI-based tool called “Lavender,” which reportedly had a 10% error rate and assisted intelligence officers during the bombing campaign in Gaza, identifying tens of thousands of potential human targets.
