"This is the first confirmation we have gotten that commercial AI models are directly being used in warfare," Heidy Khlaaf, chief artificial intelligence scientist at the
AI Now Institute and a former senior safety engineer at OpenAI, which makes ChatGPT, told the AP. "The implications are enormous for the role of tech in enabling this type of unethical and unlawful warfare going forward."
As Biesecker, Mednick, and Burke noted:
Israel's goal after the attack that killed about 1,200 people and took over 250 hostages was to eradicate Hamas, and its military has called AI a "game changer" in yielding targets more swiftly. Since the war started, more than 50,000 people have died in Gaza and Lebanon and nearly 70% of the buildings in Gaza have been devastated, according to health ministries in Gaza and Lebanon.
According to the AP report, Israel buys advanced AI models from OpenAI and Microsoft's Azure cloud platform. While OpenAI said it has no partnership with the Israel Defense Forces (IDF), in early 2024 the company quietly removed language from its usage policy that prohibited military use of its technology.
The AP reporters also found that Google and Amazon provide cloud computing and AI services to the IDF via Project Nimbus, a $1.2 billion contract signed in 2021. Furthermore, the IDF uses Cisco and Dell server farms or data centers. Red Hat, an independent IBM subsidiary, sells cloud computing services to the IDF. Microsoft partner Palantir Technologies also has a "strategic partnership" with Israel's military.
Google told the AP that the company is committed to creating AI "that protects people, promotes global growth, and supports national security."
However, Google recently removed from its Responsible AI principles a commitment to not use AI for the development of technology that could cause "overall harm," including weapons and surveillance.
The AP investigation follows a Washington Post probe published last month detailing how Google has been "directly assisting" the IDF and Israel's Ministry of Defense "despite the company's efforts to publicly distance itself from the country's national security apparatus after employee protests against a cloud computing contract with Israel's government."
Google fired dozens of workers following their participation in "No Tech for Apartheid" protests against the use of the company's products and services by forces accused of genocide in Gaza.
"A Google employee warned in one document that if the company didn't quickly provide more access, the military would turn instead to Google's cloud rival Amazon, which also works with Israel's government under the Nimbus contract," wrote Gerrit De Vynck, author of the
Post report.
"As recently as November 2024, by which time a year of Israeli airstrikes had turned much of Gaza to
rubble, documents show Israel's military was still tapping Google for its latest AI technology," De Vynck added. "Late that month, an employee requested access to the company's Gemini AI technology for the IDF, which wanted to develop its own AI assistant to process documents and audio, according to the documents."
Previous investigations have detailed how the IDF also uses Habsora, an Israeli AI system that can automatically generate airstrike targets far faster than ever before.
"In the past, there were times in Gaza when we would create 50 targets per year. And here the machine produced 100 targets in one day," former IDF Chief of Staff Aviv Kochavi told Yuval Abraham of +972 Magazine, a joint Israeli-Palestinian publication, in 2023. Another intelligence source said that Habsora has transformed the IDF into a "mass assassination factory" in which the "emphasis is on quantity and not quality" of kills.
Compounding the crisis, in the heated hours following the October 7 attack, mid-ranking IDF officers were empowered to order attacks not only on senior Hamas commanders but on any fighter in the resistance group, no matter how junior. What's more, the officers were allowed to risk up to 20 civilian lives in each strike, and up to 500 noncombatant lives per day. Days later, that limit was lifted: officers could order as many strikes as they believed were legal, with no limits on civilian harm.
Senior IDF commanders sometimes approved strikes they knew could kill more than 100 civilians if the target was deemed important enough. In one AI-aided airstrike targeting a single senior Hamas commander, the IDF dropped multiple U.S.-supplied 2,000-pound bombs, which can level an entire city block, on the Jabalia refugee camp in October 2023. According to the U.K.-based airstrike monitor Airwars, the bombing killed at least 126 people, 68 of them children, and wounded 280 others. Hamas' Qassam Brigades said four Israeli and three international hostages were also killed in the attack.
Then there's the mass surveillance element. Independent journalist Antony Loewenstein recently wrote for Middle East Eye that "corporate behemoths are storing massive amounts of information about every aspect of Palestinian life in Gaza, the occupied West Bank, and elsewhere."
"How this data will be used, in a time of war and mass surveillance, is obvious," Loewenstein continued. "Israel is building a huge database,
Chinese-state style, on every Palestinian under occupation: what they do, where they go, who they see, what they like, what they want, what they fear, and what they post online."
"Palestinians are guinea pigs—but this ideology and work doesn't stay in Palestine," he said. "Silicon Valley has taken note, and the new Trump era is heralding an
ever-tighter alliance among Big Tech, Israel, and the defense sector. There's money to be made, as AI currently operates in a regulation-free zone globally."
"Think about how many other states, both democratic and dictatorial, would love to have such extensive information about every citizen, making it far easier to target critics, dissidents, and opponents," Loewenstein added. "With the
far right on the march globally—from Austria to Sweden, France to Germany, and the U.S. to Britain—Israel's ethno-nationalist model is seen as attractive and worth mimicking.