"The Israeli Lavender system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join," said a Palestinian digital rights group.
The Palestinian digital rights group Sada Social on Saturday called for an investigation into Israel's alleged use of WhatsApp user data to target Palestinians with its AI system, Lavender.
The group, which is affiliated with the Al Jazeera Media Institute and Access Now, accused Meta, which owns WhatsApp, of fueling "the 'Lavender' artificial intelligence system used by the Israeli military to kill Palestinian individuals within the Gaza enclave."
As Common Dreams reported in April, the Israel Defense Forces has relied on AI systems including Lavender to target people Israel believes to be Hamas members.
At +972 Magazine, Israeli journalist Yuval Abraham wrote that a current commander of an elite Israeli intelligence unit had pushed for the use of AI to choose targets in Gaza. In a guide the commander wrote on building such a system, he said that "hundreds and thousands" of features can be used to select targets, "such as being in a WhatsApp group with a known militant, changing cell phone every few months, and changing addresses frequently."
Sada Social asserted that it had found the Lavender system uses WhatsApp data to select targets.
"The reports monitored by the Sada Social Center indicate that one of the inputs to the 'Lavender' system relies on data collected from WhatsApp groups containing names of Palestinians or activists who are wanted by 'Israel,'" said the group in a press release. "The Israeli Lavender system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join."
The mention of Israel's use of WhatsApp data in Abraham's reporting also caught the attention of Paul Biggar, founder of Tech for Palestine, last month.
"There's a lot wrong with this—I'm in plenty of WhatsApp groups with strangers, neighbors, and in the carnage in Gaza you bet people are making groups to connect," wrote Biggar. "But the part I want to focus on is whether they get this information from Meta. Meta has been promoting WhatsApp as a 'private' social network, including 'end-to-end' encryption of messages."
"Providing this data as input for Lavender undermines their claim that WhatsApp is a private messaging app," he wrote. "It is beyond obscene and makes Meta complicit in Israel's killings of 'pre-crime' targets and their families, in violation of international humanitarian law and Meta's publicly stated commitment to human rights. No social network should be providing this sort of information about its users to countries engaging in 'pre-crime.'"
Others have pointed out that Israel may have acquired WhatsApp data through means other than a leak by Meta.
Journalist Marc Owen Jones said the question of "Meta's potential role in this is important," but noted that informants, captured devices, and spyware could be used by Israel to gain Palestinian users' WhatsApp data.
Bahraini activist Esra'a Al Shafei, founder of Majal.org, told the Middle East Monitor that the reports that WhatsApp user data has been used by the IDF's AI machine demonstrate why privacy advocates warn against the collection and storage of metadata, "particularly for apps like WhatsApp, which falsely advertise their product as fully private."
"Even though WhatsApp is end-to-end encrypted, and claims to not have any backdoors to any government, the metadata alone is sufficient to expose detailed information about users, especially if the user's phone number is attached to other Meta products and related activities," Al Shafei said. "This is why the IDF could plausibly utilize metadata to track and locate WhatsApp users."
While Meta and WhatsApp may not necessarily be collaborating with Israel, she said, "by the very act of collecting this information, they're making themselves vulnerable to abuse and intrusive external surveillance."
In turn, "by using WhatsApp, people are risking their lives," she added.
A WhatsApp spokesperson told Anadolu last month that "WhatsApp has no backdoors and we do not provide bulk information to any government," adding that "Meta has provided consistent transparency reports and those include the limited circumstances when WhatsApp information has been requested."
Al Shafei said Meta must "fully investigate" how WhatsApp's metadata may be used "to track, harm, or kill its users throughout Palestine."
"WhatsApp is used by billions of people and these users have a right to know what the dangers are in using the app," she said, "or what WhatsApp and Meta will do to proactively protect them from such misuse."
The Lavender AI system is a new weapon developed by Israel, but the kind of kill lists that it generates have a long pedigree in U.S. wars, occupations, and CIA regime change operations.
The Israeli online magazine +972 has published a detailed report on Israel’s use of an artificial intelligence system called “Lavender” to target thousands of Palestinian men in its bombing campaign in Gaza. When Israel attacked Gaza after October 7, the Lavender system had a database of 37,000 Palestinian men with suspected links to Hamas or Palestinian Islamic Jihad.
Lavender assigns a numerical score, from 1 to 100, to every man in Gaza, based mainly on cellphone and social media data, and automatically adds those with high scores to its kill list of suspected militants. Israel uses another automated system, known as “Where’s Daddy?”, to call in airstrikes to kill these men and their families in their homes.
The report is based on interviews with six Israeli intelligence officers who have worked with these systems. As one of the officers explained to +972, by adding a name from a Lavender-generated list to the Where’s Daddy home tracking system, he can place the man’s home under constant drone surveillance, and an airstrike will be launched once he comes home.
The officers said the “collateral” killing of the men’s extended families was of little consequence to Israel. “Let’s say you calculate [that there is one] Hamas [operative] plus 10 [civilians in the house],” the officer said. “Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children.”
The officers explained that the decision to target thousands of these men in their homes is just a question of expediency. It is simply easier to wait for them to come home to the address on file in the system, and then bomb that house or apartment building, than to search for them in the chaos of the war-torn Gaza Strip.
The officers who spoke to +972 explained that in previous Israeli massacres in Gaza, they could not generate targets quickly enough to satisfy their political and military bosses, and so these AI systems were designed to solve that problem for them. The speed with which Lavender can generate new targets gives its human minders an average of only 20 seconds to review and rubber-stamp each name, even though they know from tests of the Lavender system that at least 10% of the men chosen for assassination and familicide have only an insignificant or a mistaken connection with Hamas or Palestinian Islamic Jihad (PIJ).
The Lavender AI system is a new weapon, developed by Israel. But the kind of kill lists that it generates have a long pedigree in U.S. wars, occupations, and CIA regime change operations. Since the birth of the CIA after the Second World War, the technology used to create kill lists has evolved from the CIA’s earliest coups in Iran and Guatemala, to Indonesia and the Phoenix Program in Vietnam in the 1960s, to Latin America in the 1970s and 1980s and to the U.S. occupations of Iraq and Afghanistan.
Just as U.S. weapons development aims to be at the cutting edge, or the killing edge, of new technology, the CIA and U.S. military intelligence have always tried to use the latest data processing technology to identify and kill their enemies.
The CIA learned some of these methods from German intelligence officers captured at the end of the Second World War. Many of the names on Nazi kill lists were generated by an intelligence unit called Fremde Heere Ost (Foreign Armies East), under the command of Major General Reinhard Gehlen, Germany’s spy chief on the eastern front (see David Talbot, The Devil’s Chessboard, p. 268).
Gehlen and the FHO had no computers, but they did have access to 4 million Soviet POWs from all over the USSR, and no compunction about torturing them to learn the names of Jews and communist officials in their hometowns to compile kill lists for the Gestapo and Einsatzgruppen.
After the war, like the 1,600 German scientists spirited out of Germany in Operation Paperclip, the United States flew Gehlen and his senior staff to Fort Hunt in Virginia. They were welcomed by Allen Dulles, soon to become the CIA’s first civilian director and still its longest-serving. Dulles sent them back to Pullach in occupied Germany to resume their anti-Soviet operations as CIA agents. The Gehlen Organization formed the nucleus of what became the BND, the new West German intelligence service, with Reinhard Gehlen as its director until he retired in 1968.
After a CIA coup removed Iran’s popular, democratically elected Prime Minister Mohammad Mosaddegh in 1953, a CIA team led by U.S. Major General Norman Schwarzkopf trained a new intelligence service, known as SAVAK, in the use of kill lists and torture. SAVAK used these skills to purge Iran’s government and military of suspected communists and later to hunt down anyone who dared to oppose the Shah.
By 1975, Amnesty International estimated that Iran was holding between 25,000 and 100,000 political prisoners, and had “the highest rate of death penalties in the world, no valid system of civilian courts, and a history of torture that is beyond belief.”
In Guatemala, a CIA coup in 1954 replaced the democratic government of Jacobo Arbenz Guzman with a brutal dictatorship. As resistance grew in the 1960s, U.S. special forces joined the Guatemalan army in a scorched-earth campaign in Zacapa, which killed 15,000 people to defeat a few hundred armed rebels. Meanwhile, CIA-trained urban death squads abducted, tortured, and killed PGT (Guatemalan Labor Party) members in Guatemala City, notably 28 prominent labor leaders who were abducted and disappeared in March 1966.
Once this first wave of resistance was suppressed, the CIA set up a new telecommunications center and intelligence agency, based in the presidential palace. It compiled a database of “subversives” across the country that included leaders of farming co-ops and labor, student, and Indigenous activists to provide ever-growing lists for the death squads. The resulting civil war became a genocide against Indigenous people in Ixil and the western highlands that killed or disappeared at least 200,000 people.
This pattern was repeated across the world, wherever popular, progressive leaders offered hope to their people in ways that challenged U.S. interests. As historian Gabriel Kolko wrote in 1988, “The irony of U.S. policy in the Third World is that, while it has always justified its larger objectives and efforts in the name of anticommunism, its own goals have made it unable to tolerate change from any quarter that impinged significantly on its own interests.”
When General Suharto seized power in Indonesia in 1965, the U.S. Embassy compiled a list of 5,000 communists for his death squads to hunt down and kill. The CIA estimated that they eventually killed 250,000 people, while other estimates run as high as a million.
Twenty-five years later, journalist Kathy Kadane investigated the U.S. role in the massacre in Indonesia, and spoke to Robert Martens, the political officer who led the State-CIA team that compiled the kill list. “It really was a big help to the army,” Martens told Kadane. “They probably killed a lot of people, and I probably have a lot of blood on my hands. But that’s not all bad—there’s a time when you have to strike hard at a decisive moment.”
Kadane also spoke to former CIA director William Colby, who was the head of the CIA’s Far East division in the 1960s. Colby compared the U.S. role in Indonesia to the Phoenix Program in Vietnam, which was launched two years later, claiming that they were both successful programs to identify and eliminate the organizational structure of America’s communist enemies.
The Phoenix Program was designed to uncover and dismantle the National Liberation Front’s (NLF) shadow government across South Vietnam. Phoenix’s Combined Intelligence Center in Saigon fed thousands of names into an IBM 1401 computer, along with their locations and their alleged roles in the NLF. The CIA credited the Phoenix Program with killing 26,369 NLF officials, while another 55,000 were imprisoned or persuaded to defect. Seymour Hersh reviewed South Vietnamese government documents that put the death toll at 41,000.
How many of the dead were correctly identified as NLF officials may be impossible to know, but Americans who took part in Phoenix operations reported killing the wrong people in many cases. Navy SEAL Elton Manzione told author Douglas Valentine (The Phoenix Program) how he killed two young girls in a night raid on a village, and then sat down on a stack of ammunition crates with a hand grenade and an M-16, threatening to blow himself up, until he got a ticket home.
“The whole aura of the Vietnam War was influenced by what went on in the ‘hunter-killer’ teams of Phoenix, Delta, etc,” Manzione told Valentine. “That was the point at which many of us realized we were no longer the good guys in the white hats defending freedom—that we were assassins, pure and simple. That disillusionment carried over to all other aspects of the war and was eventually responsible for it becoming America’s most unpopular war.”
Even as the U.S. defeat in Vietnam and the “war fatigue” in the United States led to a more peaceful next decade, the CIA continued to engineer and support coups around the world, and to provide post-coup governments with increasingly computerized kill lists to consolidate their rule.
After supporting General Augusto Pinochet’s coup in Chile in 1973, the CIA played a central role in Operation Condor, an alliance between right-wing military governments in Argentina, Brazil, Chile, Uruguay, Paraguay, and Bolivia to hunt down tens of thousands of their and each other’s political opponents and dissidents, killing and disappearing at least 60,000 people.
The CIA’s role in Operation Condor is still shrouded in secrecy, but Patrice McSherry, a political scientist at Long Island University, has investigated the U.S. role and concluded, “Operation Condor also had the covert support of the U.S. government. Washington provided Condor with military intelligence and training, financial assistance, advanced computers, sophisticated tracking technology, and access to the continental telecommunications system housed in the Panama Canal Zone.”
McSherry’s research revealed how the CIA supported the intelligence services of the Condor states with computerized links, a telex system, and purpose-built encoding and decoding machines made by the CIA Logistics Department. As she wrote in her book, Predatory States: Operation Condor and Covert War in Latin America:
The Condor system’s secure communications system, Condortel,... allowed Condor operations centers in member countries to communicate with one another and with the parent station in a U.S. facility in the Panama Canal Zone. This link to the U.S. military-intelligence complex in Panama is a key piece of evidence regarding secret U.S. sponsorship of Condor…
Operation Condor ultimately failed, but the U.S. provided similar support and training to right-wing governments in Colombia and Central America throughout the 1980s in what senior military officers have called a “quiet, disguised, media-free approach” to repression and kill lists.
The U.S. School of the Americas (SOA) trained thousands of Latin American officers in the use of torture and death squads, as Major Joseph Blair, the SOA’s former chief of instruction, described to John Pilger for his film, The War You Don’t See:
The doctrine that was taught was that, if you want information, you use physical abuse, false imprisonment, threats to family members, and killing. If you can’t get the information you want, if you can’t get the person to shut up or stop what they’re doing, you assassinate them—and you assassinate them with one of your death squads.
When the same methods were transferred to the U.S. hostile military occupation of Iraq after 2003, Newsweek headlined it “The Salvador Option.” A U.S. officer explained to Newsweek that U.S. and Iraqi death squads were targeting Iraqi civilians as well as resistance fighters. “The Sunni population is paying no price for the support it is giving to the terrorists,” he said. “From their point of view, it is cost-free. We have to change that equation.”
The United States sent two veterans of its dirty wars in Latin America to Iraq to play key roles in that campaign. Colonel James Steele led the U.S. Military Advisor Group in El Salvador from 1984 to 1986, training and supervising Salvadoran forces who killed tens of thousands of civilians. He was also deeply involved in the Iran-Contra scandal, narrowly escaping a prison sentence for his role supervising shipments from Ilopango air base in El Salvador to the U.S.-backed Contras in Honduras and Nicaragua.
In Iraq, Steele oversaw the training of the Interior Ministry’s Special Police Commandos—rebranded as “National” and later “Federal” Police after the discovery of their al-Jadiriyah torture center and other atrocities.
Bayan al-Jabr, a commander in the Iranian-trained Badr Brigade militia, was appointed interior minister in 2005, and Badr militiamen were integrated into the Wolf Brigade death squad and other Special Police units. Jabr’s chief adviser was Steven Casteel, the former intelligence chief for the U.S. Drug Enforcement Administration (DEA) in Latin America.
The Interior Ministry death squads waged a dirty war in Baghdad and other cities, filling the Baghdad morgue with up to 1,800 corpses per month, while Casteel fed the Western media absurd cover stories, such as that the death squads were all “insurgents” in stolen police uniforms.
Meanwhile, U.S. special operations forces conducted “kill-or-capture” night raids in search of Resistance leaders. General Stanley McChrystal, the commander of Joint Special Operations Command from 2003 to 2008, oversaw the development of a database system, used in Iraq and Afghanistan, that compiled cellphone numbers mined from captured cellphones to generate an ever-expanding target list for night raids and airstrikes.
The targeting of cellphones instead of actual people enabled the automation of the targeting system, and explicitly excluded using human intelligence to confirm identities. Two senior U.S. commanders told The Washington Post that only half the night raids attacked the right house or person.
In Afghanistan, President Barack Obama put McChrystal in charge of U.S. and NATO forces in 2009, and his cellphone-based “social network analysis” enabled an exponential increase in night raids, from 20 raids per month in May 2009 to up to 40 per night by April 2011.
As with the Lavender system in Gaza, this huge increase in targets was achieved by taking a system originally designed to identify and track a small number of senior enemy commanders and applying it to anyone suspected of having links with the Taliban, based on their cellphone data.
This led to the capture of an endless flood of innocent civilians, so that most civilian detainees had to be quickly released to make room for new ones. The increased killing of innocent civilians in night raids and airstrikes fueled already fierce resistance to the U.S. and NATO occupation and ultimately led to its defeat.
President Obama’s drone campaign to kill suspected enemies in Pakistan, Yemen, and Somalia was just as indiscriminate, with reports suggesting that 90% of the people it killed in Pakistan were innocent civilians.
And yet Obama and his national security team kept meeting in the White House every “Terror Tuesday” to select who the drones would target that week, using an Orwellian, computerized “disposition matrix” to provide technological cover for their life and death decisions.
Looking at this evolution of ever more automated systems for killing and capturing enemies, we can see how, as the information technology used has advanced from telexes to cellphones and from early IBM computers to artificial intelligence, the human intelligence and sensibility that could spot mistakes, prioritize human life, and prevent the killing of innocent civilians has been progressively marginalized and excluded, making these operations more brutal and horrifying than ever.
Nicolas has at least two good friends who survived the dirty wars in Latin America because someone who worked in the police or military got word to them that their names were on a death list, one in Argentina, the other in Guatemala. If their fates had been decided by an AI machine like Lavender, they would both be long dead.
As with supposed advances in other types of weapons technology, like drones and “precision” bombs and missiles, innovations that claim to make targeting more precise and eliminate human error have instead led to the automated mass murder of innocent people, especially women and children, bringing us full circle from one holocaust to the next.
The rise of AI belies the fantasy that we can escape moral culpability by assigning life-or-death decisions to a machine.
The war in Gaza between Israel and Hamas marked its grim six-month anniversary on Sunday, and one of the most jarring things about this very 21st-century conflict has been the almost daily headlines about Israeli airstrikes obliterating the homes of notable Palestinians—sometimes known Hamas operatives, but often journalists or physicians or aid workers. In many of these attacks, large numbers of family members, including young children, die under the rubble.
In one of the war’s most notorious incidents, the prominent Palestinian poet and professor Refaat Alareer—so haunted by the daily devastation and the likelihood he and his own family would be targeted that in his final weeks he wrote a poem called “If I Must Die”—had sought refuge at a family home when an Israeli airstrike killed not only him but his brother, sister, and four children.
It’s a similar story for journalists in Gaza, whose death toll—at least 90 Palestinians, according to the conservative tally of the Committee to Protect Journalists—has exceeded any other modern conflict, in just half a year. Just two days after Alareer’s death in December, an Israeli airstrike on the home of reporter Ola Attallah killed not only her but nine members of her family. That same week, Abdalhamid Abdelati, head of the Al Sawt Al Sha’b radio station, wasn’t home when an Israeli bomb struck, but his mother, brother, sister, and four other family members were killed.
The reality—whether the fighting is with sticks and stones or cruise missiles guided by supercomputers—is that the horror of war always hinges on our tragic flaw to see some people as more human than others.
This frequent death from above—in a war that has claimed the lives of more than 33,000 Palestinians, the majority of them women and children—has raised some uncomfortable questions about Israel’s conduct in the conflict. Just who are Israeli commanders targeting for these deadly strikes? And how are those targets selected?
Last week, an investigative report from +972 Magazine and Local Call—led by independent, left-leaning Israeli journalists covering the region from Tel Aviv—offered some answers that are deeply disturbing, posing questions about the blurred lines between artificial intelligence and real morality that cut to the core of our basic humanity.
The +972 Magazine report, based on interviews with six Israeli intelligence officers, said an AI program known as “Lavender” has been used by the Israel Defense Forces to identify targets in Gaza since the start of the war on October 7. The IDF has confirmed that AI is used by its intelligence officers in guiding its tactics in Gaza, but the military and the magazine differed sharply on the issue of human involvement. The IDF claims the computer-driven data is only advisory and that humans are still making the key decisions for targeting bombs, but the +972 report said human reviews of the AI-selected targets were often “a rubber stamp” as brief as 20 seconds.
The 20-second finding is deeply troubling, and yet arguably not the most bothersome disclosure in the magazine’s investigation. For one thing, our faith in computers is centered on the notion that advanced technology might reduce the fatal mistakes that occur during the fog of war, but +972 reported that Lavender’s error rate in targeting is still 10%—bombing someone with the same name as a Hamas member, for example, or even someone who just inherited a phone number.
But even when Israeli intelligence, with the help of AI, pinpoints a target that it suspects of ties to Hamas—which triggered the war when it killed 1,200 Israelis in a surprise attack and which still holds 130 or so hostages—it faces a decision on the fate of civilians who might be in the home or nearby. In the early weeks of the war, +972 reported, Israeli commanders deemed it was OK to kill as many as 15 or 20 innocent civilians for every low-level Hamas operative targeted, a number that rose to the hundreds for higher-level Hamas leaders.
What’s more, the intelligence sources told +972 that alleged low-level Hamas operatives were targeted with so-called dumb bombs, less-precise weaponry that leads to more collateral killing, in order to save more expensive “smart bombs” for their higher-ups. In an earlier report last fall on Israel’s use of AI, a military source told +972: “When a three-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed—that it was a price worth paying in order to hit [another] target.” The devolution from the biblical “eye-for-an-eye” justice of the Book of Exodus to a 20-1 kill ratio is hard to fathom.
Yet not as unfathomable as this: +972 also reported that the Lavender AI excelled at tracking its selected targets to their homes, where it was deemed an airstrike would be most successful. Of course, the target’s home is also where their wives, children, and other family members would be killed in their sleep. Even worse, the AI was better at finding the target’s home than at determining if he was actually there. “It happened to me many times that we attacked a house, but the person wasn’t even home,” one source told +972. “The result is that you killed a family for no reason.”
The Orwellian name for this particular operation? “Where’s Daddy?”
In one sense, the exposure of Israel’s Lavender program is the latest twist on a story that goes back hundreds of years—the use of new technologies not to better humankind but to destroy it. Think about the machine guns that made a mockery of front-line warfare in World War I, the Nazis harnessing Zyklon B gas for their death camps, or America dropping atomic bombs on the civilians of Hiroshima and Nagasaki. But the rise of AI brings this to a troubling if inevitable level: The fantasy that we can escape moral culpability by assigning life-or-death decisions to a machine.
If nothing else, these disturbing stories coming out of Gaza should confirm that the rise of AI, and the decisions we have to make as a society about how to deploy it, is the issue that will come to define who we are in the 21st century. How can we make the most out of this remarkable technology—identifying cures for diseases, for example—while avoiding the many pitfalls from classroom plagiarism to rising unemployment as machines learn to perform human jobs?
But are we even asking the right questions? The ultimate fear around AI has always been that the machines become sentient, rise up, and overthrow humankind. It’s not wrong to worry about that dystopian scenario, but isn’t it just as bad when we abdicate our morality by farming out mass death to a computer program? Perhaps someday artificial intelligence will evolve to a higher state of consciousness, but right now Lavender is issuing death warrants for toddlers and their mothers because it reflects the inhumanity that we programmed it with.
The +972 report happened to come out at a moment when Americans, including our leaders, may be more receptive to hearing it. We don’t yet know if AI played a role in Israel’s decision last week to target a convoy of aid workers for renowned chef José Andrés’ World Central Kitchen, killing seven people, including a Canadian American veteran of the war in Afghanistan. But the shocking incident has led even some of Israel’s staunchest allies in Congress, like former Speaker Nancy Pelosi (D-Calif.), to call for restrictions on the massive flow of U.S. weapons unless the bloodshed is reduced.
After just one phone call from a clearly angered President Joe Biden to Israeli Prime Minister Benjamin Netanyahu, Israel approved a new food corridor into famine-stricken Gaza. That one small step for humanity also raised the questions of why Biden and U.S. officials didn’t use their leverage sooner, and why an attack on Western workers for politicians’ favorite Beltway chef provoked a response when the killing of innocent Palestinian babies did not.
The reality—whether the fighting is with sticks and stones or cruise missiles guided by supercomputers—is that the horror of war always hinges on our tragic flaw to see some people as more human than others. Last week, former U.S. State Department official Aaron David Miller tried to explain to The New Yorker’s Isaac Chotiner why Biden had a more visceral reaction to the October 7 Hamas attack on Israel than to its massive retaliation, even as the deaths of children mounted. “Do I think that Joe Biden has the same depth of feeling and empathy for the Palestinians of Gaza as he does for the Israelis?” Miller asked rhetorically. “No, he doesn’t, nor does he convey it. I don’t think there’s any doubt about that.”
Whatever flaws exist in an AI program such as Lavender, the real computer bug is the lack of human love and understanding that we program it with, the ghost in the machine. Artificial intelligence won’t have the capacity to distinguish between a Hamas killer, an acclaimed poet, and the baby in a crib nearby unless we input the values that cherish all innocent human life equally. It wasn’t AI that named a poorly discriminating murder program “Where’s Daddy?”; it was humans, who programmed it with a moral compass that is spinning out of control. Until we can rediscover our depth of feeling and empathy, war by computer is merely garbage in, garbage out.