A pro-Palestinian protester is arrested at the University of Texas at Austin on April 29, 2024. (Photo: Suzanne Cordeiro/AFP via Getty Images)

The AI Push Is Reigniting the Battle for the Soul of the US Academy

The moment requires greater public pushback against the military conquest of America’s research and security agendas, in part through resistance by scientists and engineers.

The divestment campaigns launched last spring by students protesting Israel’s mass slaughter in Gaza brought the issue of the militarization of American higher education back into the spotlight.

Of course, financial ties between the Pentagon and American universities are nothing new. As Stuart Leslie has pointed out in his seminal book on the topic, The Cold War and American Science, “In the decade following World War II, the Department of Defense (DOD) became the biggest patron of American science.” Admittedly, as civilian institutions like the National Institutes of Health grew larger, the Pentagon’s share of federal research and development did decline, but it still remained a source of billions of dollars in funding for university research.

And now, Pentagon-funded research is once again on the rise, propelled by the DOD’s recent focus on developing new technologies like weapons driven by artificial intelligence (AI). Combine that with an intensifying drive to recruit engineering graduates and the forging of partnerships between professors and weapons firms, and you have a situation in which many talented technical types could spend their entire careers serving the needs of the warfare state. The only way to head off such a Brave New World would be greater public pushback against the military conquest (so to speak) of America’s research and security agendas, in part through resistance by scientists and engineers whose skills are so essential to building the next generation of high-tech weaponry.

The Pentagon Goes to School

Yes, the Pentagon’s funding of universities is indeed rising once again and it goes well beyond the usual suspects like MIT or Johns Hopkins University. In 2022, the most recent year for which full data is available, 14 universities received at least—and brace yourself for this—$100 million in Pentagon funding, from Johns Hopkins’s astonishing $1.4 billion (no, that is not a typo!) to Colorado State’s impressive $100 million. And here’s a surprise: Two of the universities with the most extensive connections to our weaponry of the future are in Texas: the University of Texas at Austin (UT-Austin) and Texas A&M.

In 2020, Texas Gov. Greg Abbott and then-Army Secretary Ryan McCarthy appeared onstage at a UT-Austin ceremony to commemorate the creation of a robotics lab there, part of a new partnership between the Army Futures Command and the school. “This is ground zero for us in our research for the weapons systems we’re going to develop for decades to come,” said McCarthy.

Not to be outdone, Texas A&M is quietly becoming the Pentagon’s base for research on hypersonics—weapons expected to travel at least five times the speed of sound. Equipped with a kilometer-long tunnel for testing hypersonic missiles, that school’s University Consortium for Applied Hypersonics is explicitly dedicated to outpacing America’s global rivals in the development of that next-generation military technology. Texas A&M is also part of the team that runs the Los Alamos National Laboratory, the (in)famous New Mexico facility where the first nuclear weapons were developed as part of the Manhattan Project under the direction of Robert Oppenheimer.

Other major players include Carnegie Mellon University, a center for Army research on the applications of AI, and Stanford University, which serves as a feeder to California’s Silicon Valley firms of all types. That school also runs the Technology Transfer for Defense (TT4D) Program aimed at transitioning academic technologies from the lab to the marketplace and exploring the potential military applications of emerging technology products.

In addition, the Pentagon is working aggressively to bring new universities into the fold. In January 2023, Secretary of Defense Lloyd Austin announced the creation of a defense-funded research center at Howard University, the first of its kind at a historically Black college.

Given the campus Gaza demonstrations of last spring, perhaps you also won’t be surprised to learn that the recent surge in Pentagon spending faces increasing criticism from students and faculty alike. Targets of protest include the Lavender program, which has used AI to multiply the number of targets the Israeli armed forces can hit in a given time frame. But beyond focusing on companies enabling Israel’s war effort, current activists are also looking at the broader role of their universities in the all-American war system.

For example, at Indiana University, research on ties to companies fueling the killings in Gaza grew into a study of the larger role of universities in supporting the military system as a whole. Student activists found that the most important connection involved that university’s ties to the Naval Surface Warfare Center, Crane Division, whose mission is “to provide acquisition, engineering… and technical support for sensors, electronics, electronic warfare, and special warfare weapons.” In response, they have launched a “Keep Crane Off Campus” campaign.

A Science of Death or for Life?

Graduating science and engineering students increasingly face a moral dilemma about whether they want to put their skills to work developing instruments of death. Journalist Indigo Olivier captured that conflict in a series of interviews with graduating engineering students. She quotes one student at the University of West Florida who voiced strong opposition to weapons work this way: “When it comes to engineering, we do have a responsibility… Every tool can be a weapon… I don’t really feel like I need to be putting my gifts to make more bombs.” By contrast, Cameron Davis, a 2021 computer engineering graduate from Georgia Tech, told Olivier about the dilemma faced by so many graduating engineers: “A lot of people that I talk to aren’t 100% comfortable working on defense contracts, working on things that are basically going to kill people.” But he went on to say that the high pay at weapons firms “drives a lot of your moral disagreements with defense away.”

The choice faced by today’s science and engineering graduates is nothing new. The use of science for military ends has a long history in the United States. But there have also been numerous examples of scientists who resisted dangerous or seemingly unworkable military schemes. When President Ronald Reagan announced his “Star Wars” missile defense plan in 1983, for instance, he promised, all too improbably, to develop an impenetrable shield that would protect the United States from any and all incoming nuclear-armed missiles. In response, physicists David Wright and Lisbeth Gronlund circulated a pledge to refuse to work on that program. It would, in the end, be signed by more than 7,000 scientists. And that document actually helped puncture the mystique of the Star Wars plan, a reminder that protest against the militarization of education isn’t always in vain.

Scientists have also played a leading role in pressing for nuclear arms control and disarmament, founding organizations like the Bulletin of the Atomic Scientists (1945), the Federation of American Scientists (1945), the global Pugwash movement (1957), the Council for a Livable World (1962), and the Union of Concerned Scientists (1969). To this day, all of them continue to work to curb the threat of a nuclear war that could destroy this planet as a livable place for humanity.

A central figure in this movement was Joseph Rotblat, the only scientist to resign from the Manhattan Project over moral qualms about the potential impact of the atomic bomb. In 1957, he helped organize the founding meeting of the Pugwash Conference, an international organization devoted to the control and ultimate elimination of nuclear weapons. In some respects, Pugwash was a forerunner of the International Campaign to Abolish Nuclear Weapons (ICAN), which successfully pressed for the U.N. Treaty on the Prohibition of Nuclear Weapons that entered into force in January 2021.

Enabling Endless War and Widespread Torture

The social sciences also have a long, conflicted history of ties to the Pentagon and the military services. Two prominent examples from earlier in this century were the Pentagon’s Human Terrain System (HTS) and the role of psychologists in crafting torture programs associated with the Global War on Terror, launched after the 9/11 attacks with the invasion of Afghanistan.

The HTS was initially intended to reduce the “cultural knowledge gap” suffered by U.S. troops involved in counterinsurgency operations in Afghanistan and Iraq early in this century. The theory was that military personnel with a better sense of local norms and practices would be more effective in winning “hearts and minds” and so defeating determined enemies on their home turf. The plan included the deployment of psychologists, anthropologists, and other social scientists in Human Terrain Teams alongside American troops in the field.

Launched in 2007, the program sparked intense protests in the academic community, with a particularly acrimonious debate within the American Anthropological Association. Ed Liebow, the executive director of the association, argued that its debate “convinced a very large majority of our members that it was just not a responsible way for professional anthropologists to conduct themselves.” After a distinctly grim history that included “reports of racism, sexual harassment, and payroll padding,” as well as a belief by many commanders that Human Terrain Teams were simply ineffective, the Army quietly abandoned the program in 2014.

An even more controversial use of social scientists in the service of the war machine was the role of psychologists as advisors to the CIA’s torture programs at Abu Ghraib in Iraq, the Guantánamo Bay detention center in Cuba, and other “black sites” run by that agency. James E. Mitchell, a psychologist under contract to U.S. intelligence, helped develop the “enhanced interrogation techniques” used by the U.S. during its post-9/11 “war on terror,” even sitting in on a session in which a prisoner was waterboarded. That interrogation program, developed by Mitchell with psychologist John Bruce Jessen, included resorting to “violence, sleep deprivation, and humiliation.”

The role of psychologists in crafting the CIA’s torture program drew harsh criticism within the profession. A 2015 report by independent critics revealed that the leaders of the American Psychological Association had “secretly collaborated with the administration of President George W. Bush to bolster a legal and ethical justification for the torture of prisoners swept up in the post-Sept. 11 war on terror.” Over time, it became ever clearer that the torture program was not only immoral but remarkably ineffective, since the victims of such torture often told interrogators what they wanted to hear, whether or not their admissions squared with reality.

That was then, of course. But today, resistance to the militarization of science has extended to the growing use of artificial intelligence and other emerging military technologies. For example, in 2018, there was a huge protest movement at Google when employees learned that the company was working on Project Maven, an artificial intelligence program designed to analyze drone surveillance footage and so enable more accurate drone strikes. More than 4,000 Google scientists and engineers signed a letter to company leadership calling on it to steer clear of military work, dozens resigned over the issue, and the protests had a distinct effect on the company. That year, Google announced that it would not renew its Project Maven contract and pledged that it “will not design or deploy AI” for weapons.

Unfortunately, the lure of military funding was simply too strong. Just a few years after those Project Maven protests, Google again began doing work for the Pentagon, as noted in a 2021 New York Times report by Daisuke Wakabayashi and Kate Conger. Their article pointed to Google’s “aggressive pursuit” of the Joint Warfighting Cloud Capability project, which will attempt to “modernize the Pentagon’s cloud technology and support the use of artificial intelligence to gain an advantage on the battlefield.” (Cloud technology is the term for the delivery of computing services over the internet.)

Meanwhile, a cohort of Google workers has continued to resist such military projects. An October 2021 letter in the British Guardian from “Google and Amazon workers of conscience” called on the companies to “pull out of Project Nimbus [a $1.2 billion contract to provide cloud computing services to the Israeli military and government] and cut all ties with the Israeli military.” As they wrote then, “This contract was signed the same week that the Israeli military attacked Palestinians in the Gaza Strip—killing nearly 250 people, including more than 60 children. The technology our companies have contracted to build will make the systematic discrimination and displacement carried out by the Israeli military and government even crueler and deadlier for Palestinians.”

Of course, their demand seems even more relevant today in the context of the current war on Gaza, which had not yet begun when that letter was written.

The Future of American Science

Obviously, many scientists do deeply useful research that has nothing to do with the military, on everything from preventing disease to creating green-energy options. But the current increases in weapons research could set back such efforts by soaking up an ever larger share of available funds, while also drawing ever more top talent into the military sphere.

The stakes are particularly high now, given the ongoing rush to develop AI-driven weaponry and other emerging technologies that pose the risk of everything from unintended slaughter due to system malfunctions to making war more likely, given the (at least theoretical) ability to limit casualties for the attacking side. In short, turning back the flood of funding for military research and weaponry from the Pentagon and key venture capital firms will be a difficult undertaking. After all, AI is already performing a wide range of military and civilian tasks. Banning it altogether may no longer be a realistic goal, but putting guardrails around its military use might still be.

Such efforts are, in fact, already underway. The International Committee for Robot Arms Control (ICRAC) has called for an international dialogue on “the pressing dangers that these systems pose to peace and international security and to civilians.” ICRAC elaborates on precisely what these risks are: “Autonomous systems have the potential to accelerate the pace and tempo of warfare, to undermine existing arms controls and regulations, to exacerbate the dangers of asymmetric warfare, and to destabilize regional and global security, [as well as to] further the indiscriminate and disproportionate use of force and obscure the moral and legal responsibility for war crimes.”

The Future of Life Institute has underscored the severity of the risk, noting that “more than half of AI experts believe there is a one in ten chance this technology will cause our extinction.”

Instead of listening almost exclusively to happy talk about the military value of AI by individuals and organizations that stand to profit from its adoption, isn’t it time to begin paying attention to the skeptics, while holding back on the deployment of emerging military technologies until there is a national conversation about what they can and can’t accomplish, with scientists playing a central role in bringing the debate back to Earth?

© 2024 TomDispatch.com