'Don't Be Evil?': Outrage Over Google's Secret Program to Bolster Pentagon's Drone War

A pair of reports on Tuesday revealed that Google is providing resources to the Pentagon so analysts can more easily review footage captured by U.S. drones. (Photo: Steve Rhodes/flickr/cc)


"I hadn't noticed the '...but it's okay to help with drone strikes' footnote to the 'don't be evil' line in Google's code of conduct."

Human rights advocates, tech experts, and critics of the United States' vast drone warfare program are outraged over Google's secret agreement with the Pentagon--revealed in a pair of reports by Gizmodo and The Intercept--to develop artificial intelligence, or AI, that quickly analyzes drone footage.

Some critics pointed to Google's old motto, "Don't Be Evil," and its replacement, "Do the Right Thing," introduced in 2015 by Google's parent company, Alphabet.

The reports, published Tuesday, outline details of the partnership between Google and the U.S. Department of Defense's Project Maven that were recently disclosed on a company mailing list. The internal discussion reportedly angered some Google employees who, Gizmodo reports, "were outraged that the company would offer resources to the military for surveillance technology involved in drone operations" and pointed out that "the project raised important ethical questions about the development and use of machine learning."

The DOD's Project Maven--also known as the Algorithmic Warfare Cross-Functional Team (AWCFT)--launched last April and "was tasked with using machine learning to identify vehicles and other objects in drone footage, taking that burden off analysts" who have not been able to keep up with the amount of footage collected by U.S. drones.

A spokesperson for Google said the company provides the Pentagon with "open source TensorFlow APIs that can assist in object recognition on unclassified data," and insisted "the technology flags images for human review, and is for non-offensive uses only."
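To make that claim concrete, here is a minimal, illustrative sketch of the kind of workflow the spokesperson describes: an off-the-shelf, open-source TensorFlow object-detection model flags frames that a human then reviews. This is not Project Maven code; the model handle, confidence threshold, and file name below are assumptions chosen purely for demonstration.

# Illustrative sketch only -- not Google's or the Pentagon's actual system.
# Assumes the tensorflow and tensorflow_hub packages are installed.
import tensorflow as tf
import tensorflow_hub as hub

# Load a publicly available object detector from TensorFlow Hub.
detector = hub.load("https://tfhub.dev/tensorflow/ssd_mobilenet_v2/2")

def flag_frame_for_review(jpeg_path, score_threshold=0.5):
    """Return detections confident enough to warrant human review."""
    image = tf.io.decode_jpeg(tf.io.read_file(jpeg_path), channels=3)
    # The detector expects a batch of uint8 images: shape [1, height, width, 3].
    outputs = detector(tf.expand_dims(image, axis=0))
    scores = outputs["detection_scores"][0].numpy()
    classes = outputs["detection_classes"][0].numpy().astype(int)
    boxes = outputs["detection_boxes"][0].numpy()
    # Only detections above the threshold are surfaced for a human to inspect.
    return [
        {"class_id": c, "score": float(s), "box": b.tolist()}
        for c, s, b in zip(classes, scores, boxes)
        if s >= score_threshold
    ]

# Usage (hypothetical file name): flagged = flag_frame_for_review("frame_0001.jpg")

The point of the sketch is simply that such tooling is openly available and, as Google notes, only flags images; the contested question raised in the reports is what those flagged images are then used for.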

However, The Intercept noted--pointing to earlier reports about the project--that the purpose of the AI tech is "to help drone analysts interpret the vast image data vacuumed up from the military's fleet of 1,100 drones to better target bombing strikes against the Islamic State."

While Google's spokesperson added that the company is "actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies," The Intercept also noted that "the military contract with Google is routed through a Northern Virginia technology staffing company called ECS Federal, obscuring the relationship from the public"--at least until it was revealed in Tuesday's reports.

Both reports also pointed out that Eric Schmidt, who recently stepped down as chairman of Alphabet, heads the Defense Innovation Board, a federal advisory committee established in 2016 "to encourage the military adoption of breakthrough technology" that has developed recommendations for how the Department of Defense can better use tools from Silicon Valley to wage war abroad.

Gizmodo, citing meeting minutes, noted that "some members of the Board's teams are part of the executive steering group that is able to provide rapid input" on Project Maven, whose Pentagon director has expressed hope that the project will be "that spark that kindles the flame front of artificial intelligence across the rest of the [Defense] Department."

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.