Jul 27, 2015
More than 1,000 science and technology experts on Monday published an open letter calling for a ban on autonomous weapons--machines capable of killing without human operators--to prevent a "virtually inevitable" high-stakes global arms race.
Signatories include physicist Stephen Hawking, Apple co-founder Steve Wozniak, and Tesla CEO Elon Musk, among many others. The letter was presented at the International Joint Conference on Artificial Intelligence (IJCAI) in Buenos Aires, Argentina.
"Autonomous weapons select and engage targets without human intervention," the letter states. "Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is--practically if not legally--feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms."
The letter continues:
If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.... It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group.
Proponents of autonomous weapons say the machines would be useful in reducing military casualties on the battlefield. But the letter's authors counter that, in doing so, the weapons would lower the threshold for armed conflict--risking more frequent battles and a greater loss of civilian life.
They state: "There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people."
The letter is the most recent call for a preemptive ban on the weapons, often referred to colloquially as killer robots.
In April, a joint report by Human Rights Watch and Harvard Law School's International Human Rights Clinic found that autonomous weapons present "serious moral and legal concerns" and could not only violate international law, but make it virtually impossible to pursue accountability for victims.
The report, titled Mind the Gap: The Lack of Accountability for Killer Robots and presented to the United Nations meeting on lethal weapons, called on the international body to ban such tools before they can be created.
While fully autonomous weapons do not yet exist, HRW noted, their precursors--such as the U.S. Navy's Phalanx CIWS and Israel's Iron Dome--are already in use.
Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.
Nadia Prupis
Nadia Prupis is a former Common Dreams staff writer. She wrote on media policy for Truthout.org and has been published in New America Media and AlterNet. She graduated from UC Santa Barbara with a BA in English in 2008.