World's top AI and robotics companies urge UN to ban lethal autonomous weapons

University of NSW
Monday, 21 August, 2017

An open letter signed by 116 founders of robotics and artificial intelligence companies from 26 countries urges the United Nations to urgently address the challenge of lethal autonomous weapons (often called ‘killer robots’) and ban their use internationally.

A key organiser of the letter, Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales in Sydney, released it at the opening of the International Joint Conference on Artificial Intelligence (IJCAI 2017) in Melbourne, the world's pre-eminent gathering of top experts in artificial intelligence (AI) and robotics. Walsh is a member of the IJCAI 2017 conference committee.

The open letter is the first time that AI and robotics companies have taken a joint stance on the issue. Previously, only a single company, Canada’s Clearpath Robotics, had formally called for a ban on lethal autonomous weapons.

In December 2016, 123 member nations of the UN’s Review Conference of the Convention on Conventional Weapons unanimously agreed to begin formal discussions on autonomous weapons. Of these, 19 have already called for an outright ban.

“Lethal autonomous weapons threaten to become the third revolution in warfare,” the letter states. “Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend.

“These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close,” it states, concluding with an urgent plea for the UN “to find a way to protect us all from these dangers”.

Signatories of the 2017 letter include:

  • Elon Musk, founder of Tesla, SpaceX and OpenAI (USA)
  • Mustafa Suleyman, founder and Head of Applied AI at Google’s DeepMind (UK)
  • Esben Østergaard, founder and CTO of Universal Robots (Denmark)
  • Jerome Monceaux, founder of Aldebaran Robotics, maker of Nao and Pepper robots (France)
  • Jürgen Schmidhuber, leading deep learning expert and founder of Nnaisense (Switzerland)
  • Yoshua Bengio, leading deep learning expert and founder of Element AI (Canada)

Their companies employ tens of thousands of researchers, roboticists and engineers, are worth billions of dollars and cover the globe: Australia, Canada, China, Czech Republic, Denmark, Estonia, Finland, France, Germany, Iceland, India, Ireland, Italy, Japan, Mexico, Netherlands, Norway, Poland, Russia, Singapore, South Africa, Spain, Switzerland, UAE, UK and USA.

Walsh is one of the organisers of the 2017 letter, as well as an earlier letter released in 2015 at the IJCAI conference in Buenos Aires, which warned of the dangers of autonomous weapons. The 2015 letter was signed by thousands of researchers in AI and robotics working in universities and research labs around the world, and was endorsed by British physicist Stephen Hawking, Apple co-founder Steve Wozniak and cognitive scientist Noam Chomsky, among others.

“Nearly every technology can be used for good and bad, and artificial intelligence is no different,” said Walsh. “It can help tackle many of the pressing problems facing society today: inequality and poverty, the challenges posed by climate change and the ongoing global financial crisis. However, the same technology can also be used in autonomous weapons to industrialise war.

“We need to make decisions today choosing which of these futures we want. I strongly support the call by many humanitarian and other organisations for a UN ban on such weapons, similar to bans on chemical and other weapons,” he added.

“Two years ago at this same conference, we released an open letter signed by thousands of researchers working in AI and robotics calling for such a ban. This helped push this issue up the agenda at the United Nations and begin formal talks. I am hopeful that this new letter, adding the support of the AI and robotics industry, will add urgency to the discussions at the UN that should have started today.”

“The number of prominent companies and individuals who have signed this letter reinforces our warning that this is not a hypothetical scenario, but a very real, very pressing concern which needs immediate action,” said Ryan Gariepy, founder and CTO of Clearpath Robotics, who was the first to sign.

“We should not lose sight of the fact that, unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability,” he added. “The development of lethal autonomous weapons systems is unwise, unethical and should be banned on an international scale.”

Yoshua Bengio, founder of Element AI and a leading ‘deep learning’ expert, said: “I signed the open letter because the use of AI in autonomous weapons hurts my sense of ethics, would be likely to lead to a very dangerous escalation, because it would hurt the further development of AI’s good applications, and because it is a matter that needs to be handled by the international community, similarly to what has been done in the past for some other morally wrong weapons (biological, chemical, nuclear).”

Stuart Russell, founder and Vice-President of Bayesian Logic, agreed: “Unless people want to see new weapons of mass destruction — in the form of vast swarms of lethal microdrones — spreading around the world, it’s imperative to step up and support the United Nations’ efforts to create a treaty banning lethal autonomous weapons. This is vital for national and international security.”

Story originally written by University of New South Wales, Australian Science Media Centre, and published by Scimex.

Image: QinetiQ MAARS (Mobile Advanced Armed Robotic System). Source: Scimex.
