More than 50 leading artificial intelligence (AI) and robotics researchers have called off their boycott of the Korea Advanced Institute of Science and Technology (KAIST) after the university committed not to develop lethal autonomous weapons systems.
The boycott came after news emerged that KAIST had opened an AI weapons lab in collaboration with the major arms company Hanwha Systems. The Korean arms manufacturer builds cluster munitions, which are prohibited under the UN Convention on Cluster Munitions, a treaty signed by more than 100 nations but not by South Korea.
KAIST’s president, Professor Sung-Chul Shin, responded to the boycott by affirming in a statement that “KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots”. He went further by committing that “KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control.”
The 56 AI and robotics researchers who signed the boycott have since rescinded it and will once again visit and host researchers from KAIST, and collaborate on scientific projects.
Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales in Sydney, who initiated the boycott, praised KAIST for its rapid response.
"I was very pleased that the president of KAIST has agreed not to develop lethal autonomous weapons, and to follow international norms by ensuring meaningful human control of any AI-based weapon that will be developed,” he said. "I applaud KAIST for doing the right thing, and I’ll be happy to work with KAIST in the future.
Professor Walsh said the uses of AI are overwhelmingly positive, but that the game-changing technology should not become the next revolution in warfare.
"There are plenty of good applications for AI, even in a military setting. No one, for instance, should risk a life or limb clearing a minefield – this is a perfect job for a robot. But we should not, however, hand over the decision of who lives or who dies to a machine – this crosses an ethical red-line and will result in new weapons of mass destruction," Professor Walsh said.
The boycott, which drew researchers from 30 countries, came ahead of a meeting in Geneva this week of 123 UN member nations, convened as the Group of Governmental Experts under the Convention on Certain Conventional Weapons, to discuss the challenges posed by lethal autonomous weapons, often called “killer robots”.
Already, 22 of the nations taking part have called for an outright, pre-emptive ban on such weapons.