The case against killer robots

Noel Sharkey, the chairman of the International Committee for Robot Arms Control (ICRAC), argued his case against killer robots last Friday at Northeastern University, saying that autonomous machines should not be allowed to make the decision to kill people on the battlefield.

“We’re on a course toward fully automating warfare,” Sharkey warned in a two-hour lecture on the political, ethical, and legal implications of robotic weapons. “Who in his right mind would automate the decision to kill?”

Later in the day, Sharkey moderated a panel discussion of drones and killer robots. The panelists were Max Abrahms, an assistant professor of political science at Northeastern; Denise Garcia, a member of ICRAC and an associate professor of political science and international affairs at Northeastern; and Patrick B. Johnson, a political scientist at the RAND Corporation, a nonprofit global policy think tank.

The two-part event, the second in a new series titled “Controversial Issues in Security Studies,” was sponsored by the Northeastern Humanities Center and the Department of Political Science. Garcia organized the program with the support of Gerard Loporto, LA’73, and his family.

Sharkey, for his part, is a preeminent expert in robotics and artificial intelligence. As a spokesperson for the Campaign to Stop Killer Robots, he traveled to Geneva earlier this month to convince the United Nations’ Convention on Conventional Weapons to ban killer robots before they’re developed for use on the battlefield. Fully autonomous weapons, which do not yet exist, would have the ability to select and then destroy military targets without human intervention.

The rise of the machines is a hot-button issue in Washington. In response to criticism of the administration’s use of combat drones, President Obama delivered a speech at the National Defense University in May, promising that the U.S. would only use drones against a “continuing and imminent threat against the American people.

“The terrorists we are after target civilians, and the death toll from their acts of terrorism against Muslims dwarfs any estimate of civilian casualties from drone strikes,” he added. “So doing nothing is not an option.”

In his lecture last Friday, Sharkey laid out his argument against the autonomous, Terminator-like weapons. He began by noting that their use could violate at least two principles of international humanitarian law: the principle of distinction, which posits that battlefield weapons must be able to distinguish between combatants and civilians, and the principle of proportionality, which posits that attacks on military objects must not cause excessive loss of civilian life in relation to the foreseeable military advantage.

Of the principle of proportionality, he said, “You can kill civilians provided it’s proportional to direct military advantage, but that requires an awful lot of thinking and careful years of planning. We must not let robots do that under any circumstance.”

Sharkey also censured the CIA’s use of the nation’s current fleet of combat drones in countries with which the U.S. is not at war. “I would like to ask the CIA to stop killing civilians in the name of collateral damage,” Sharkey pleaded. “I really don’t like seeing children being killed, because there’s no excuse for that whatsoever.”

In his opening remarks, Stephen Flynn, the director of Northeastern’s Center for Resilience Studies, articulated the difficulties of rapidly assimilating new warfare technology. “Technology always outpaces our ability to sort out what the guidelines are,” he explained. “What could be tactically effective could also be strategically harmful.

“Issues of policy, technology, and morality are all in play, but they don’t lend themselves to slogans or bumper stickers,” he added. “We won’t have effective conversations unless we delve into these issues.”

In the Q&A session, more than a dozen students heeded Flynn’s advice by asking Sharkey several tough questions. The former president of the Northeastern College Democrats asked what students could do to stop the development of killer robots, prompting Sharkey to encourage them to start a youth movement to raise awareness of the weapons’ dangers.

Another student asked Sharkey whether automating warfare would decrease the human death toll. “I don’t mind protecting soldiers on the ground, but [the use of killer robots] might lead to more battles than you want to be in,” he explained. “If they’re increasing terrorism, then who are they really protecting?”

– By Jason Kornwitz
