If there is a consensus around artificial intelligence and its role on the battlefield, it was recently articulated by Brig. Gen. Frank Kelly, the deputy assistant secretary of the Navy for unmanned systems.

“An unmanned systems future is inevitable,” Kelly said at an AI conference in Washington March 7. “The magic wand has already been waved.”

The key question yet to be answered, according to experts who spoke at the conference, is how much autonomy should be allotted to unmanned robots powered by artificial intelligence.

“When you give a system autonomy, you are making a choice to give another agent the freedom to make decisions on your behalf,” said John Paschkewitz, a program director in the Defense Advanced Research Projects Agency’s Defense Sciences Office. “You have to ask in every context, ‘Am I comfortable letting a machine make this decision on my behalf?’”

Paschkewitz advocated using unmanned systems selectively, leveraging the technology to solve complex problems only in situations where the risk of failure is low.

“If you’re going to give your 16-year-old the keys to the car, do you hand them over on a Friday night when you know they’re going out with their friends? Probably not,” Paschkewitz explained. “But on a Thursday morning? You’re probably more comfortable with that.”

These concerns about autonomous robots and their potential to cause physical harm led the Department of Defense in 2012 to establish a formal doctrine mandating that only systems maintaining a “human in the loop” can be authorized to use lethal force.

Notably, China and Russia, the United States’ main competitors in the race to embrace military artificial intelligence, lack any such formal doctrine, national security experts said March 7. This difference in policy raised questions at the conference about whether the advantages gained by adversaries with more liberal approaches toward unmanned attack systems might pressure the United States and others to embrace higher degrees of autonomy on the battlefield, and potentially abandon the human-in-the-loop policy.
