To get to the AI future it wants, the Pentagon needs to start planning for that reality now.
Long-range acquisition decisions are nothing new for the Department of Defense; the wars of tomorrow are always fought with the acquisition choices of years, and often decades, prior. To harness the utility of autonomy, the Pentagon plans to turn to the Joint Artificial Intelligence Center, or JAIC.
“The JAIC is not just about delivering the products. We’re really trying to work toward becoming the DoD’s AI Center of Excellence,” said Nathaniel D. Bastian, a senior data scientist and AI engineer with JAIC. “We want to be 1-800-AI.”
Bastian’s remarks came at the 2019 AI and Autonomy symposium of the Association of the United States Army in November. While his background is as an adversarial AI researcher, JAIC itself is not in the business of exploring the frontiers and applications of artificial intelligence. That’s a space best left to the Defense Advanced Research Projects Agency, the Intelligence Advanced Research Projects Activity and the service labs.
Instead, Bastian described JAIC as a connector, a way to match existing research with new applications and new technologies. The concept involves taking what is known now and applying it to concepts and technologies that will be viable and useful sometime in the next four to 20 years.
“It’s one thing to talk about upskilling our current workforce with the skills needed to build AI tools and to use them,” Bastian said. “We’re also looking ahead 10 and 15 years from now. When we’ve reached and achieved our vision of a DoD transformed by AI, what does that look like? What do we need? Are there going to be AI crew chiefs that maintain and deploy it?”
AI is, at heart, an advanced form of software that grants autonomy to machines or processes. This breadth of applications is evident in JAIC’s main lines of effort, mandated by the DoD AI strategy. With a scope of projects ranging from software acquisition to workforce development to ethics, the organization has a heavy workload.
What will most shape the department’s use of AI is how the organization chooses to tackle specific problems within that scope. This is where the direct experience of practitioners hired by JAIC will matter.
“So how do we build robustness into our algorithms that are taking data that can be easily poisoned? It’s very important that we get that right,” Bastian said.
JAIC is focused on bridging the gap between research and implementation, ensuring that the tools it recommends, develops and passes along are assets, not liabilities.
Two examples capture the scope of that work and its immediate applications.
First, in the area of humanitarian assistance and disaster relief, JAIC is working on full-motion video fire detection for drones.
“You have a drone overhead to manage the spread of fire, and they want to predict fire accurately: Where do you think that fire’s going to spread?” Bastian said.
With such a drone in place, the officials managing the firefight can determine how best to allocate personnel to shifting needs, thanks to the input and assessment from the drone. That’s AI in a practical sense: essentially, data from a robot informing where to direct humans.
Bastian also pointed to AI applications in the health field, including suicide prevention.
“We’re looking at medical imagery analysis, as well as helping to try to predict unfitting conditions for soldiers, so they can do medical readiness-specific supports, we can do quicker backfills and also get them into our pre-disability evaluation system,” Bastian said.
Making sure the software can augment the ability of humans, be it in firefighting or a doctor’s office, is the fundamental goal going forward.
There’s one more goal as well: making sure the AI aimed at helping isn’t in fact being manipulated through error or malice to cause harm.
Kelsey Atherton blogs about military technology for C4ISRNET, Fifth Domain, Defense News, and Military Times. He previously wrote for Popular Science, and also created, solicited, and edited content for a group blog on political science fiction and international security.