“What happens next?”
That’s a fundamental question facing military leaders every day, often followed by “what are the repercussions of a given action?” and “if our unit makes this move, how will that play out?”
These are the big picture dilemmas that defense technologists believe they can begin to answer. “We are not building tools that will predict the future,” said Steve Jameson, a program manager in DARPA’s Information Innovation Office. “But we are building tools that will give a range of likely outcomes, that will allow the user to explore their options and will give some explanation: Here’s why this particular approach will yield this outcome.”
In tactical terms, such information would be worth its weight in gold. That’s why DARPA is mining for that gold through a $4.2 million Phase 1 contract awarded by the U.S. Air Force Research Laboratory to BAE Systems.
Formally known as Causal Exploration of Complex Operational Environments, DARPA’s project seeks to assimilate wide swaths of previously inaccessible information in order to develop more complete intel and predict potential outcomes.
“It’s intended to underpin the relatively new doctrinal concept of operational design. That’s an activity aimed at developing deeper insight around an operational environment, understanding root causes of the scenario and developing a suitable operational approach,” said Jonathan Goldstein, senior principal scientist in BAE Systems’ Autonomy, Controls, and Estimation group. BAE’s proposed solution goes by the acronym CONTEXTS, short for Causal Modeling for Knowledge Transfer, Exploration, and Temporal Simulation.
Military leaders typically have plenty of sources of information, but they are often unable to ferret out useful intel in a timely way.
“There may be intelligence reports, government and NGO databases, statistics that can be analyzed. There are subject matter experts who may have knowledge,” Goldstein said. “The challenge is that new crises can pop up very fast, and you need to come up with a deep, nuanced understanding of the operational environment in a short amount of time.”
Scientists on this project want to pull together that diverse data in a graphical interface, an at-a-glance type system that would allow commanders to adjust key variables in order to see how outcomes might be affected.
“Each approach that you take alters the model. If you increase security in this area you can see how that affects other parts of the model. If you focus on the economy, you can see how that impacts the model,” Goldstein said. “It’s a way of considering the most appropriate approach.”
DARPA is looking to such “causal exploration” to enable military leaders to navigate their way through increasingly intricate situations.
“In the complexity of today’s operational environment, often it is not easy to tell what the military objective is that we should be targeting, or even whether there is a military objective,” Jameson said. “We can see situations where we have unanticipated outcomes. We attempt to solve a problem but we don’t address all the underlying causes, and so the problem comes back or we pick the wrong problem to address.”
In the emerging solution, algorithms would parse data looking for key indicators: the local level of violence, sentiment toward the government, services available to the citizenry. All this gets plotted and charted, with mathematical models representing the likely interdependencies between the variables.
“They would literally click on a node in the graph and press the arrow up to increase this variable or decrease that one. You get the outcomes in the form of a graph, or you can get the outcome in a narrative form saying this outcome was reached, or that outcome was not reached,” Goldstein said.
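The interaction Goldstein describes — nudge one node and watch the change ripple through the graph — can be sketched in a few lines. This is a toy linear structural model, not the CONTEXTS implementation: the variable names, edge weights, and propagation rule are all illustrative assumptions.

```python
# Toy causal-graph "what if" exploration. Directed edges map a cause to
# (effect, weight) pairs; a negative weight means the effect moves in the
# opposite direction from the cause. All names/weights are hypothetical.
EDGES = {
    "security_presence": [("violence", -0.8)],
    "violence":          [("gov_sentiment", -0.5), ("services", -0.3)],
    "services":          [("gov_sentiment", 0.6)],
}

def propagate(intervention: str, delta: float) -> dict:
    """Push one variable's change through the graph (assumed acyclic)
    and return the accumulated effect on every downstream variable."""
    effects = {intervention: delta}
    frontier = [(intervention, delta)]
    while frontier:
        var, change = frontier.pop()
        for child, weight in EDGES.get(var, []):
            child_change = change * weight
            effects[child] = effects.get(child, 0.0) + child_change
            frontier.append((child, child_change))
    return effects

# "Click the node and press the arrow up": increase security presence.
for var, change in sorted(propagate("security_presence", 1.0).items()):
    print(f"{var:20s} {change:+.2f}")
```

Even this toy version shows the point of the interface: increasing security presence lowers violence directly, which indirectly lifts government sentiment through two paths (less violence, better services), and the tool surfaces those knock-on effects instead of leaving them implicit.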
The exact form of the outputs is still under development, but the aim is to give commanders a view that is clearly legible and easy to digest. The key here is to present commanders with workable alternatives.
“The understanding that the commander gets from using the tool will be incredibly important,” Goldstein said. “The point is not just to suck in data and create a great model. It’s about making that information understandable, modifiable and transparent. The user really owns that model. That’s where the value lies.”
Causal exploration could help commanders to cut through the uncertainty, but that may not happen for some time yet. BAE’s four-year effort is slated to deliver in 2021.