The U.S. military is producing more data at the tactical edge, where remote sensors continually capture data feeds. It’s also consuming more data at the edge, where data has the greatest value.

Trouble is, the plural of “data” isn’t “intelligence.” Simply having a lot of data doesn’t mean anyone’s gaining actionable insights from it. Plus, missions need to act on information across domains – land, sea, air, space and cyberspace. To achieve that goal, they need a way to process, analyze and share disparate datatypes.

The good news is that proven technology solutions, from APIs to containers to automation, can help the DoD transform raw data into actionable intelligence in the field and in real time.

JADC2 and the Five V’s of Big Data

The Pentagon’s Joint All-Domain Command and Control (JADC2) initiative emphasizes the strategic importance of data sharing. In the past, the United States benefited from weapons platforms that enabled battlespace dominance. In the future, dominance will be achieved through effective coordination of data across branches and allies.

Achieving dominance through data calls for the military to address the “five V’s” of big data:

Volume – Missions require the compute and networking power to generate, share and consume large data volumes at the edge.

Velocity – They also need to process and analyze data rapidly if they want the information they generate to be accurate and actionable.

Variety – They have to integrate data from multiple, siloed systems and present it to decision-makers in a consistent, usable format.

Veracity – They need data to be accurate, which includes being timely.

Value – This “V” could also stand for “visualization.” For data to have value, missions need to present it in formats that enable accurate decisions and fast action.

Localized, Standardized, Actionable

Traditionally, missions gathered data into a “data lake” for storage and analysis. Today, capturing data at the edge, sending it to a centralized location for processing, and then returning information to the edge for consumption simply takes too long. If data is needed in milliseconds but is only available in minutes, it’s no longer trustworthy.

Instead, missions need to process data right where it will be used. So, in addition to data lakes, they need smaller data ponds and even smaller data pools, plus the ability to move data upstream and downstream. Fortunately, significant compute power is now available in small form factors. Analysis that previously had to occur in a datacenter can now be achieved on lean systems at the edge.
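One way to picture edge processing is summarization: instead of shipping every raw reading upstream to a data lake, a lean edge system reduces them to the compact result that decision-makers actually need. The sketch below is illustrative only; the `SensorReading` type, field names and summary fields are invented for this example, not drawn from any fielded system.

```python
# Illustrative sketch of edge-side data reduction: raw sensor readings
# are summarized locally, and only the small summary moves upstream.
# All names here (SensorReading, summarize) are hypothetical.
from dataclasses import dataclass
from statistics import mean

@dataclass
class SensorReading:
    sensor_id: str
    timestamp_ms: int
    value: float

def summarize(readings):
    """Reduce a window of raw readings to the summary needed upstream."""
    values = [r.value for r in readings]
    return {
        "count": len(values),
        "mean": mean(values),
        "max": max(values),
        "window_end_ms": max(r.timestamp_ms for r in readings),
    }

# A thousand raw readings become one small dict, sized for a tactical link.
readings = [SensorReading("sensor-1", t, float(t % 7)) for t in range(1000)]
summary = summarize(readings)
```

The same pattern scales up and down the lake/pond/pool hierarchy: each tier forwards summaries, not raw feeds, to the tier above it.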

Weapons systems in each domain were designed and built within silos. To share data across domains, missions also need to standardize that data.

Previously, missions performed centralized data integration, translating every incoming datatype into every outgoing datatype. But if you need to integrate 100 datatypes, say, you end up with close to 5,000 pairings, each requiring its own translation. Start adding more datatypes, and the number of translations grows with the square of the count.

The solution is data interchange based on an application programming interface (API). At the point of production, a sensor generates data in its own proprietary format and translates it into a standardized API format. At the point of consumption, an edge system receives data in the API format and translates it into its own proprietary format. That way, each datatype requires only two translations — one in, one out — regardless of how many other datatypes exist.
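The hub-and-spoke idea above can be sketched in a few lines. The formats and field names below are made up for illustration — no real sensor or display protocol is implied — but the arithmetic at the end shows why the approach scales: roughly 5,000 pairwise mappings collapse to 200 per-datatype translators.

```python
# Minimal sketch of API-based interchange: each system implements only a
# translation to or from a common format, never to every other system.
# The "radar" and "display" formats here are invented for illustration.

def radar_to_common(msg):
    """Producer side: proprietary sensor format -> common API format."""
    return {"type": "track", "lat": msg["LAT"], "lon": msg["LON"]}

def common_to_display(common):
    """Consumer side: common API format -> this system's own format."""
    return f"Track at {common['lat']:.4f}, {common['lon']:.4f}"

raw = {"LAT": 35.0527, "LON": -78.8784}      # proprietary radar message
text = common_to_display(radar_to_common(raw))

# Translation counts for n datatypes:
n = 100
pairwise = n * (n - 1) // 2   # ~5,000 point-to-point pairings
hub_and_spoke = 2 * n          # 200 translators via the common format
```

Adding datatype 101 under the pairwise scheme means writing 100 new mappings; under the API scheme it means writing two.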

Applications and Automation at Scale

Three other technologies can aid the military in its quest to share data across missions:

Containerization – Containerization packages an application together with its dependencies and configuration in a single unit, or container. This approach enables you to quickly and easily move an application from one computing environment to another.

Kubernetes – Kubernetes is an open-source system for orchestrating multiple containers. Kubernetes can enable the military to rapidly deploy applications that generate and consume data in the cloud and in a lightweight framework at the edge. A key strength of containers and Kubernetes is the ability to dynamically spin up and take down applications and their associated data as missions change. And if an application process fails, Kubernetes can immediately launch a new instance of it so that missions aren’t interrupted.

IT automation platform – With multi-domain operations, the available data assets change from one conflict to the next. So, missions must be able to dynamically configure networks and integrate changing components across those networks. But it’s simply too complex and time-consuming to do that manually.

An IT automation platform offers an effective solution, and the technology to achieve it is available today. A military-scale enterprise framework for automated IT deployment and operation can extend from the cloud to the edge. Each command can deploy the field systems and components it needs, and automatically integrate with a standard architecture or API. That way, commands can maintain their unique field capabilities while supporting the broader, standardized whole.
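The core property that makes automated IT deployment workable at military scale is idempotence: a desired state is declared once, and applying it repeatedly converges every node to that state instead of requiring manual, per-node steps. The sketch below illustrates that property in plain Python; it is not the API of any specific automation product, and the configuration keys are invented.

```python
# Illustrative sketch (not any product's API) of declarative, idempotent
# configuration in the style of an IT automation platform. Keys and
# values below are hypothetical.

desired_state = {
    "api_endpoint": "https://example.invalid/v1",  # placeholder URL
    "log_level": "INFO",
}

def apply_state(node_config, desired):
    """Converge a node's config to the desired state; report what changed."""
    changed = {k: v for k, v in desired.items() if node_config.get(k) != v}
    node_config.update(changed)
    return changed  # an empty dict means the node was already compliant

node = {"log_level": "DEBUG"}
first = apply_state(node, desired_state)   # brings the node into compliance
second = apply_state(node, desired_state)  # no-op: already compliant
```

Because re-applying the state is safe, a command can rerun the same automation against a changing mix of field systems and only the drifted ones are touched.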

Ambitious Defense Department initiatives such as JADC2 involve challenges. But APIs, containers and automation can position missions to benefit from actionable intelligence at the edge and in real time. By doing so, they can help the military combine the strengths of each service to achieve data dominance in the battlespace.

Christopher Yates is DoD Army chief architect for Red Hat, a Raleigh, North Carolina-based software company that provides open source products to companies and governments.
