Over the past several years, the U.S. Department of Defense and the military services have spent enormous amounts of time extending weapons ranges and creating new sensors, algorithms and hardware to support the next generation of warfighting. With these new capabilities, interconnectivity complexities increase exponentially, frequently due to siloed invention and disparate integration.
Because operational environments are neither pristine nor uniform across formations, these variants wreak havoc on G2 intelligence-gathering and G6 cyber operations teams during exercises and combat. The department needs to focus on data logistics, specifically connectivity and hierarchy, to succeed.
Data is the lifeblood of any digital transactional or analytical system. Getting data to systems or sensors becomes a critical pathway to success for any digital organization. For this article, data logistics means all efforts necessary to move data from source systems to their destinations.
Sometimes this means many endpoints, depending on the type of data. Other times it means moving data back to the source for further gathering, or messages that trigger actions. While data logistics differs from physical logistics in the details, the two are similar at an abstract level.
A promising approach is Zhamak Dehghani’s “data mesh.” The principles of data mesh include domain-oriented decentralized data ownership and architecture, data as a product, self-serve data infrastructure as a platform, and federated computational governance. These principles align with the Program Executive Offices’ independent structure.
However, appropriate enterprise technologies must exist for data mesh to be successful. These include architectural patterns, data catalogs, change data capture, immutable audit logs and data product APIs. At the core is the event streaming backbone, which shuttles data to interested systems and applications.
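To make the “data as a product” and data-catalog ideas concrete, here is a minimal sketch in Python. The class names, field names and endpoints are illustrative assumptions for this article, not any DoD or data-mesh standard: each dataset is registered with an owning domain, a published schema and an API endpoint, so consumers can discover it themselves.

```python
from dataclasses import dataclass

@dataclass
class DataProduct:
    """Catalog entry treating a dataset as a product: it carries an
    owning domain, a schema contract and an access endpoint.
    All names here are hypothetical examples."""
    name: str
    domain: str    # decentralized ownership: the producing domain
    schema: dict   # published contract consumers can rely on
    endpoint: str  # the data product API

class DataCatalog:
    def __init__(self):
        self._products = {}

    def register(self, product: DataProduct):
        self._products[product.name] = product

    def discover(self, domain: str):
        # Self-serve discovery: list every product a domain publishes.
        return sorted(p.name for p in self._products.values()
                      if p.domain == domain)

catalog = DataCatalog()
catalog.register(DataProduct("track-reports", "fires",
                             {"id": "str", "grid": "str"}, "/api/v1/tracks"))
catalog.register(DataProduct("logistics-status", "sustainment",
                             {"unit": "str"}, "/api/v1/logstat"))
print(catalog.discover("fires"))  # ['track-reports']
```

The point of the sketch is the governance shape, not the code: ownership and the schema contract travel with the data, so a consuming system never has to ask the producing organization how to interpret it.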
Commercial cloud providers, well entrenched in the DoD, already have these capabilities integrated into their offerings. AWS offers Kinesis and Azure provides Stream Analytics, to name two. One industry standard is Kafka, an open-source project also available as a managed service from cloud providers or Confluent.io. Many companies have adopted Kafka due to its in-stream processing capabilities that provide real-time analytics for rapid decision-making and event triggers for humans and machines.
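The publish-subscribe pattern behind an event streaming backbone can be sketched in a few lines. This is a toy in-memory stand-in, not Kafka itself: the topic name, the consumers and the priority field are invented for illustration. The shape is the important part; any system that registers interest in a topic receives every event published to it.

```python
from collections import defaultdict

class EventBackbone:
    """Toy in-memory stand-in for an event streaming backbone:
    topics with fan-out to every subscribed consumer."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> handlers

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan the event out to every system that registered interest.
        for handler in self._subscribers[topic]:
            handler(event)

# Two notional consumers of the same stream: an analytics log
# that keeps everything, and an alerting hook that keeps only
# high-priority events (in-stream filtering for event triggers).
analytics_log, alerts = [], []
bus = EventBackbone()
bus.subscribe("sensor.track", analytics_log.append)
bus.subscribe("sensor.track",
              lambda e: alerts.append(e) if e["priority"] == "high" else None)

bus.publish("sensor.track", {"id": "T-001", "priority": "low"})
bus.publish("sensor.track", {"id": "T-002", "priority": "high"})
print(len(analytics_log), len(alerts))  # 2 1
```

In a real deployment the backbone also provides durability, replay and partitioned scaling, which is what separates Kafka-class systems from a simple message bus.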
Private industry has solved many problems related to the rapid onboarding of new sensors and sites. The use of automation from networks to applications will enable the Services and OSD to rapidly create the logical integrations necessary to achieve Joint All-Domain Command and Control, or JADC2. As units move to support combatant commands, they need to reduce the time it takes their assets to connect. Additionally, there needs to be a focus on automating connections between data distribution nodes and any relevant data backbones.
DoD needs to press forward with a technical strategy requiring current and future systems to be discoverable. An example is reusing the presence function found in messaging systems and applying it to HIMARS launchers, F-35s, Abrams tanks and TPQ-53 radars.
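The messaging-style presence function mentioned above can be sketched as a heartbeat registry. This is a conceptual illustration only; the asset names, the 30-second time-to-live and the interface are assumptions, not drawn from any fielded system. Each asset reports periodic heartbeats, and anything silent longer than the time-to-live drops off the roster of reachable systems.

```python
import time

class PresenceRegistry:
    """Toy presence tracker: assets report heartbeats, and anything
    silent longer than `ttl` seconds is considered unreachable.
    Asset names and the ttl value are illustrative assumptions."""
    def __init__(self, ttl=30.0):
        self.ttl = ttl
        self._last_seen = {}  # asset_id -> timestamp of last heartbeat

    def heartbeat(self, asset_id, now=None):
        self._last_seen[asset_id] = time.time() if now is None else now

    def online(self, now=None):
        now = time.time() if now is None else now
        return sorted(a for a, t in self._last_seen.items()
                      if now - t <= self.ttl)

reg = PresenceRegistry(ttl=30.0)
reg.heartbeat("HIMARS-1", now=0.0)   # launcher checks in at t=0
reg.heartbeat("F35-2", now=20.0)     # aircraft checks in at t=20
# At t=40 the launcher's last heartbeat is 40s old, beyond the 30s
# ttl, so only the aircraft remains discoverable.
print(reg.online(now=40.0))  # ['F35-2']
```

The design choice worth noting is that discoverability is pushed by the asset, not polled by a central node, which is what lets a roster stay current as formations move.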
The Army’s Unified Network Operations has made positive efforts to improve connectivity timelines. A key opportunity exists with zero-trust security. By divesting the network edge, services at every echelon can become interoperable at a more flexible technical level.
This will require unified efforts across the DoD. Doing so will synchronize internal technical efforts and policies that will onboard systems and sensors and allow, for example, an F-35 and HIMARS to talk directly to each other without human intervention.
Another focus necessary to enable rapid data logistics is rearchitecting the hierarchical distribution model. According to Conway’s Law,
“Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization’s communication structure.” — Melvin E. Conway
A shift in thought and design for data distribution architectures is necessary to enable rapid decision-making by humans and systems. The latency between systems and sensors will be critical, even when robust networks and computing exist.
Flattening these architectures will also reduce digital choke points for data moving inter-echelon and intra-echelon. Designing our systems to replicate a flat and interconnected organizational model is necessary for minimizing these risks.
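The latency argument for flattening can be made concrete with hop counts. The sketch below uses invented unit names and a breadth-first search: in a strict hierarchy, two company nodes can only exchange data by routing up through battalion and brigade, while one lateral link in a flattened mesh cuts the path to a single hop.

```python
from collections import deque

def hops(adjacency, src, dst):
    """Breadth-first search hop count between two nodes."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nxt in adjacency.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # unreachable

# Strict hierarchy: companies reach each other only via bn and bde.
hierarchy = {
    "bde": ["bn1", "bn2"],
    "bn1": ["bde", "co1"], "bn2": ["bde", "co2"],
    "co1": ["bn1"], "co2": ["bn2"],
}
# Flattened mesh: same nodes plus one direct lateral link.
mesh = {k: list(v) for k, v in hierarchy.items()}
mesh["co1"].append("co2")
mesh["co2"].append("co1")

print(hops(hierarchy, "co1", "co2"))  # 4 hops up and back down
print(hops(mesh, "co1", "co2"))       # 1 hop laterally
```

Every hop removed is one less relay that adds latency and one less choke point that can be degraded or targeted.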
For the Pentagon to enable data for decision-making by a human, sensor or system, the DoD must focus on data logistics. The DoD must reduce the time for connections at the network, data, application and system levels. Using industry-proven technologies and architectures enables warfighters and decision-makers, whether human or machine, by improving the resiliency of the entire system.
Jock Padgett is the Chief Data Officer and Senior Digital Advisor at the XVIII Airborne Corps, US Army. During the Afghanistan withdrawal and Ukraine support mission, he served as the Chief Technology Officer at the 82nd Airborne Division.
The opinions expressed are his and do not represent the Department of Defense’s position or approach.
Have an Opinion?
This article is an Op-Ed and the opinions expressed are those of the author. If you would like to respond, or have an editorial of your own you would like to submit, please email C4ISRNET Senior Managing Editor Cary O’Reilly.