Great power competition is upon us. Our adversaries are preparing for sophisticated conflicts across multiple domains, employing communications-denial and information-advantage capabilities. This fundamental shift in warfare requires a change in the way the U.S. military prepares and positions for conflict. Winning future conflicts depends on near real-time decision-making informed by data that knows no domain, organizational, or national boundaries.
In a recent speech, Chief of Naval Research Rear Admiral Lorin Selby said “data is the new oil. Software is the new steel.” This is a brilliant analogy for the importance of information and advanced software needed to fuel our future warfighter and achieve Joint All Domain Command and Control (JADC2). This sentiment was reinforced in the recently released Department of Defense (DoD) JADC2 Strategy.
As DoD continues to advance JADC2, it must contend with how to create a truly joint approach while avoiding the pitfalls of past attempts at joint systems. Over the past two decades, DoD has made multiple, largely unsuccessful attempts at creating joint systems. These systems were criticized as monolithic, failed to meet service-specific requirements, and struggled to achieve widespread user adoption.
The fundamental flaw with previous attempts stemmed largely from culture and bureaucracy. The way services organize, train and equip their forces, and how they compete for resources to do so, resulted in individual and often duplicative efforts rather than interoperable, enterprise capabilities. In other words, each service created a new system or platform to solve its piece of the joint puzzle and then relied on the users, often at the combatant command (COCOM) and warfighting level, to stitch together a joint solution.
Lessons from these attempts point to a pragmatic solution to the joint problem: the department should create a system that connects individual data streams, pay the services to feed their existing data into it, and fund testing to make sure the connections work.
Such joint government-owned Application Programming Interfaces (APIs) would connect service-specific data, forming the foundation for command and control to rapidly communicate intel and directives to the warfighter. Bringing this concept to life will require DoD to take three actions: create understandable standards that support connection of existing data; resource connection of the data to the system; and fund joint testing to make sure the connections work.
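As a rough sketch of the first of those actions, a joint API could define a common data contract that each service maps its native records into. The schema, field names, and adapter below are purely illustrative assumptions for this article, not an actual DoD specification:

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical joint C2 track schema -- illustrative only, not a real DoD spec.
@dataclass
class JointTrack:
    track_id: str
    source_service: str   # e.g., "ARMY", "NAVY", "USAF"
    domain: str           # e.g., "air", "land", "maritime"
    lat: float
    lon: float
    timestamp: str        # ISO 8601, UTC

def to_joint_track(navy_record: dict) -> JointTrack:
    """Adapter: map a (notional) service-specific record into the joint schema."""
    return JointTrack(
        track_id=navy_record["contact_id"],
        source_service="NAVY",
        domain="maritime",
        lat=navy_record["position"]["latitude"],
        lon=navy_record["position"]["longitude"],
        timestamp=navy_record["observed_at"],
    )

# Any consumer that understands the joint schema can parse the result,
# regardless of which service produced the original record.
record = {
    "contact_id": "N-1042",
    "position": {"latitude": 24.5, "longitude": 122.1},
    "observed_at": "2022-06-01T12:00:00Z",
}
print(json.dumps(asdict(to_joint_track(record))))
```

The point of the sketch is the division of labor: each service owns its internal data model and the adapter, while the joint schema is the only thing every party must agree on.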
Create a set of joint C2 APIs and support services to build data fabrics and enable connection through the APIs.
DoD should develop a joint set of command and control APIs and empower each service to build its own data fabric solution, connecting across domains through the joint APIs that allow data to flow between services. This approach enables each service to own and maintain the data fabric for its requirements while creating seamless interoperability across services. Without adherence to this approach, we will see history repeat itself. Already, services are beginning to build multiple data fabrics, even within each service. To date, we are tracking 10 data fabric solutions across DoD. Where possible, these should be limited to a single command and control data fabric per service for better interoperability and connection.
The Army’s Rainmaker Common Data Fabric prototype shows the promise of this type of API-driven connection, fusing the Air Force’s F-35 data with the Army’s tactical edge. This type of cross-service connection lets the services maintain their own data fabrics for their unique needs while still connecting for joint missions.
Create a framework of adoption through resourcing and governance for each service to plug in.
As the department seeks to unify the tech stack for the services, it is critical to create a framework for compliance. The framework should include the policy and regulatory backbone to drive service adoption. It could be modeled after the Modular Open Systems Approach (MOSA) requirements in the National Defense Authorization Act (NDAA) or the recent data decree from the Deputy Secretary, while extending guidance to missing implementation details such as the APIs. Ideally, it would both incentivize adoption by tying it to funding, as the MOSA model does, and enforce it through policy directive.
To enable the framework from a policy perspective, DoD, with leadership from the Air Force as the Executive Agent for JADC2, should develop a set of Joint Command and Control APIs and a common reference architecture that allows other services to easily understand and achieve interoperability through the APIs. This approach is analogous to the TCP/IP specification, the backbone of the internet today, which DoD developed five decades ago through its ARPANET research.
Execute joint exercises to test interoperability.
It will take more than theory and technology to confirm the functionality of APIs under pressure. Starting with sensor-to-effector mission threads, the department should continue to robustly test how JADC2 works both across the services and with coalition partners. DoD leaders have noted that the Army’s Project Convergence exercise allowed DoD to take technology out of the lab and test it in live scenarios. To date, that exercise has had limited international participation. Extended technical integration with allies in Europe and the Pacific Rim will be critical in the years to come.
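Part of what such exercises should institutionalize is automated conformance checking of the messages exchanged over the joint APIs. The sketch below shows the idea; the required fields and rules are hypothetical assumptions, not drawn from any real JADC2 specification:

```python
# Hypothetical conformance check for messages exchanged over a joint API.
# Field names and type rules are illustrative assumptions, not a real standard.
REQUIRED_FIELDS = {"track_id": str, "source_service": str, "domain": str,
                   "lat": float, "lon": float, "timestamp": str}

def check_conformance(message: dict) -> list:
    """Return a list of violations; an empty list means the message conforms."""
    violations = []
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in message:
            violations.append(f"missing field: {field}")
        elif not isinstance(message[field], ftype):
            violations.append(f"wrong type for {field}: expected {ftype.__name__}")
    return violations

good = {"track_id": "A-7", "source_service": "ARMY", "domain": "land",
        "lat": 35.0, "lon": 128.0, "timestamp": "2022-06-01T12:00:00Z"}
bad = {"track_id": "A-8", "lat": "35.0"}

print(check_conformance(good))  # []
print(check_conformance(bad))   # lists the missing fields and the type error
```

Run against live message traffic during an exercise, a check like this turns interoperability from an assertion into a measurable result.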
As noted in a recent article in Defense News, the state of interoperability between Taiwanese and American forces is well understood by both the U.S. and our adversaries. Creating exercises that allow us to identify opportunities to connect DoD and coalition partners becomes that much more important given the escalation of great power competition.
As the department continues to grapple with the “how” of JADC2, and as services extend both their data fabrics and JADC2 systems, now is the time to create the conditions for cross-service data sharing.
Greg Wenzel is an executive vice president at Booz Allen Hamilton.