The Pentagon on Thursday released its first enterprise-wide data strategy, which aims to set a new course for the capabilities and standards needed to manage the massive amounts of data required to enable future joint all-domain operations.

The new strategy has been in development for over a year and lays out specific goals for pursuing cloud-enabled data architectures, improving the use of data for training AI algorithms and adopting a new approach that recognizes data “as a strategic asset.”


Dave Spirk, DoD’s chief data officer, led the effort to put together the strategy, which he said in a statement is meant “to acknowledge that data is the ammo of the future.”

Air Force Gen. John Hyten, the vice chairman of the Joint Chiefs of Staff, said last month that senior Pentagon leadership wants to solve the department’s data management challenges by 2030 (Defense Daily, Sept. 24).

A defense official told reporters the strategy is aimed at compressing that timeline by focusing on new principles and capabilities for machine-to-machine data processing to rapidly solve current data challenges.

“Righteously, over the last year attention and demand for near-real time data-driven decision making has increased from our senior leaders in the Pentagon to our most tactical, lethal formations deployed today in harm’s way around the world. It’s not easy, and delivering the capability requires focus, prioritization and investment,” a defense official said. “As we look to migrate to enterprise cloud, conduct operations in contested and complex environments across multiple domains, as we have ever more sensors and IT systems, having effective data management is vital.”

The strategy is considered a critical piece to enabling the future Joint All Domain Command and Control (JADC2) concept, which aims to synthesize data from weapon systems across the services to enable multi-domain operations.

The document also pushes for future capabilities to be developed with data interoperability as a key requirement to ensure JADC2 can be operationalized.

“When new data gaps are identified, the data governance community must work with mission area managers to determine whether changes are needed to hardware; software; tactics, techniques, and procedures; or risk acceptance. The mitigation of many legacy systems is not cost-effective, making it imperative that all future systems are procured with data interoperability, software upgradability, and cloud-readiness as requirements,” officials wrote in the document.