The Defense Department’s Office of the Chief Digital and Artificial Intelligence Officer (CDAO) expects to publish guidance in the next two months to help implement the department’s new Data, Analytics and AI Adoption Strategy, which codifies a shift to a decentralized approach to AI and machine learning operations to take advantage of industry advancements over the past few years.

The forthcoming guidance will not look like a traditional implementation plan, but rather a set of best practices outlining patterns for carrying out the new strategy, Craig Martell, the chief digital and AI officer, told reporters last Thursday following the release of the document.

“Each of the services have wildly different needs and they’re at wildly different points in their journey, and they have wildly different infrastructure,” Martell said. “So, we’re going to insist on patterns of shareability, patterns of accessibility, patterns of discoverability. And how those are implemented we’re going to allow a lot of variances for.”

Andrew Peppler, the senior strategy and policy analyst for the Office of the CDAO, said the reason for the best practices approach to the implementation plan is “because we need to think differently about agile approaches to adoption.”

Martell’s office is overseeing implementation of the strategy. He said there are different timelines for different goals, citing long-lasting “marquee use cases” such as providing a “dashboard” so that the deputy secretary of defense can routinely assess metrics associated with AI adoption.

A second use case is around Combined Joint All Domain Command and Control (CJADC2), which will be enabled by the data integration layer, he said.

“And we’re using that data integration layer really as a pathfinder demonstration of the right way to get data right, make it accessible and get it to the places it needs to be timely,” Martell said. “With respect to CJADC2, it really is about the combatant commanders getting the right situational awareness to make the right decisions quickly.”

The strategy puts a focus on the “DoD AI Hierarchy of Needs,” a five-level pyramid founded on quality data, with a governance layer above it to ensure that data is trusted. Those bottom levels point to the need for quality, trusted data, the linchpin of analytics and AI.

“If you do not have high quality data, AI is a fool’s errand,” Martell said. Still, he added, there will be an “iterative approach” to improving data quality over time.

The “insightful analytics and metrics” layer is next and requires the domain expertise of respective DoD leaders, according to the strategy.

“And analytics is in that layer because I think I’ve said this publicly before, but you know, 60 to 75 percent of the use case demands that I’ve seen for AI aren’t really demands for AI, they’re demands for high quality data and visibility into what the data is saying, i.e., analytics,” Martell said.

“Responsible AI” tops the pyramid and refers to “the Department’s dynamic approach to the design, development, and use of AI capabilities that is consistent with the DoD AI Ethical Principles,” the 26-page strategy says.