The U.S. Army is revising its artificial intelligence bill of materials effort following meetings with defense contractors.

The service last year floated the idea of an AI BOM, which would be similar to existing software bills of materials: comprehensive lists of the components and dependencies that make up programs and digital goods. Such transparency practices are championed by the Cybersecurity and Infrastructure Security Agency and other organizations.
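For readers unfamiliar with the practice, a software bill of materials is typically serialized as structured data. The sketch below is a minimal, simplified component list loosely inspired by real SBOM formats such as CycloneDX and SPDX; the field names and package entries are illustrative assumptions, not an official schema.

```python
import json

# Illustrative SBOM-style component list. Real SBOM standards
# (e.g., CycloneDX, SPDX) define richer, standardized schemas;
# the field names here are simplified for the sketch.
sbom = {
    "component": "example-imagery-app",  # hypothetical program
    "version": "2.1.0",
    "dependencies": [
        {"name": "numpy", "version": "1.26.4", "license": "BSD-3-Clause"},
        {"name": "opencv-python", "version": "4.9.0", "license": "Apache-2.0"},
    ],
}

def list_dependencies(bom):
    """Return 'name==version' strings for every listed dependency."""
    return [f"{d['name']}=={d['version']}" for d in bom["dependencies"]]

print(json.dumps(sbom, indent=2))
print(list_dependencies(sbom))
```

The point of such a list is that a consumer can audit every component a piece of software pulls in without seeing its source code.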

The Army is now pivoting to an “AI summary card,” according to Young Bang, the principal deputy assistant secretary for acquisition, logistics and technology. He likened it to a baseball card with useful information available at a glance.

“It’s got certain stats about the algorithm, its intended usage and those types of things,” Bang told reporters at a Pentagon briefing April 22. “It’s not as detailed or necessarily threatening to industry about intellectual property.”
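The Army has not published a format for the summary card. As a purely hypothetical sketch, the at-a-glance "baseball card" stats Bang describes might be captured in a small structure like the one below; every field name and value is an illustrative assumption, not an Army specification.

```python
# Hypothetical AI summary card. The Army has released no schema;
# every field below is an illustrative assumption based on the
# at-a-glance stats described in the article.
ai_summary_card = {
    "model_name": "example-detector",  # hypothetical model
    "intended_use": "vehicle detection in aerial imagery",
    "training_data_summary": "labeled overhead imagery, provenance documented",
    "evaluation": {"precision": 0.91, "recall": 0.88},  # example stats
    "known_limitations": ["degraded performance in low light"],
}

def card_summary(card):
    """One line, at a glance, in the spirit of a baseball card."""
    ev = card["evaluation"]
    return (f"{card['model_name']}: {card['intended_use']} "
            f"(P={ev['precision']:.2f}, R={ev['recall']:.2f})")

print(card_summary(ai_summary_card))
```

Unlike a full AI BOM, a card like this summarizes behavior and intended use rather than enumerating every component, which is why it is less threatening to vendors' intellectual property.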

The Department of Defense is spending billions of dollars on AI, autonomy and machine learning as leaders demand quicker decision-making, longer and more-remote intelligence collection and a reduction of human risk on increasingly high-tech battlefields.

More than 685 AI-related projects are underway across the department, with at least 230 being handled by the Army, according to a Government Accountability Office tally. The technology is expected to play a key role in the XM30 Mechanized Infantry Combat Vehicle, formerly the Optionally Manned Fighting Vehicle, and the Tactical Intelligence Targeting Access Node, or TITAN.

The goal of an AI BOM or summary card is not to reverse engineer private-sector products or put a company out of business, Bang said.

Rather, it would offer greater understanding of an algorithm’s ins and outs — ultimately fostering trust in something that could inform life-or-death decisions.

“We know innovation’s happening in the open-source environment. We also know who’s contributing to the open source,” Bharat Patel, a project lead with the Army’s Program Executive Office for Intelligence, Electronic Warfare and Sensors, told reporters. “So it goes back to how was that original model trained, who touched that model, could there have been poisons or anything?”
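One common, if partial, answer to the provenance question Patel raises is verifying that a downloaded model artifact matches a digest published by its maintainers. The sketch below shows that check with standard-library hashing; the file path and expected digest would come from the model's publisher.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_model(path, expected_digest):
    """True if the artifact on disk matches the published digest."""
    return sha256_of(path) == expected_digest
```

A matching digest only confirms the bytes are unchanged since publication; it says nothing about how the model was trained or whether its training data was poisoned, which is the gap BOM-style provenance records aim to close.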

Additional meetings with industry are planned, according to the Army.

Colin Demarest was a reporter at C4ISRNET, where he covered military networks, cyber and IT. Colin had previously covered the Department of Energy and its National Nuclear Security Administration — namely Cold War cleanup and nuclear weapons development — for a daily newspaper in South Carolina. Colin is also an award-winning photographer.
