IT and Networks

The Pentagon wants big-data analytics in every rucksack

In the (near) future, troops on the ground will check their kit to make sure everything is accounted for as they head into battle, and the most critical tool will be data access.

That is, at least, what some future-focused tech officials at the Pentagon say, pointing to the growing importance of cloud capabilities and access to data at the tactical edge. To meet that growing appetite for data, officials are pursuing pilot programs and developing much-needed tools, such as algorithms that turn the volumes of data pouring in from sources like unmanned aerial systems’ video feeds into finished intelligence products.

“There are easier problems to start with than creating an algorithm for full-motion video footage … but I’m surprised by how good we’re doing,” said Dr. Travis Axtell, Office of the Under Secretary of Defense (Intelligence), speaking Dec. 12 at an online event held by Amazon Web Services.

“We’re showing we can augment the analyst and give them new functionality for things they’re doing over several hours and now can do in a handful of minutes. And now we want to start applying this in a historical context … not just monitoring feeds.”
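One way that kind of analyst augmentation can work is automated triage of full-motion video: flag only the segments where the scene changes, so a reviewer watches minutes of footage instead of hours. The frame-differencing sketch below is an illustration of that general idea, not a description of any actual DoD algorithm.

```python
# Illustrative only: flag video frames that differ meaningfully from the
# previous frame, so an analyst can skip long stretches of static footage.

def changed_frames(frames, threshold=10.0):
    """Return indices of frames whose mean per-pixel difference from the
    previous frame exceeds `threshold`. Each frame is an equal-length
    sequence of grayscale pixel values."""
    flagged = []
    for i in range(1, len(frames)):
        diff = sum(abs(a - b) for a, b in zip(frames[i], frames[i - 1]))
        if diff / len(frames[i]) > threshold:
            flagged.append(i)
    return flagged

static = [0] * 64    # an unchanging scene
moving = [200] * 64  # the scene after something enters the frame
print(changed_frames([static, static, moving, moving]))  # → [2]
```

Real pipelines use far more sophisticated computer-vision models, but the payoff is the same: the machine narrows hours of feed down to the moments worth a human's attention.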

The Navy is pursuing a handful of pilot programs aimed at making better sense of huge volumes of data, as well as making that data available and usable in tactical environments where access to the cloud — where all that data is stored — isn’t a given.

In one case, the service is working with the National Geospatial-Intelligence Agency to move massive data sets, in some cases terabytes or even exabytes — “data so large that if I tried to stream it over a satellite connection, it would take weeks and months,” said Andy Farrar from the Digital Warfare Office in the Office of the Deputy Chief of Naval Operations for Information Warfare.

“The Navy is looking at how to do it smartly so we can take new algorithms … to take data to the tactical edge and bring it back, including large data sets,” Farrar said.
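Farrar's "weeks and months" point is easy to sanity-check with back-of-envelope arithmetic. The link speed and data sizes below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope: how long it takes to stream a data set over a
# satellite link of a given bandwidth. Figures are illustrative.

def transfer_days(data_bytes: float, link_mbps: float) -> float:
    """Days needed to move `data_bytes` over a `link_mbps` megabit/s link."""
    seconds = data_bytes * 8 / (link_mbps * 1e6)  # bytes -> bits -> seconds
    return seconds / 86_400

TB = 1e12  # one terabyte in bytes
for size_tb in (1, 10, 100):
    days = transfer_days(size_tb * TB, 50)  # assume a 50 Mbit/s satellite link
    print(f"{size_tb:>3} TB: {days:.0f} days")
```

At an assumed 50 Mbit/s, tens of terabytes already take weeks and a hundred terabytes roughly half a year — which is why physically moving storage, rather than streaming, becomes the practical option at these scales.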

One major challenge is the use of legacy systems, or even systems that just aren’t optimized for big-data analytics, Farrar said. He noted a growing focus on data tagging and collecting the “right” kind of data.

“Just collecting really good data is a challenge. As different systems grew up they didn’t take data into account separate from the rest of systems,” he said. “We’re finding now that as we try to look at that data and apply analytics … your data is never as good as you think it is.”
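Data tagging of the kind Farrar describes amounts to attaching provenance metadata — which system produced a record, when, and under what conditions — at collection time, so analytics can later judge whether the data is as good as it looks. The schema below is a hypothetical sketch, not any Navy system's format:

```python
# A minimal sketch of tagging data at collection time. The field names
# and tag vocabulary here are hypothetical, for illustration only.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class TaggedRecord:
    payload: bytes                 # the raw collected data
    source: str                    # which sensor or system produced it
    collected_at: str              # UTC timestamp of collection
    tags: dict = field(default_factory=dict)  # free-form provenance tags


def collect(payload: bytes, source: str, **tags) -> TaggedRecord:
    """Wrap raw data with provenance metadata the moment it is collected."""
    return TaggedRecord(
        payload=payload,
        source=source,
        collected_at=datetime.now(timezone.utc).isoformat(),
        tags=tags,
    )


rec = collect(b"\x00\x01", source="uas-video-42", sensor="EO", quality="unverified")
```

The point of tagging at the edge rather than after the fact is that provenance is cheap to capture at collection and nearly impossible to reconstruct later — the gap Farrar describes in systems that "didn't take data into account."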

Jim Caggy, senior manager for Department of Defense solutions at Amazon Web Services, said AWS is working on different efforts that extend cloud capabilities, data analytics and intelligence to tactical environments.

“You can’t forklift a data center and drop it into a combat zone,” Caggy said. He highlighted efforts such as AWS’ Snowball platform, which offers 100 terabytes of storage and base compute functionality, and AWS Greengrass, with which the company has done pilot programs on ground vehicles to track telemetry data. He also said AWS is working on a mobile app where data can be queried locally — such as at sea — but all the data gets synchronized back to the cloud.
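The local-query-then-synchronize pattern Caggy describes can be sketched as a store that answers reads offline and buffers writes until a link comes back. This is an illustration of the pattern, not AWS's actual API:

```python
# Illustrative sketch of edge storage with deferred cloud sync (not an
# AWS API): queries are served locally; writes queue up until a link
# to the cloud becomes available.
class EdgeStore:
    def __init__(self):
        self.local = {}    # locally cached data, queryable with no link
        self.pending = []  # writes buffered for later upload

    def put(self, key, value):
        self.local[key] = value
        self.pending.append((key, value))  # remember to sync this later

    def query(self, key):
        return self.local.get(key)  # answered locally, e.g. at sea

    def sync(self, cloud: dict):
        """Replay buffered writes upstream once connectivity returns."""
        for key, value in self.pending:
            cloud[key] = value
        self.pending.clear()


store = EdgeStore()
store.put("contact-17", {"lat": 36.8, "lon": -76.3})
print(store.query("contact-17"))  # works offline

cloud = {}
store.sync(cloud)  # later, when the satellite link is up
```

Real systems layer on conflict resolution and ordering guarantees, but the core trade is the same: full local usability now, eventual consistency with the cloud later.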
