Sometimes soldiers earn a battlefield advantage by seeing things differently. Night vision goggles, for instance, give fighters an edge by extending their sight beyond the normal visual realm.

But scientists at Harris Corp. say they are making headway on another type of enhanced visualization. “Hyperspectral” imaging looks beyond ordinary colors to see the unique electromagnetic energy emitted by every object or material — whether solid, liquid, or gas.

“Every manmade or natural material has a signature that can be unique. Hyperspectral allows us to see that, even in very small samples,” said Greg Terrie, a principal scientist from the Harris geospatial solutions team. “If someone hides a tank in the woods, all I need to do is get a pixel on that item. This technology will allow me to see things I could not see with the naked eye or even with a broadband sensor.”

The military has been watching developments in hyperspectral for some time. As early as 2005, Army documents predicted this type of sensing could be used to enhance robotic navigation, allowing unmanned vehicles to cover ground “more safely with increased speed.”

The technology has made strides in recent years. “There’s an emerging market for hyperspectral sensors in general,” an official from NIST’s Physical Measurement Laboratory said in a Navy news release in November 2017. “They’re becoming more sophisticated, and this is a component to help them be a more robust product in an increasingly competitive market.”

The Air Force too has taken an interest, with officials noting that hyperspectral “can provide joint force planners with current information on sub-surface, surface, and air conditions” and thus become more valuable on the battlefield.

Battlefield uses

As hyperspectral sensors get smaller, cheaper and easier to use, Terrie said he can envision a number of potential battlefield applications. For situational awareness, this type of sensing could be used to provide commanders a more granular view of the battlefield.

“It could be different roof types, wood or metal, or it could be different road types, concrete or asphalt. This data allows me to separate those things at a fine level of detail. Not only is it a wood roof, it is a southern pine roof. Those are things you can’t do with other kinds of sensors,” he said.

Perhaps a soldier at a distance cannot discern the shape of a vehicle: Is it a minivan or a combat vehicle? What the human eye can’t make out, even with magnification, hyperspectral sensing could identify based on the material composition of the object.

“You may only see a normal image, but each pixel in that image is loaded with data that we can explore and identify,” Terrie said. “Even if that item is mixed in with other things, as long as I can pick out that signature, I know that item is present.”
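The kind of per-pixel signature matching Terrie describes can be illustrated with a minimal sketch of one common hyperspectral detection technique, the spectral angle mapper, which scores how closely a pixel's spectrum aligns with a reference material signature. This is a generic textbook approach, not Harris's proprietary method, and the ten-band values and detection threshold below are illustrative assumptions only.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference signature.

    Smaller angles mean the pixel's spectral shape is closer to the
    reference material, independent of overall brightness."""
    cos_sim = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference)
    )
    return np.arccos(np.clip(cos_sim, -1.0, 1.0))

# Hypothetical 10-band reflectance signatures (values are illustrative).
paint = np.array([0.12, 0.15, 0.22, 0.30, 0.41, 0.45, 0.43, 0.38, 0.33, 0.30])
foliage = np.array([0.05, 0.08, 0.10, 0.35, 0.48, 0.50, 0.49, 0.47, 0.45, 0.44])

# A pixel whose spectrum is mostly foliage with a small paint contribution,
# e.g. a partly concealed vehicle under tree cover.
mixed_pixel = 0.8 * foliage + 0.2 * paint

# Flag the pixel if its angle to the target signature falls under a
# threshold (in radians) that would be tuned per sensor and scene.
angle = spectral_angle(mixed_pixel, paint)
detected = angle < 0.2
```

In practice, subpixel detection of small or mixed targets more often uses matched filters or unmixing models than this simple angle test, and, as Terrie notes below, any single detector produces false alarms that must be corroborated with other sources.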

Recent advances have shrunk detectors considerably: Where past iterations weighed hundreds of pounds, today’s sensors may weigh just a couple of pounds. This opens up possibilities for military implementations.

In the past, “these systems could only go in very big aircraft,” Terrie said. “With the onset of miniaturization, we see already these flying in UAVs on the commercial side. This affords the possibility of putting more sensors out there, with more coverage and better access to data.”

While the reduced form factor could be a boon to military users, researchers still are fine-tuning the capability. They’re turning to artificial intelligence, machine learning and other sophisticated processing techniques to ensure that battlefield intel is as accurate as possible. In the short term this likely will mean running hyperspectral sensors in tandem with other data nodes.

“The hard part is that even with the progress in the algorithmic approaches, there are still false alarms,” Terrie said. “We can’t yet say with 100 percent accuracy that we are detecting what we think we are seeing. Things can look like each other, so you have to corroborate that information with other factors that increase your confidence. There is still some interpretation that has to occur.”
