The Defense Advanced Research Projects Agency is in the business of high technology, with hundreds of projects that span areas of focus -- cyber, biology, microsystems and many others. But many of them share a common goal: bringing high-tech tools to the tactical edge to better enable troops in combat.

On the ground, programs like VirtualEye target improved situational awareness, putting eyes where service members can't actually see – inside enemy-held buildings or territory, for example. A pair of cameras mounted on throwable robots can be tossed into the contested space, and VirtualEye uses the two camera feeds to create a 3-D image of the room in real time. The image can be manipulated to focus on a specific area; users can pick their point of view, virtually seeing through and around obstructions that could otherwise create a dangerous blind spot.
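The core stereo principle behind building a 3-D image from two cameras can be sketched in a few lines. This is a simplified, hypothetical illustration of two-camera triangulation – the actual DARPA/NVIDIA pipeline is not public – where depth follows from the pixel offset (disparity) of a feature between the left and right views:

```python
import numpy as np

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from a two-camera rig via triangulation: Z = f * B / d.

    disparity_px: pixel offset of a feature between the two camera images
    focal_px:     camera focal length, in pixels
    baseline_m:   distance between the two cameras, in meters
    """
    d = np.asarray(disparity_px, dtype=float)
    # Zero disparity means the feature is effectively at infinity.
    with np.errstate(divide="ignore"):
        return np.where(d > 0, focal_px * baseline_m / d, np.inf)

# A feature shifted 40 px between cameras 0.2 m apart, 800 px focal length:
# Z = 800 * 0.2 / 40 = 4.0 m
print(stereo_depth(40, focal_px=800, baseline_m=0.2))  # 4.0
```

Computing such a depth for every matched pixel yields a point cloud, which is what lets a viewer pick an arbitrary virtual viewpoint within the reconstructed room.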

VirtualEye's creators – DARPA officials working with tech company NVIDIA – hope to expand the program in the future as well.

"The goal is to move to multiple and mobile cameras to do larger areas," said Trung Tran, DARPA VirtualEye program manager. "Think about an urban area with multiple cameras – you could virtually combine them and virtually walk down the street."

Tran spoke at a demonstration of the program at DARPA's larger demo day at the Pentagon May 11.

VirtualEye takes the overload factor out of scenarios that can involve dozens of cameras, instead building the single, 3-D image that can be manipulated by the user.

"It's tough to look at 20 cameras simultaneously … it's much easier to virtually combine and create this 3-D image where you pick the point of view," said NVIDIA's Jan Kautz during a demo.

Tactical technologies in development aren't limited to the ground – they're also targeting sea, space, cyberspace and the air.

Target Recognition and Adaption in Contested Environments, or TRACE, is designed to reduce the cognitive load on fighter pilots who have to wade through data to find the right target, DARPA officials said. By pushing recognition to the tactical edge and employing machine-learning algorithms for complex radar imagery, the sensors do the heavy lifting.
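The idea of letting the sensor do the recognition can be illustrated with a toy classifier. This is purely a hypothetical stand-in – TRACE's actual algorithms operate on far richer synthetic-aperture radar imagery with learned models – but a nearest-centroid classifier over synthetic radar-return feature vectors shows the basic shape of onboard recognition:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_centroids(features, labels):
    """Average each class's training vectors into a single centroid."""
    classes = np.unique(labels)
    return classes, np.array([features[labels == c].mean(axis=0) for c in classes])

def classify(classes, centroids, sample):
    """Assign a new return to the nearest centroid (Euclidean distance)."""
    dists = np.linalg.norm(centroids - sample, axis=1)
    return classes[np.argmin(dists)]

# Two synthetic "target types" with distinct mean radar signatures
truck = rng.normal(loc=0.0, scale=0.3, size=(50, 8))
tank = rng.normal(loc=2.0, scale=0.3, size=(50, 8))
X = np.vstack([truck, tank])
y = np.array(["truck"] * 50 + ["tank"] * 50)

classes, centroids = train_centroids(X, y)
print(classify(classes, centroids, np.full(8, 1.9)))  # prints "tank"
```

Running such a model on the aircraft itself, rather than shipping raw sensor data to a ground cell, is what removes the round-trip delay the program targets; the pilot sees candidate targets rather than raw returns.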

"It improves the workflow of the pilot, adapting to new and emerging threats – and it pushes processing being done by ground cells out to the sensor," said John Gorman, TRACE program manager. He added that the time, resources and processing those cells require "make it impossible to go after time-sensitive targets."

The sensors can be on a Reaper or Predator unmanned aerial vehicle, or an F-35, F-22, F-18 or other aircraft, Gorman said. And while the sensors may be doing the analysis, it's humans making decisions, he noted.

TRACE frees up pilots to do just that – and to do it better.

"It's like walking in traffic while texting – these guys should be looking at the threats," he said.

The four-year program is only nine months in, but Gorman said he and his team are already looking to the future. That includes enlisting Lockheed Martin to build an open mission systems wrapper around the algorithms so they can be integrated into future systems, he said.