The military thinks more may be better when it comes to unmanned aircraft flying intelligence missions. Instead of going in with a big, slow MQ-1 Predator or MQ-9 Reaper, maybe it would make sense to utilize swarms of small, cheap, nimble drones.
This scenario looks especially promising in urban settings, where tall buildings, tight spaces and narrow sightlines can complicate drone operations. But UAV swarms raise new questions: How do you control them effectively? How do you fly dozens of machines at once in a smart, coordinated way?
DARPA is looking to address the question through its OFFensive Swarm-Enabled Tactics program, or OFFSET. The goal is to use virtual reality gaming and other techniques to develop 100 or more operationally effective swarm-piloting tactics.
Today's navigational techniques are clunky.
"We still are trying to drive these systems in the way we steer a car. We are watching sensor feeds, being called to attention when there is a problem," said Timothy Chung, DARPA program manager.
"Because of the sheer scale and complexity of these swarm systems, those conventional means will just not work. So we need to explore how to encode a tactical language, a tactical lexicon. We need new paradigms for how we interact with these systems," he said.
The question is more than merely hypothetical, as military leaders have made it clear that unmanned swarms will be a tool for future commanders.
DARPA’s Gremlins program is looking to develop a system that would allow aircraft to launch swarms of inexpensive, reusable UAVs. The Pentagon has demonstrated such a capability, though its potential uses remain uncertain.
At sea, the Office of Naval Research has shown off a capability for unmanned surface vessel swarms, and Air Force leaders have talked about purchasing 1,500 to 2,000 small UAS for ISR mission sets.
To meet all these ambitions, the military will first have to figure out how to operate the UAVs effectively. That’s where OFFSET comes in.
The human interface
Program leaders are wrangling with a number of questions as they look for a means to effectively maneuver future swarms, starting with issues surrounding the interface between the swarm and the human operator.
"Pilots" might control the swarm through a gestural interface, a touch screen, a joystick or voice commands. Most likely the solution will involve some combination of inputs.
"Maybe a heat map or dots on the screen is appropriate in some cases, whereas if I want to split the swarm into sub-swarms, maybe that is more ‘logical’ than ‘spatial,’ so it would be left hand, right hand," Chung said. "There might be different ways to present information based on the types of tactics that are required."
Given the extreme complexity of the swarm ecosystem, planners may need to develop more novel forms of interface. "The richness of the interactions we wish to have with these swarms may merit more immersive technologies," he said. "That might be virtual reality, augmented reality, all the way to virtual assistants like Siri and Alexa, where you can communicate with them and they can provide you with tactical information."
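To make the idea of a "tactical lexicon" concrete, here is a minimal sketch of how high-level operator commands might be dispatched to swarm-level actions in software. Everything in it, the command words, the `Swarm` and `Drone` classes, and the split/move behaviors, is invented for illustration; OFFSET has not published any such interface.

```python
# Hypothetical sketch of a tactical lexicon: operator commands map to
# swarm-level actions rather than per-drone joystick control.
# All names and behaviors here are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class Drone:
    drone_id: int
    position: tuple  # notional (x, y) in meters


@dataclass
class Swarm:
    drones: list = field(default_factory=list)

    def split(self, n_groups: int):
        """A 'logical' command: partition the swarm into sub-swarms."""
        groups = [Swarm() for _ in range(n_groups)]
        for i, drone in enumerate(self.drones):
            groups[i % n_groups].drones.append(drone)
        return groups

    def move_to(self, waypoint):
        """A 'spatial' command: send every drone toward a waypoint."""
        for drone in self.drones:
            drone.position = waypoint  # real code would plan a path


# The lexicon itself: a command word maps to a swarm action.
LEXICON = {
    "split": lambda swarm, arg: swarm.split(int(arg)),
    "move": lambda swarm, arg: swarm.move_to(arg),
}


def execute(command: str, swarm: Swarm, arg):
    """Dispatch an operator utterance like 'split 2' to a swarm action."""
    return LEXICON[command](swarm, arg)


swarm = Swarm([Drone(i, (0.0, 0.0)) for i in range(6)])
left, right = execute("split", swarm, 2)
print(len(left.drones), len(right.drones))  # each sub-swarm gets 3 drones
```

The point of the sketch is the level of abstraction: the operator issues one tactical command, whether by voice, touch or gesture, and the software decides how each drone carries it out.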
Virtual reality will play a role in OFFSET beyond such interface questions. As the team seeks out solutions, it will develop a VR gaming environment in which users devise tactics for swarm manipulation in a simulated setting.
Technical constraints limit how swarms can be used in the real world, "but we still want to explore the possible combinations," Chung said. "It would be difficult for us to explore every permutation of a swarm system in the physical world, just because of the time and cost. So we see an opportunity to employ a game-based environment, a virtual environment, where we can insert and explore technologies in these swarm tactics."
This approach could give military planners insight into how best to use future swarms. For example, game drones could be rigged out with hypothetical sensors, giving them the ability to see through multiple sets of walls. If the results look promising, military planners might push development of such sensors for actual deployment.
"We want to look at the things we can look at physically, the things we can presently realize. But we also look at what we can do in this virtual setting, and then toggle between those to determine where we can best invest our development resources," Chung said.
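The exhaustive exploration Chung describes might look something like the toy sweep below: score every combination of tactic, sensor and swarm size in a cheap simulation, then use the results to decide where to invest. The tactics, the notional wall-penetrating sensor and the scoring function are all invented stand-ins, not anything OFFSET has published.

```python
# Hypothetical sketch: sweeping swarm-tactic permutations in a toy virtual
# environment, cheap in simulation but impractical to fly physically.
# The tactics, sensors and scoring here are invented for illustration.
from itertools import product

TACTICS = ["perimeter_sweep", "leapfrog", "sub_swarm_split"]
SENSORS = ["camera", "wall_penetrating_radar"]  # radar is the hypothetical sensor
SWARM_SIZES = [10, 25, 50]


def simulate(tactic: str, sensor: str, size: int) -> float:
    """Stand-in for a full simulation run; returns a notional coverage score."""
    base = {"perimeter_sweep": 0.5, "leapfrog": 0.6, "sub_swarm_split": 0.7}[tactic]
    sensor_bonus = 0.2 if sensor == "wall_penetrating_radar" else 0.0
    return min(1.0, base + sensor_bonus + 0.002 * size)


# Score every permutation of tactic, sensor and swarm size.
results = {combo: simulate(*combo) for combo in product(TACTICS, SENSORS, SWARM_SIZES)}
best = max(results, key=results.get)
print(best)  # the best-scoring combination
```

If the winning combinations all depend on the hypothetical sensor, that is a signal, in Chung's terms, of where to "best invest our development resources."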
The biggest challenge here is the need for simultaneous efforts across the development ecosystem. For swarms to fly, engineers must work out the aviation side; the software must evolve; the human interface has to be refined; and heavy-duty computational and decision-making tools must be built. All of this has to happen more or less at the same time.
"I have a glass full of water on a table that I am currently building. So I can’t just build one leg of the table, or the glass will tip. We need to build all the legs concurrently, in a holistic, systems-oriented approach," Chung said.