Even as the U.S. Army attempts to integrate cutting-edge technologies into its operations, many of its platforms remain fundamentally stuck in the 20th century.

Take tanks, for example.

The way tank crews operate their machines has remained essentially unchanged for the last 40 years. At a time when the military is enamored with robotics, artificial intelligence and next-generation networks, operating a tank still relies entirely on manual inputs from highly trained operators.

“Currently, tank crews use a very manual process to detect, identify and engage targets,” explained Abrams Master Gunner Sgt. 1st Class Dustin Harris. “Tank commanders and gunners are manually slewing, trying to detect targets using their sensors. Once they come across a target they have to manually select the ammunition that they’re going to use to service that target, lase the target to get an accurate range to it, and a few other factors.”

The process has to be repeated for each target.

“That can take time,” he added. “Everything is done manually still.”

On the 21st century battlefield, it’s an anachronism.

“Army senior leaders recognize that the way the crews in the tank operate is largely analogous to how these things were done 30, 45 years ago,” said Richard Nabors, acting principal deputy for systems and modeling at the DEVCOM C5ISR Center.

“These senior leaders, many of them with extensive technical expertise, recognized that there were opportunities to improve the way that these crews operate,” he added. “So they challenged the Combat Capabilities Development Command, the Armaments Center and the C5ISR Center to look at the problem.”

On Oct. 28, the Army invited reporters to Aberdeen Proving Ground to see its solution: the Advanced Targeting and Lethality Aided System, or ATLAS.

ATLAS uses advanced sensors, machine learning algorithms and a new touchscreen display to automate the process of finding and firing on targets, allowing crews to respond to threats faster than they can today.

“The assistance that we’re providing to the soldiers will speed up those engagement times [and] allow them to execute multiple targets in the same time that they currently take to execute a single target,” said Dawne Deaver, C5ISR project lead for ATLAS.

At first glance, the ATLAS prototype the Army had set up looked like something out of a Star Wars film, albeit with treads rather than easily harpooned legs. The system was installed on a mishmash of hardware: a sleek black General Dynamics Griffin I chassis with the Army's Advanced Lethality and Accuracy System for Medium Caliber (ALAS-MC) auto-loading 50mm turret stacked on top.

And mounted on top of the turret was a small round Aided Target Recognition (AiTR) sensor — a mid-wave infrared imaging sensor to be more exact. Constantly rotating to scan the battlefield, the sensor almost had a life of its own, not unlike an R2 unit on the back of an X-Wing.

Trailing behind the tank and connected via a series of long black cables was a black M113. For this demonstration, the crew station was located inside the M113, not the tank itself. Cavernous compared to the inside of an Abrams tank, the M113 had three short seats lined up. At the forward-most seat were a touchscreen display and a video game-like controller for operating the tank, while farther back, computer monitors displayed ATLAS' internal processes.

Of course, ATLAS isn't the tank itself, or even the M113 connected to it. The chassis served as a surrogate for a future tank, a fighting vehicle or even a retrofit of a current vehicle, while the turret came from a separate program under development at the Armaments Center. The M113 isn't really meant to be involved at all, but the Army decided to locate the crew station remotely inside it out of safety concerns ahead of a live-fire demonstration expected in the coming weeks. ATLAS, Army officials reminded observers again and again, is agnostic to the chassis or turret it's installed on.

So if ATLAS isn’t the tank, what is it?

Roughly speaking, ATLAS is the mounted sensor collecting data, the machine learning algorithm processing that data, and the display/controller that the crew uses to operate the tank.

Here’s how it works:

ATLAS starts with the optical sensor mounted on top of the tank. Once activated, the sensor continuously scans the battlefield, feeding that data into a machine learning algorithm that automatically detects threats.
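In software terms, that front end amounts to a continuous scan-and-detect loop. The short Python sketch below is purely illustrative: the class and function names are assumptions invented for this article, not anything published by the ATLAS program, and it shows only the general idea of frames from a rotating sensor flowing into a detection model that queues candidate threats for the crew.

```python
# Illustrative sketch only: all names here are hypothetical, not ATLAS source code.
from dataclasses import dataclass


@dataclass
class Detection:
    track_id: int        # stable ID so the same object is not re-listed every frame
    thumbnail: bytes     # cropped image chip shown in the crew's target list
    bearing_mils: float  # where the sensor was pointing when the object was detected
    confidence: float    # model confidence that the object is a threat


def scan_loop(sensor, detector, target_list, threshold=0.6):
    """Continuously pull infrared frames and queue detections for the crew display."""
    for frame in sensor.frames():             # rotating sensor streams imagery
        for det in detector.detect(frame):    # ML model flags candidate threats
            if det.confidence >= threshold:   # arbitrary cutoff, purely for illustration
                target_list.add(det)          # appears as a thumbnail on the touchscreen
```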

Images of those threats are then sent to a new touchscreen display, the graphical user interface for the tank’s intelligent fire control system. The images are lined up vertically on the left side of the screen, with the main part of the display showing what the gun is currently aimed at. Around the edges are a number of different controls for selecting ammunition, fire type, camera settings and more.

When the operator touches one of those targets on the left, the tank automatically swivels its gun, training its sights on the dead center of the selected object. As it does so, the fire control system automatically recommends the appropriate ammunition and fire setting, such as burst or single shot, though the user can adjust these as needed.

So with the target in its sights and the weapon selected, the operator has a choice: approve the AI's recommendations and pull the trigger, adjust the settings before responding, or disengage. The entire process, from target detection to the pull of the trigger, can take just seconds. Once the target is destroyed, the operator can simply touch the screen to select the next target picked up by ATLAS.
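Stitched together, the human-on-the-loop sequence described above could be outlined roughly as follows. This is a hedged sketch built on invented interfaces (slew_to, recommend, review and so on stand in for whatever the real fire control software exposes); it is not the program's actual code, only a way to see how the pieces hand off to one another.

```python
# Hypothetical outline of the engagement sequence described above; not the real fire control API.
def engage(target, turret, fire_control, operator):
    """One human-on-the-loop engagement: slew, recommend, review and (maybe) fire."""
    turret.slew_to(target.bearing_mils)       # gun trains on the center of the selected target
    rec = fire_control.recommend(target)      # suggested ammunition and fire mode (burst/single)
    decision = operator.review(target, rec)   # approve, adjust the settings, or disengage
    if decision.action == "disengage":
        return                                # nothing fires without the operator's approval
    ammo = decision.ammo or rec.ammo          # operator overrides win over the recommendation
    mode = decision.mode or rec.mode
    fire_control.fire(ammo, mode)             # the trigger pull itself remains a human action
```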

By automating what are now manual tasks, ATLAS aims to reduce end-to-end engagement times. Army officials declined to characterize how much faster ATLAS is than a traditional tank crew. However, a demo video shown at Aberdeen Proving Ground claimed ATLAS allows “the operator to engage three targets in the time it now takes to just engage one.”

ATLAS is essentially a marriage between technologies developed by the Army’s C5ISR Center and the Armaments Center.

“We are integrating, experimenting and prototyping with technology from C5ISR center — things like advanced EO/IR targeting sensors, aided target algorithms — we’re taking those technology products and integrating them with intelligent fire control systems from the Armaments Center to explore efficiencies between those technologies that can basically buy back time for tank crews,” explained Ground Combat Systems Division Deputy Director Jami Davis.

Starting in August, the Army began bringing in small groups of tank operators to test out the new system, mostly using a new virtual reality setup that replicates the ATLAS display and controller. By gathering soldier feedback early, the Army hopes it can improve the system quickly and get it ready for fielding that much faster. Already, the Army has brought in 40 soldiers. More soldier touchpoints and a live-fire demonstration are expected to help the Army mature the product.

In some ways, ATLAS replicates in miniature the AI capabilities demonstrated at Project Convergence. Project Convergence is the Army's new campaign of learning, designed to integrate new sensor, AI and network capabilities to transform the battlefield. In September, the Army hauled many of its most advanced technologies to the desert at Yuma Proving Ground, then tried to connect them in new ways. In short, at Project Convergence the Army tried to create an environment where it could connect any sensor to the best shooter.

The Army demonstrated two types of AI at Project Convergence. First were the automatic target recognition AIs. These machine learning algorithms processed the massive amount of data picked up by the Army’s sensors to detect and identify threats on the battlefield, producing targeting data for weapon systems to utilize.

The second type of AI was used for fire control, and is represented by FIRES Synchronization to Optimize Responses in Multi-Domain Operations, or FIRESTORM. Taking in the targeting data from the other AI systems, FIRESTORM automatically looks at the weapons at the Army’s disposal and recommends the best one to respond to any given threat.
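Stripped to its essentials, that kind of recommendation is a ranking problem: score each available weapon against a detected threat and surface the best option. The toy example below illustrates only that idea; the scoring terms and attribute names are assumptions made for this article, not FIRESTORM's actual logic.

```python
# Toy illustration of weapon-target pairing; the criteria are invented, not FIRESTORM's.
def recommend_shooters(threat, weapons):
    """Rank available weapons by a simple suitability score for the given threat."""
    def score(weapon):
        reachable = weapon.max_range_km >= threat.range_km
        return (
            (2.0 if reachable else 0.0)               # the weapon must be able to reach the target
            + weapon.effect_vs.get(threat.kind, 0.0)  # effectiveness against this class of threat
            - 0.1 * weapon.time_to_fire_s             # prefer the fastest adequate response
        )
    ready = [w for w in weapons if w.rounds_remaining > 0]
    return sorted(ready, key=score, reverse=True)     # best recommendation first
```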

While ATLAS does not yet have the networking components that tied Project Convergence together across domains, it essentially performs those two tasks: Its AI automatically detects threats and recommends the best response to the human operators. Although the full ATLAS system wasn't hauled out to Project Convergence this year, the Army was able to bring its virtual prototyping setup to Yuma Proving Ground, and there is hope that ATLAS itself could be involved next year.

To be clear: ATLAS is not meant to replace tank crews. It’s meant to make their jobs easier, and in the process, much faster. Even if ATLAS is widely adopted, crews will still need to be trained for manual operations in case the system breaks down. And they’ll still need to rely on their training to verify the algorithm’s recommendations.

“We can assist the soldier and reduce the number of manual tasks that they have to do while still retaining the soldiers' ability to always override the system, to always make the final decision of whether or not the target is a threat, whether or not the firing solution is correct, and that they can make that decision to pull the trigger and engage targets,” explained Deaver.

Nathan Strout covers space, unmanned and intelligence systems for C4ISRNET.
