WASHINGTON — Future versions of U.S. Army reconnaissance helicopters will need trained aviators to operate them well into the next decade despite advances in artificial intelligence, according to a study conducted by Mitre Corp. for service leaders.
Full-fledged autonomy would fail to “faithfully” fulfill more than three-quarters of studied tasks associated with the Army’s in-development Future Attack Reconnaissance Aircraft, or FARA, by 2030, according to the technical analysis, details of which were recently shared with C4ISRNET.
The odds aren’t much better in 2040, either. The study identified at least 10 “high-risk” and 18 “medium-risk” challenges hampering no-pilot deployment, suggesting human input — whether aboard the advanced rotorcraft itself or beamed in from afar — will continue to be relied upon for complex, high-stakes military endeavors.
Maj. Gen. Walter Rugen, the director of the Army’s Future Vertical Lift Cross-Functional Team, said the findings will help determine how development money is spent.
“They’re very informative to a policy guy like me, that has to decide where our investments go,” Rugen said at a February event hosted by Mitre, which manages federally funded research and development centers.
The team Rugen leads is tasked with helping overhaul the Army’s aging airborne fleet, among other heavy lifts. The portfolio includes FARA, the Future Long-Range Assault Aircraft, or FLRAA, and future tactical unmanned aircraft systems and air-launched effects.
The Army in December selected Textron’s Bell unit to build FLRAA, a $1.3 billion deal that marked the service’s largest helicopter procurement in 40 years. The choice has since been protested by Lockheed Martin’s Sikorsky. A ruling from the Government Accountability Office is expected no later than April 7.
A contractor has not yet been selected to formally build FARA, which has earned the “knife fighter” moniker and is planned to succeed the Kiowa scout helicopter, retired nearly a decade ago. AH-64 Apache attack helicopters paired with Shadow UAS are filling the gap now.
The deep-dive conducted by Mitre, which tapped into development documents, academic publications and Army metrics and relied on interviews with air cavalry, “really helps us define” what’s possible in the near- and mid-terms, Rugen said.
“What I’ve kind of seen is, in many respects, the soldier still is our best sensor,” he added. “The soldier at the tactical edge is going to be quicker through the mid-term, through that 2030 time, than the computer.”
While uncrewed drones are well-equipped for what Rugen called “dull, dirty or dangerous” work — circling and forever staring, or probing chemically contaminated spaces — something like FARA is meant for more sophisticated tasks, applications that demand finesse, expertise and in-the-moment judgment.
“As we look at our drones, we’re talking about an extension of our sensor. And we’re talking about, in this report, really the hardest thing we do on the battlefield, which is fight for information,” Rugen said. “Reconnaissance is our toughest thing that we’re doing. And it’s hard to outsource that, certainly in the mid-term, to some autonomous agent.”
Among the factors that bar an empty FARA cockpit from reality are immature perception, decision-making and intent-determination capabilities, according to the assessment.
The trio are incredibly important to get right, and get right every time, according to John Wurts, a senior autonomous systems engineer at Mitre.
“The question becomes the importance of a reconnaissance mission,” he said at the same event where Rugen spoke. “We talked about information, and what’s most important to a reconnaissance mission is coming against the commander’s intent: understanding what are your reconnaissance objectives, what merits the threshold of reporting, what do you need to report about either allied troops, enemy troops, terrain information, and at what points in the mission?”
Training reliable AI requires massive amounts of time, data and exposure. It’s difficult enough on the civilian side: this is a stop sign, this is a bus, this is the quickest route home. Things only get more complex in a military setting, where bullets are whizzing by, people are dying and the choices presented are not binary.
“A human can express their own tactical curiosity, understand second- and third-order effects, understand how to adapt to an adversary action,” said Wurts, who previously worked in the auto industry. “When we ask for a no-pilot configuration to operate the same set, that needs to all reside in the autonomy.”
Striking a balance may be the key.
Autonomy in the cockpit
As the Air Force increasingly hypes manned-unmanned teaming and seeks 1,000 so-called collaborative combat aircraft to swell its ranks, and the Navy envisions a future fleet teeming with uncrewed vessels, so too is the Army looking at ways of augmenting its troops with computer-powered might.
“If we’re going to posit that we do want inhabited cockpits, but we want more autonomy in those cockpits, I think that’s where we’re seeing some tremendous technology,” Rugen said. “Again, when we talk about some of the limits, we really see the machine not having the curiosity that humans do — what makes us human.”
The Defense Department considers AI a modernization priority and has invested in it, though the exact sum is unclear. AI is often a slice of a larger program, and classified activities can muddy the disclosure waters.
More than 685 AI projects, including some associated with major weapons systems, were underway at the Pentagon as of early 2021, the most up-to-date tally, the Government Accountability Office said.
The Army, the largest military service, is leading the pack; at least 232 efforts can be traced back to it, according to the federal watchdog. The Marine Corps, by comparison, has at least 33 underway.
AI is expected to aid target recognition aboard the Army’s Optionally Manned Fighting Vehicle, or OMFV, help sort and send information beamed to its Tactical Intelligence Targeting Access Node, or TITAN, and underpin the navigation of robotic combat vehicles, or RCVs, designed for scouting and escorting.
The technology is also being used to streamline logistics and offload monotonous, time-consuming or finicky tasks. The Army in September selected BigBear.ai for a $14.8 million contract to roll out the service’s Global Force Information Management system, designed to give service leaders an automated and holistic view of manpower, equipment, training and readiness. In October, the service picked Palantir for a separate $85.1 million predictive modeling software contract to get ahead of maintenance needs.
Army Chief of Staff Gen. James McConville in late February told reporters conflicts would be waged increasingly by a combination of man and machine. For FARA, that is the more likely outcome: a crew assisted by digital prowess, boosting performance and reducing the chance of sensory overload.
“When I look at manned-unmanned teaming, that’s going to even become more prevalent,” McConville said at a Defense Writers Group event. “It’s going to be unmanned-manned teaming on the ground, in the air and really a combination of both, and it’s going to be ubiquitous throughout the battlefield.”
The Army’s fiscal 2024 budget request, totaling $185.5 billion, sets aside $283 million for AI.
The funding, budget documents state, would cover research and development “for enhanced autonomy experimentation” as well as AI-enabled activities tied to OMFV, TITAN, RCVs and info-processing.
Defense News reporter Jen Judson contributed to this article.
Colin Demarest is a reporter at C4ISRNET, where he covers military networks, cyber and IT. Colin previously covered the Department of Energy and its National Nuclear Security Administration — namely Cold War cleanup and nuclear weapons development — for a daily newspaper in South Carolina. Colin is also an award-winning photographer.