Unpatched software can pose a serious security threat for the average user, let alone the military. So imagine that when an update is issued, it requires burning a CD, mailing it out and manually applying it to each system.

This sounds like a less-than-desirable scenario from the 1990s, but it's the reality of how the Army still handles some of its software patching for units operating away from post, camp and station. That's changing, however, according to Army officials, who say adapting previously developed technologies could help deliver urgently needed software updates more quickly.

Across the Army, units increasingly are able to forgo manual application in favor of automated processes that update software. Gary Martin, the Army's program executive officer for command, control, communications-tactical, said his office has been working with Army Communications-Electronics Command to automate the process for a number of systems; he added that all of the patches are now on a shared portal that units can easily access instead of receiving disks in the mail.

“In some cases, when they’re deployed tactically, the networks are not as robust as post, camp and station, so now we’re using things like the Global Broadcast System [GBS] … that can bring volumes of data to the tactical space to enable these patches to go anywhere in the world, no matter where they’re deployed,” Martin said.

“We’ve made a lot of progress in that area, and that will significantly unburden the small signal staff that’s in a brigade today that would have to do this routinely, because with all the software we have, we’re constantly patching systems.”

But the idea is to make the process even more seamless, including through the Warfighter Information Network-Tactical, he said.

“We’re deploying this year a rapid provisioning system, which is software that will enable the system to much more quickly update all of the nodes inside a brigade. Initially it will be through a wired connection in a motor pool, but on the heels of this will be the ability to do it over the air,” Martin said.

“We are now extending that same solution to support patching of our mission command service. Because when you’re out in the field, the command post server and all the mission command apps are actually connected to WIN-T. So being able to do things over the air and push them using a common provisioning system for everything in the tactical space makes that much easier.”

The Army in November ran a test to see if it could deliver a rapid, simulated mass software update through the Global Broadcast System and WIN-T. While GBS isn’t a new program — it’s been in the portfolio since 2005 — this was the first time it was used for a mass software update. Traditionally it’s been used for large-volume data and intelligence such as imagery, biometric data and live-streaming video from unmanned aerial vehicles, according to an Army release.

“We need to be innovative, and look to other services to see what they are doing, and then come up with new ways to use systems that we already have in our inventory to improve capability and network security,” said Lt. Col. Jenny Stacy, product manager for satellite communications at PEO C3T.

“GBS is already an Army program of record, so there is minimal cost and maximum benefit to leveraging the system to not only increase our cyber posture, but to use the process as an alternative to using CDs or DVDs to disseminate software updates to deployed units.”

According to the Army, the GBS software patch pilot test was conducted with support from the Pennsylvania Army National Guard’s 2nd Brigade Combat Team, 28th Infantry Division, at the unit’s armory in Washington, Pennsylvania, in early November. During the test, the team successfully verified the Army’s ability to use GBS to electronically distribute security module files by using the system to send Windows update modules to an at-the-halt WIN-T Increment 1 virtual server stack.

Feedback from the November pilot will be used in a follow-on test set to take place in a lab environment, which will push larger volumes of data to probe the limits of GBS. If all goes well, additional testing will likely occur before the system is rolled out on a wider basis.