The cars parked on the side of the road are clear as day, and then they flicker, and then they are gone. Just a few cars at a time, a minivan here, a sedan there, always in close proximity. The flicker moves and then it vanishes, following the narrowest focus of the camera’s moving lens. It is, at once, an art project, an activist message, and a technology demonstration. Augmented reality is what we make of it.
Created by software developer Chris Harris, the project is an object lesson in what distorted reality can look like under the right display. Dubbed “Biophillic Vision - Experiment 1,” the project has a stated goal of showcasing what it would be like to experience a city without cars. Harris’ reasons are both aesthetic and environmental, inspired in part by the threat of climate change and in part by a personal distaste for cars in cities.
The ease and low cost of transforming a display of the world has implications for military projects. It’s part of the impetus (and the tension) behind the Pentagon’s $480 million worth of interest in Microsoft’s HoloLens, and behind some workers’ objections to military use of HoloLens. Augmented reality tools are commonly seen as mapping new features onto existing terrain, but what if they could also do the opposite? What if augmented reality were used to reveal features hidden by existing terrain?
The areas where the augmented processing struggles the most are vehicles that don’t conform to existing expectations. Trained on a similar data set, a military heads-up display could flag technical vehicles as distinct from passenger vehicles, or identify up-armored cars and unusual load configurations. Applied to a different formation, and with, say, a database of trees, the program could provide an immediate visual way to isolate unusual objects in a forest from the natural foliage, and display each unusual object as it would stand in a clearing.
What’s fascinating is the way it creates real-time image distortions, with the code processing cars and backgrounds and generating new images to replace them. At the present level of the technology, the effect is a shimmer, a flicker in and out of being. It makes the viewer feel like the reality they are observing through the camera is an overtaxed video game, unable to hold its whole rendered world together.
To make the effect work, Harris adapted two pieces of open-source code: the first handles vehicle detection, and the second does image completion. With roughly a day of work putting it together, Harris had an AI that produces images at roughly two frames per second, and in his write-up of the project, he speculates that more time could make it run in real time on high-end mobile devices.
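The detect-then-complete pipeline can be sketched in a few lines. This is not Harris’ actual code: it assumes bounding boxes arrive from some external detector, and it stands in for a learned image-completion model with a crude fill (the median of a border ring around each box), which is enough to show how erased vehicles get replaced with an estimate of the background behind them.

```python
import numpy as np

def erase_regions(frame, boxes, pad=4):
    """Remove detected objects from a frame.

    frame: H x W x 3 uint8 image.
    boxes: list of (x0, y0, x1, y1) boxes from a detector (assumed given).
    Each box interior is filled with the per-channel median of a thin
    ring of surrounding pixels -- a stand-in for real image completion.
    """
    out = frame.copy()
    h, w = frame.shape[:2]
    for (x0, y0, x1, y1) in boxes:
        # Ring of context pixels just outside the box, clipped to the frame.
        bx0, by0 = max(x0 - pad, 0), max(y0 - pad, 0)
        bx1, by1 = min(x1 + pad, w), min(y1 + pad, h)
        patch = out[by0:by1, bx0:bx1]
        # Mask that is True on the ring, False on the box interior.
        mask = np.ones(patch.shape[:2], dtype=bool)
        mask[(y0 - by0):(y1 - by0), (x0 - bx0):(x1 - bx0)] = False
        # Fill the interior with the median color of the ring.
        fill = np.median(patch[mask], axis=0)
        out[y0:y1, x0:x1] = fill
    return out

# Usage: a gray "road" with a bright "car" that gets erased.
frame = np.full((40, 40, 3), 100, dtype=np.uint8)
frame[10:20, 10:20] = 255
clean = erase_regions(frame, [(10, 10, 20, 20)])
```

A real implementation would replace the median fill with a generative inpainting network, which is why the result flickers: each frame’s completion is recomputed independently.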
Augmented reality is growing in the military and government market. In addition to the Microsoft contract, BAE Systems is working on an augmented-reality system onboard a Royal Navy warship as part of a £20 million ($27 million) investment in advanced combat systems technology. Augmented reality and virtual reality are expected to be a multi-billion dollar industry for the government by 2021.
Since the tools are open source, Harris’ video provides a proof of concept for anyone interested in an augmented reality tool that can isolate objects in the lived environment and illustrate ways around them. More hauntingly, tools that erase objects from digital displays could be put to malicious purposes, causing machines that rely on vehicle detection to see instead a clear path.
“Biophillic Vision - Experiment 1” feels like cyberpunk, a technological mutation on an existing frame of reference. In its glitchy vision is a world of possibility, one that military planners and designers would do well to understand, if not iterate upon.
Kelsey Atherton blogs about military technology for C4ISRNET, Fifth Domain, Defense News, and Military Times. He previously wrote for Popular Science, and also created, solicited, and edited content for a group blog on political science fiction and international security.