A house burned down feels like an isolated tragedy. The victims, at least those who survive, know it happened, and the arsonist knows it happened, but in remote places far from recourse the house burns, and the world moves on unaware.
We live in an age of unblinking mechanical eyes mounted in space and pointed toward the planet at all times. The charred remnants of the fire are proof enough and, with the right training, humans reading the footage can find the atrocities the satellites saw.
In a guide shared today at online open-source investigation house Bellingcat, researcher Benjamin Strick details how people can use satellite footage, and tools to process it, to find the evidence fire leaves behind. It’s an invaluable guide for other open-source researchers, and it also suggests a path for any company interested in developing tools that use machine learning to parse satellite imagery for an intelligence product.
Strick pairs Google satellite images of a suspected fire site with Sentinel Hub, which includes infrared imaging. He demonstrates the technique first on a wildfire site in California, then repeats it for burned villages in Nigeria and Myanmar.
While the infrared footage doesn’t offer the same granular detail, it’s a great tool for seeing widespread devastation in the vegetation around a burn. That makes it useful for checking whether damage is actually fire damage, or just an artifact in the photograph.
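The infrared check Strick leans on works because healthy vegetation reflects strongly in the near-infrared while freshly burned ground reflects strongly in the shortwave infrared, a contrast captured by the normalized burn ratio (NBR) commonly used for burn mapping with Sentinel-2 bands. As a minimal sketch of that idea (the toy reflectance values and the threshold below are invented for illustration, not taken from Strick’s guide):

```python
import numpy as np

def normalized_burn_ratio(nir, swir):
    """NBR = (NIR - SWIR) / (NIR + SWIR); burns push values sharply negative."""
    nir = nir.astype(float)
    swir = swir.astype(float)
    return (nir - swir) / (nir + swir + 1e-9)  # epsilon avoids divide-by-zero

# Toy 2x2 scene: top row is healthy vegetation, bottom row is burned ground.
nir  = np.array([[0.45, 0.45], [0.08, 0.08]])
swir = np.array([[0.10, 0.10], [0.30, 0.30]])

nbr = normalized_burn_ratio(nir, swir)
burn_mask = nbr < -0.1  # hypothetical cutoff for flagging suspected burn scars
```

Running the same computation on before-and-after imagery of one location makes the burn scar stand out even more clearly than a single snapshot.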
As a further check that the fire spotted in footage is in fact evidence of violence, Strick recommends matching the identified location to any news reports from the area. A company looking to automate this process into an intelligence product, rather than a set of tools for an open-source researcher, might want to build on that step.
Tasks like this are the proverbial low-hanging fruit for building an AI that can assist the National Geospatial-Intelligence Agency, which increasingly serves as a clearinghouse for this kind of imagery. If a computer can be trained on evidence of known village burnings to identify those same patterns in new footage as it’s received, then such tragedies could be spotted, recorded and even responded to almost in real time. In a best-case scenario, they may even be predicted.
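In its simplest form, training on known burnings could mean learning a decision boundary over an infrared-derived burn index (such as the normalized burn ratio) from hand-labeled examples, then applying it to incoming imagery. The sketch below uses invented index values and a deliberately crude midpoint rule, purely to illustrate the shape of the task:

```python
def fit_threshold(burned_nbr, healthy_nbr):
    """Learn a cutoff from labeled pixels: midpoint between the class means."""
    mean_burn = sum(burned_nbr) / len(burned_nbr)
    mean_healthy = sum(healthy_nbr) / len(healthy_nbr)
    return (mean_burn + mean_healthy) / 2

# Hand-labeled burn-index values (invented for illustration).
threshold = fit_threshold(
    burned_nbr=[-0.41, -0.55, -0.33],
    healthy_nbr=[0.62, 0.55, 0.48],
)

def is_burn(nbr_value):
    """Classify a pixel from new footage against the learned cutoff."""
    return nbr_value < threshold
```

A production system would use a far richer model over many spectral bands, but the workflow is the same: label known burnings, fit, classify new footage as it arrives.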
An algorithm that automatically matches locations mentioned in news stories with satellite footage from both before and after the date in the story, and cross-checks the result against infrared footage, could analyze the globe at scale, putting tools in the hands of humans for a further round of quality checking before reporting on the fires uncovered from the heavens.
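That matching step could be sketched as a simple pipeline. Every name here (`NewsReport`, `flag_candidates`, the `infrared_check` callback, the 14-day window, the placeholder locations) is hypothetical, standing in for a real geocoder and a real Sentinel Hub query:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class NewsReport:
    location: str   # geocodable place name pulled from the story
    published: date

@dataclass
class BurnCandidate:
    location: str
    before: date    # imagery date to pull from before the story
    after: date     # imagery date to pull from after the story
    infrared_confirmed: bool

def flag_candidates(reports, infrared_check, window_days=14):
    """Pair each report with bracketing imagery dates, cross-check infrared,
    and pass only confirmed hits to human analysts for quality checking."""
    candidates = []
    for r in reports:
        before = r.published - timedelta(days=window_days)
        after = r.published + timedelta(days=window_days)
        candidates.append(BurnCandidate(
            location=r.location,
            before=before,
            after=after,
            infrared_confirmed=infrared_check(r.location, before, after),
        ))
    return [c for c in candidates if c.infrared_confirmed]

# Usage with a stand-in for a real infrared query:
reports = [NewsReport("Village A, Nigeria", date(2019, 2, 1)),
           NewsReport("Town B, California", date(2018, 11, 10))]

def stub_infrared_check(location, before, after):
    return location == "Village A, Nigeria"  # pretend only one hit confirms

hits = flag_candidates(reports, stub_infrared_check)
```

The design keeps the infrared check as a pluggable callback, so the same pipeline could run against any imagery provider while humans stay in the loop on the final call.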
Kelsey Atherton blogs about military technology for C4ISRNET, Fifth Domain, Defense News, and Military Times. He previously wrote for Popular Science, and also created, solicited, and edited content for a group blog on political science fiction and international security.