What if artificial intelligence and machine learning could automate and accelerate damage assessments taken from satellite imagery in the aftermath of a natural disaster?
That’s the question the Defense Innovation Unit is asking experts with its xView2 Challenge.
Satellite imagery is frequently used for damage assessments following natural disasters, from hurricanes to forest fires. For example, following Hurricane Dorian, the National Geospatial-Intelligence Agency was able to use satellite imagery to tell responders what infrastructure was damaged, which roads were still operational and which airfields were ready to receive aid.
While satellite imagery can be produced rather quickly following a natural disaster, that data still needs to be processed by human analysts to determine what’s damaged and what’s still standing. Automating that process using machine learning algorithms could be key to getting that information to responders even faster.
Enter the xView2 Challenge.
xView2 is a follow-up to xView1, a prize competition that sought computer vision algorithms capable of identifying objects of interest to first responders, such as buildings, in satellite images. The xView2 Challenge takes that concept a step further toward a practical application: in addition to automatically locating a building in a satellite image, the new algorithms must also assess from the image what damage, if any, the building has sustained.
For the xView2 Challenge, participants will use a publicly available dataset of satellite imagery that includes before-and-after images of six types of disaster, including wildfires, landslides, earthquakes and floods. The dataset includes 550,230 building annotations across 19,804 square kilometers spanning 10 countries. It was released Sept. 19.
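In concrete terms, each entry has to solve two coupled problems: find building footprints in the imagery, then grade each building’s damage by comparing the same footprint before and after the event. The sketch below is purely illustrative and is not DIU’s baseline or scoring code; the file names, the placeholder detector, the damage labels and the change threshold are all assumptions standing in for trained models and the dataset’s actual annotation scheme.

```python
# Minimal sketch (illustrative only): a two-stage pipeline that
# 1) finds building footprints in the pre-disaster image, and
# 2) grades per-building damage by comparing pre- and post-disaster crops.
# File paths, the dummy detector, and the thresholds below are hypothetical.
import numpy as np
from PIL import Image

# Hypothetical four-level damage scale used for this sketch.
DAMAGE_CLASSES = ["no-damage", "minor-damage", "major-damage", "destroyed"]

def load_pair(pre_path: str, post_path: str) -> tuple[np.ndarray, np.ndarray]:
    """Load a pre/post disaster image pair as float arrays in [0, 1]."""
    pre = np.asarray(Image.open(pre_path).convert("RGB"), dtype=np.float32) / 255.0
    post = np.asarray(Image.open(post_path).convert("RGB"), dtype=np.float32) / 255.0
    return pre, post

def localize_buildings(pre: np.ndarray) -> list[tuple[int, int, int, int]]:
    """Stage 1 (placeholder): return building bounding boxes (x0, y0, x1, y1).
    A real entry would use a trained detector or segmentation network here."""
    h, w, _ = pre.shape
    return [(0, 0, min(64, w), min(64, h))]  # dummy box for illustration

def classify_damage(pre: np.ndarray, post: np.ndarray,
                    box: tuple[int, int, int, int]) -> str:
    """Stage 2 (placeholder): score damage from the change between crops.
    A real entry would feed both crops to a trained classifier."""
    x0, y0, x1, y1 = box
    change = np.abs(post[y0:y1, x0:x1] - pre[y0:y1, x0:x1]).mean()
    # Crude thresholding on mean pixel change, purely for illustration.
    idx = min(int(change * len(DAMAGE_CLASSES) / 0.5), len(DAMAGE_CLASSES) - 1)
    return DAMAGE_CLASSES[idx]

if __name__ == "__main__":
    # Hypothetical file names for one pre/post image pair.
    pre_img, post_img = load_pair("pre_disaster.png", "post_disaster.png")
    for b in localize_buildings(pre_img):
        print(b, classify_damage(pre_img, post_img, b))
```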
“DIU’s goal in hosting this challenge is to enlist the global community of machine learning experts to tackle a critically hard problem: detecting key objects in overhead imagery in context and assessing damage in a disaster situation,” Mike Kaul, DIU’s AI portfolio director, said in a statement announcing the challenge in August.
With help from the Pentagon’s Joint Artificial Intelligence Center, researchers with the Software Engineering Institute and several other government organizations, the Defense Innovation Unit created the xView2 Challenge as a prize competition to develop algorithms capable of producing labeled damage assessments from satellite imagery.
“The JAIC is helping to fund the challenge and helped develop the ideas and the concept for the computer vision challenge. Our Humanitarian Assistance and Disaster Relief (HA/DR) team will work closely with the DIU to assess the ideas and recommendations generated from this challenge,” said Lt. Cmdr. Arlo Abrahamson, a spokesman for the JAIC. “The winning ideas solicited from the challenge will contribute to the advancement of the JAIC Humanitarian Assistance and Disaster Relief (HA/DR) national mission initiative in its work to support and enable the DoD and our agency partners with AI solutions in this important field of work.”
Submissions are due Nov. 22. A total of $150,000 in prize money is up for grabs, with prizes ranging from $1,000 to $37,950. All submissions will be eligible for follow-on acquisition opportunities.
CORRECTION: This article has been corrected with updated information about the xView2 dataset.
Nathan Strout covers space, unmanned and intelligence systems for C4ISRNET.