- Researchers from Barcelona and California, led by the Institute of Economic Analysis (IAE-CSIC) and the UAB, with the participation of the Computer Vision Center (CVC), have applied neural networks to detect buildings destroyed by artillery in satellite images.
- This automated method would make it possible to monitor the destruction caused by an armed conflict almost in real time, with the aim of improving humanitarian response.
This method, developed in a project co-led by the IAE-CSIC and the UAB with the participation of CVC researcher Dr. Joan Serrat, is based on neural networks trained to detect in satellite images the characteristic traces of heavy-weapons attacks (artillery and bombing), such as the debris of collapsed buildings or the presence of bomb craters.
In the study, whose results are published in the journal Proceedings of the National Academy of Sciences (PNAS), the scientists applied this method to monitor the destruction of six of Syria’s main cities (Aleppo, Daraa, Deir-Ez-Zor, Hama, Homs, and Raqqa), plagued by armed conflict for more than ten years. The results show that the method monitors destruction effectively. “Our approach can be applied to any populated area as long as repeated high-resolution satellite imagery is available,” explained the authors.
Including the time factor
“An essential element of the development is that the neural network superimposes and compares successive images of the same place, contrasting them on a timeline that always includes a first image taken before the war. Another novelty is the incorporation of spatial and temporal information, that is, information that gives context to the observation of destruction. In addition, the tool incorporates a novel method of image labeling: the system can make reasonable assumptions using the contextual information and train the algorithm with the destruction information around a building,” said the IAE-CSIC researcher Dr. Hannes Mueller, lead author of the article.
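The core idea of comparing each location against a pre-war baseline can be illustrated with a minimal sketch. This is not the paper's method: the actual system uses a trained convolutional network, whereas the toy function below only flags image tiles whose pixels changed substantially between two co-registered snapshots; the tile size and threshold are made-up values.

```python
import numpy as np

def flag_changed_tiles(pre_war, current, tile=8, threshold=0.25):
    """Flag tiles that changed substantially between two co-registered
    grayscale snapshots of the same area.

    Crude stand-in for the trained network described in the article:
    it only illustrates comparing each location against a pre-war
    baseline. `tile` and `threshold` are hypothetical parameters.
    """
    h, w = pre_war.shape
    flags = []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            a = pre_war[y:y + tile, x:x + tile].astype(float)
            b = current[y:y + tile, x:x + tile].astype(float)
            if np.mean(np.abs(a - b)) > threshold:
                flags.append((y, x))
    return flags

# Usage: a 16x16 scene where only the top-left 8x8 block changes.
pre = np.zeros((16, 16))
post = pre.copy()
post[:8, :8] = 1.0  # simulate the bright debris field of a collapsed building
print(flag_changed_tiles(pre, post))  # → [(0, 0)]
```

A learned network replaces the fixed threshold with features (debris texture, crater shapes) and, as the quote notes, also exploits spatial and temporal context around each building rather than judging tiles in isolation.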
Automated methods must be able to detect destruction in a context where the vast majority of images show no destruction at all. However, such methods often classify intact buildings as demolished, resulting in a high false-positive rate (FPR).
Even in Aleppo, a heavily war-torn city, only 2.8% of all images of populated areas contain a building that was confirmed as destroyed in the September 2016 manual classification by the United Nations Operational Satellite Applications Program (UNOSAT).
Low precision is “a very serious problem. Even in cities heavily hit by conflict, only 1% of buildings are destroyed. Hence, their detection is like looking for a needle in a haystack. If we have false positives in the images, the margin of error shoots up quickly. In this case, 20% precision, for example, means that if an algorithm says something is destroyed, only 20% of what it says is actually destroyed,” continued the IAE-CSIC scientist.
The study demonstrates that the trained algorithm is able to identify damage in areas of Aleppo city that are not part of the UNOSAT analysis. It also provides evidence that this method can identify shelling in all six cities.
The results of this work are promising: they open the door to detecting and even monitoring destruction from armed conflicts in near real time.
“Our method is particularly well suited to take advantage of the increasing availability of high-resolution imagery. We have estimated that manual labeling of our entire dataset by humans would cost approximately $200,000, and additional image repetitions would increase this cost almost proportionally. With an automated method such as ours, the benefits are numerous: more frequent imaging helps improve accuracy, and the additional cost is small,” concluded Dr. Joan Serrat, CVC and UAB researcher.
Reference: Mueller, H., Groeger, A., Hersh, J., Matranga, A., & Serrat, J. (2021). Monitoring war destruction from space using machine learning. PNAS, 118(23), e2025400118. https://doi.org/10.1073/pnas.2025400118