Disaster Detection on the Fly: Optimized Transformers for UAVs
Jankovic, Branislava ; Jangirova, Sabina ; Ullah, Waseem ; Khan, Latif U. ; Guizani, Mohsen
Department
Machine Learning
Type
Journal article
Date
2025
Language
English
Abstract
Disaster management and recovery are among the most crucial tasks in today's world. However, in many countries, disaster management still relies on human intervention, which can present a significant challenge, particularly in remote or inaccessible regions where timely response is required. To mitigate these problems, advances in photogrammetry and remote sensing, such as unmanned aerial vehicles (UAVs) equipped with embedded platforms and optical sensors, need to be employed. The proposed approach enables onboard aerial image processing and avoids issues of network reliability, data security, and response time. However, the limitations imposed by the restricted hardware resources of UAVs must be addressed. Many existing real-time disaster detection solutions rely on lightweight convolutional neural networks (CNNs) specifically tailored to classify a limited set of disaster scenarios. Such frameworks often struggle in real-world situations, where the diversity of disaster cases and the limited capacity of low-complexity models hinder accurate differentiation. This work presents a UAV-powered edge computing framework for disaster detection, built on our proposed transformer-based deep learning model optimized for real-time aerial image classification. Optimization is performed using post-training quantization techniques. Moreover, we employ Explainable AI (XAI) techniques to enhance interpretability and visually highlight the regions the model focuses on when making predictions. To address the limited number of disaster cases in existing benchmark datasets and to support real-world adoption of our model, we create a novel dataset, DisasterEye, containing diverse disaster scenes captured by UAVs and ground-level cameras. Our experimental results demonstrate the efficacy of the proposed solution on both traditional and resource-limited devices: inference time and memory usage are reduced without compromising the model's accuracy on all benchmark datasets.
Finally, the effectiveness of the presented system shows that it can serve as a powerful solution for many real-time remote sensing applications on resource-constrained UAV platforms. The code and DisasterEye dataset are available at: https://github.com/Branislava98/TensorRT.
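The post-training quantization step mentioned in the abstract can be illustrated with a minimal sketch. The paper's actual pipeline targets TensorRT on UAV hardware; the example below is only a toy stand-in using PyTorch's dynamic quantization API, with a small classification head in place of the proposed transformer (the layer sizes and the four-class output are assumptions for illustration, not the paper's architecture):

```python
import torch
import torch.nn as nn

# Toy stand-in for a classification head (NOT the paper's transformer).
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 4),  # hypothetical 4 disaster classes
)
model.eval()

# Post-training dynamic quantization: Linear-layer weights are converted
# to int8 after training, with no retraining or calibration data needed.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Inference with the quantized model proceeds exactly as before.
x = torch.randn(1, 128)
with torch.no_grad():
    out = quantized(x)
```

The same idea (reduced-precision weights, unchanged inference interface) underlies deployment-oriented toolchains such as TensorRT, which additionally fuses layers and compiles kernels for the target GPU.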
Citation
B. Jankovic, S. Jangirova, W. Ullah, L. U. Khan and M. Guizani, "Disaster Detection on the Fly: Optimized Transformers for UAVs," in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, doi: 10.1109/JSTARS.2025.3596681
Source
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Keywords
Remote Sensing, Unmanned Aerial Vehicles, Edge Inference, Image Classification, Optimization, Resource-Constrained Devices, Real-Time
Publisher
IEEE
