Multi-modal mining of crowd-sourced data: Efficient provision of humanitarian aid to remote regions affected by natural disasters

Sadegh Khanmohammadi, Emad Golafshani, Yu Bai, Heng Li, Milad Bazli, Mehrdad Arashpour

Research output: Contribution to journal › Article › Research › peer-review


Data mining applications have the potential to address current deficiencies in the provision of humanitarian aid after natural disasters. Simultaneous analysis of text and images in crowd-sourced data can improve the quality of humanitarian aid information. Specifically, we select Bidirectional Encoder Representations from Transformers (BERT) and its lightweight variant ALBERT as pre-trained deep networks for the text modality, and ConvNeXt, RegNet, and Faster R-CNN for the image modality. The developed framework demonstrates its application to humanitarian aid classification in three key respects. First, it shows the effective performance of ConvNeXt and BERT in classifying humanitarian aid information. Second, it investigates the efficiency of generative adversarial networks (GANs) in generating synthetic images for imbalanced input datasets; this augmentation improves the accuracy, precision, recall, and F1-score of the framework on unseen test data. Finally, the study highlights the potential of SHapley Additive exPlanations (SHAP) for interpreting the behaviour of the developed framework, supporting the timely classification of humanitarian aid information from crowd-sourced data after natural disasters.
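The abstract reports accuracy, precision, recall, and F1-score on unseen test data, metrics that are particularly informative when the input dataset is imbalanced. As a minimal illustration (not the authors' code), the sketch below computes these four metrics for a hypothetical binary humanitarian-aid classification task in pure Python; the labels and predictions are invented for demonstration only.

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Compute accuracy, precision, recall, and F1 for a binary task."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = len(y_true) - tp - fp - fn
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return accuracy, precision, recall, f1

# Hypothetical example: 1 = "reports a humanitarian-aid need", 0 = "irrelevant"
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 0, 0, 1, 0]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
```

Precision and recall on the minority (aid-relevant) class are the quantities that GAN-based augmentation of the imbalanced image set is intended to raise, since overall accuracy can look high even when minority-class posts are missed.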

Original language: English
Article number: 103972
Number of pages: 14
Journal: International Journal of Disaster Risk Reduction
Publication status: Published - 1 Oct 2023


Keywords:

  • Artificial intelligence (AI)
  • Data mining
  • Deep neural networks (DNN)
  • Generative adversarial networks (GAN)
  • Humanitarian aid
  • SHapley additive exPlanations (SHAP)
