
Authors

  • Aryav Gogia, Department of Geography and Geoinformation Science, George Mason University, Fairfax, VA
  • Edward J. Oughton, Department of Geography and Geoinformation Science, George Mason University, Fairfax, VA
  • James Gallagher, Department of Geography and Geoinformation Science, George Mason University, Fairfax, VA

Abstract

Adaptive Multispectral Object Detection (AMOD) leverages the fusion of RGB and thermal imagery to enhance the performance of object detection models, particularly under varying environmental conditions and camera-to-object distances. This is especially applicable to overhead footage collected by cameras of different visual modalities (RGB, thermal, etc.) mounted on a drone. By dynamically adjusting the scaling and transparency between video frames, AMOD can effectively address challenges posed by lighting variations, occlusions, and thermal inconsistencies. The fusion process integrates the complementary strengths of RGB and thermal data: RGB provides high-resolution texture and color information, while thermal imagery offers robust detection in low-light and obscured environments. This research investigates the impact of varying scaling and transparency on object detection model performance within this multispectral framework. By modifying the transparency levels, the model can emphasize either RGB or thermal data based on the environmental context, enhancing feature extraction and object localization. Scaling adjustments enable the alignment of features across different resolutions, ensuring coherent fusion and reducing artifacts. This research highlights the potential of AMOD in applications requiring reliable object detection across varied operational contexts, such as military surveillance, search and rescue, and autonomous navigation.
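
Below is a minimal sketch of the scaling-and-transparency fusion the abstract describes, assuming OpenCV-style image arrays. The function name fuse_rgb_thermal, the alpha parameter, and the bilinear resize are illustrative assumptions, not the authors' implementation.

```python
# Sketch of RGB-thermal fusion via scaling and transparency (alpha blending).
# Assumes OpenCV (cv2) and NumPy; details are illustrative, not the AMOD code.
import cv2
import numpy as np

def fuse_rgb_thermal(rgb: np.ndarray, thermal: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Blend an RGB frame with a thermal frame.

    alpha controls transparency: 1.0 keeps only RGB, 0.0 keeps only thermal.
    The thermal frame is rescaled to the RGB resolution so features align
    before blending.
    """
    # Scale the (typically lower-resolution) thermal frame to match the RGB frame.
    thermal_resized = cv2.resize(thermal, (rgb.shape[1], rgb.shape[0]),
                                 interpolation=cv2.INTER_LINEAR)
    # Promote single-channel thermal data to 3 channels so shapes match.
    if thermal_resized.ndim == 2:
        thermal_resized = cv2.cvtColor(thermal_resized, cv2.COLOR_GRAY2BGR)
    # Weighted blend of the two modalities.
    return cv2.addWeighted(rgb, alpha, thermal_resized, 1.0 - alpha, 0)

# Example: emphasize thermal at night (low alpha), RGB in daylight (high alpha).
# fused = fuse_rgb_thermal(rgb_frame, thermal_frame, alpha=0.3)
```

The fused frames can then be passed to a standard object detector, with alpha and scaling adjusted per environmental context as the abstract outlines.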

Published

2024-10-13

Section

College of Science: Department of Geography and Geoinformation Science