Jurrian Doornbos
Multispectral UAV Image calibration and alignment
2025-03-06
Calibration procedures for various multispectral sensors and UAVs, used to create the large, open pretraining dataset MSUAV (link to dataset will follow).
This repository serves as a starting point for working with the raw images from your multispectral sensor (Parrot Sequoia, DJI Mavic 3M, DJI Phantom 4 Multispectral, MicaSense Altum(-PT), MicaSense RedEdge). It is not an exhaustive library for immediate import into your own projects.
There are various sensor calibration options here, from raw sensor output to aligned imagery. Every notebook starts with the specific sensor used. Some datasets are built around an orthomosaic, which can be ignored when the goal is only to learn the sensor-specific alignment.
notebooks/mavic_3m.ipynb
notebooks/phantom_4m.ipynb
notebooks/altum.ipynb
notebooks/rededge.ipynb
notebooks/sequoia.ipynb
First, install Conda. Then create an environment called msdata like below:
conda create --name msdata jupyterlab opencv numpy pandas rioxarray pillow piexif matplotlib tifffile scikit-image
conda activate msdata
The provided code contains two important components for processing multispectral drone imagery: image alignment and radiance calculation.
Most code in the notebooks is scaffolding: loading the imagery as separate bands, handling each sensor's specific folder structure, and reading camera metadata from the EXIF tags. The core ideas, however, are the two functions described below.
The align_images function is responsible for aligning different spectral bands of drone imagery. This alignment is necessary because multispectral cameras often have slight physical offsets between sensors for different wavelengths, resulting in misaligned images.
Caching Previous Transformations: a transformation computed for an earlier image set is cached and reused as a fallback when matching fails.
Feature Detection and Matching: keypoints are detected in each band and matched against the reference band.
Transformation Calculation: a robust transformation is estimated from the matched keypoints.
Image Warping: each band is warped onto the reference band's pixel grid.
This process ensures that pixels in different spectral bands correspond to the same physical locations on the ground. Because every sensor places its spectral imagers at slightly different positions, each band needs its own transformation; alignment is performed against the green band.
The irradiance function converts raw digital numbers from drone images into calibrated reflectance values. This process, often called radiometric calibration, is crucial for obtaining scientifically meaningful measurements.
Black Level Subtraction: the sensor's dark-current offset is subtracted from the raw digital numbers.
Exposure Time Normalization: the signal is divided by the exposure time, making images taken at different shutter speeds comparable.
Gain Correction: the signal is divided by the sensor gain (ISO) applied during capture.
Irradiance Unit Scaling: the normalized signal is scaled into physical irradiance units using the sensor's calibration coefficients.
Sunshine Sensor Correction: the at-sensor value is divided by the downwelling irradiance measured by the sunshine sensor.
Final Reflectance Calculation: the result is a unitless reflectance value, typically clipped to the range 0–1.
This radiometric calibration process transforms raw sensor data into standardized reflectance values that can be directly compared across different flights, times of day, and atmospheric conditions - essential for scientific applications like agricultural monitoring, environmental assessment, and temporal change detection.
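The calibration chain can be sketched as a small NumPy function. The function and parameter names here are illustrative, not the exact API of the notebooks or of any sensor SDK:

```python
import numpy as np

def radiance_to_reflectance(dn, black_level, exposure_s, gain,
                            irradiance_scale, sunshine_irradiance):
    """Toy radiometric chain: raw digital numbers -> reflectance.

    Parameter meanings (assumed, per-sensor values differ):
      black_level         dark-current offset in DN
      exposure_s          exposure time in seconds
      gain                sensor gain (ISO scaling)
      irradiance_scale    coefficient mapping normalized DN to irradiance units
      sunshine_irradiance downwelling irradiance from the sunshine sensor
    """
    dn = dn.astype(np.float64)
    signal = np.clip(dn - black_level, 0, None)    # black level subtraction
    signal /= exposure_s                           # exposure time normalization
    signal /= gain                                 # gain correction
    radiance = signal * irradiance_scale           # irradiance unit scaling
    reflectance = radiance / sunshine_irradiance   # sunshine sensor correction
    return np.clip(reflectance, 0.0, 1.0)          # final reflectance in [0, 1]
```

The division by the sunshine-sensor reading is what makes flights under different illumination comparable: both numerator and denominator scale with the incoming light, so their ratio approximates surface reflectance.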
Licensed under MIT License, see LICENSE file.
This project is funded by the European Union, grant ID 101060643.