Accurate sensor calibration is crucial for autonomous systems, yet its uncertainty quantification remains underexplored. We present the first approach to integrate uncertainty awareness into online extrinsic calibration, combining Monte Carlo Dropout with Conformal Prediction to generate prediction intervals with a guaranteed level of coverage.
We propose a framework that augments existing calibration models with uncertainty quantification and is compatible with various network architectures. Validated on the KITTI (RGB Camera-LiDAR) and DSEC (Event Camera-LiDAR) datasets, our method is effective across different visual sensor types; performance is measured with metrics adapted to evaluate the efficiency and reliability of the predicted intervals.
By pairing calibration parameters with quantifiable confidence measures, we offer insight into the reliability of calibration estimates, improving the robustness of sensor fusion in dynamic environments.
- First uncertainty-aware online calibration: integrates Monte Carlo Dropout with Conformal Prediction for guaranteed coverage intervals
- Architecture-agnostic framework: compatible with any existing online calibration network
- Multi-sensor validation: evaluated on both RGB Camera-LiDAR (KITTI) and Event Camera-LiDAR (DSEC)
- Novel evaluation metrics: adapted metrics for efficiency and reliability of uncertainty intervals
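To illustrate the core idea, here is a minimal, hedged sketch of how Monte Carlo Dropout outputs can be combined with split conformal prediction to produce intervals with finite-sample coverage. This is not the paper's implementation: the function name, the normalized nonconformity score, and the simulated inputs are illustrative assumptions; in practice the stochastic samples would come from repeated forward passes of a calibration network with dropout kept active.

```python
import numpy as np

def conformal_intervals(mc_samples_cal, y_cal, mc_samples_test, alpha=0.1):
    """Split conformal prediction on top of Monte Carlo Dropout samples.

    mc_samples_cal:  (T, n_cal) stochastic forward passes on the calibration set
    y_cal:           (n_cal,) ground-truth calibration targets
    mc_samples_test: (T, n_test) stochastic forward passes on test inputs
    Returns (lower, upper) interval bounds with ~(1 - alpha) marginal coverage.
    """
    mu_cal = mc_samples_cal.mean(axis=0)
    sd_cal = mc_samples_cal.std(axis=0) + 1e-8  # guard against zero spread
    # Normalized nonconformity scores: residuals scaled by MC-Dropout spread
    scores = np.abs(y_cal - mu_cal) / sd_cal
    n = len(y_cal)
    # Finite-sample-corrected quantile level for split conformal prediction
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, q_level, method="higher")
    mu_test = mc_samples_test.mean(axis=0)
    sd_test = mc_samples_test.std(axis=0) + 1e-8
    return mu_test - q * sd_test, mu_test + q * sd_test

# Usage with synthetic data standing in for a calibration network:
rng = np.random.default_rng(0)
T, n_cal, n_test = 50, 500, 2000
f_cal, f_test = rng.normal(size=n_cal), rng.normal(size=n_test)
mc_cal = f_cal + rng.normal(size=(T, n_cal))    # simulated dropout passes
mc_test = f_test + rng.normal(size=(T, n_test))
y_cal = f_cal + rng.normal(size=n_cal)
y_test = f_test + rng.normal(size=n_test)
lo, hi = conformal_intervals(mc_cal, y_cal, mc_test, alpha=0.1)
coverage = np.mean((y_test >= lo) & (y_test <= hi))  # empirically close to 0.9
```

Scaling the residuals by the MC-Dropout standard deviation makes the intervals adaptive: inputs where the network is more uncertain receive wider intervals, while the conformal quantile supplies the distribution-free coverage guarantee.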
@InProceedings{Cocheteux_2025_WACV,
  author    = {Cocheteux, Mathieu and Moreau, Julien and Davoine, Franck},
  title     = {Uncertainty-Aware Online Extrinsic Calibration: A Conformal Prediction Approach},
  booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
  month     = {February},
  year      = {2025},
  pages     = {6167-6176}
}