Autonomous vehicles (AVs) require precise localization for safe operation, which is difficult to achieve under adverse weather. This paper proposes a self-supervised, deep learning-based approach to robust ego-motion estimation that fuses visual and radar data through an attention mechanism. The method predicts reliability masks to mitigate sensor-specific deficiencies, achieving accurate performance across weather conditions and generalizing across domains. A game-theoretic interpretability analysis shows that the two sensing modalities have independent failure modes. Together, these results move AVs closer to safe and reliable all-weather autonomous driving.
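To illustrate the kind of fusion the abstract describes, the sketch below combines visual and radar feature maps with per-sensor reliability masks and attention weights before a 6-DoF pose head. It is a minimal PyTorch sketch under assumed shapes and module names (ReliabilityMaskedFusion, vis_mask_head, etc.), not the architecture reported in the paper.

```python
# Illustrative sketch only (not the authors' implementation): attention-weighted
# fusion of visual and radar features, each gated by a predicted reliability mask.
import torch
import torch.nn as nn


class ReliabilityMaskedFusion(nn.Module):
    """Fuse visual and radar feature maps, down-weighting unreliable regions."""

    def __init__(self, channels: int = 64):
        super().__init__()
        # Per-sensor heads predicting a [0, 1] reliability mask per spatial cell.
        self.vis_mask_head = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=3, padding=1), nn.Sigmoid()
        )
        self.rad_mask_head = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=3, padding=1), nn.Sigmoid()
        )
        # Attention over the two masked modalities, computed jointly per location.
        self.attention = nn.Sequential(
            nn.Conv2d(2 * channels, 2, kernel_size=1), nn.Softmax(dim=1)
        )
        # Pose regressor: 6-DoF ego-motion (translation + rotation) per frame pair.
        self.pose_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, 6)
        )

    def forward(self, vis_feat: torch.Tensor, rad_feat: torch.Tensor) -> torch.Tensor:
        # Suppress features in regions each sensor flags as unreliable
        # (e.g. rain-degraded image patches, radar multipath artefacts).
        vis_masked = vis_feat * self.vis_mask_head(vis_feat)
        rad_masked = rad_feat * self.rad_mask_head(rad_feat)
        # Attention decides, per spatial location, how much each modality contributes.
        w = self.attention(torch.cat([vis_masked, rad_masked], dim=1))
        fused = w[:, 0:1] * vis_masked + w[:, 1:2] * rad_masked
        return self.pose_head(fused)  # (batch, 6) ego-motion estimate


if __name__ == "__main__":
    model = ReliabilityMaskedFusion(channels=64)
    vis = torch.randn(2, 64, 32, 32)   # toy visual encoder output
    rad = torch.randn(2, 64, 32, 32)   # toy radar encoder output
    print(model(vis, rad).shape)       # torch.Size([2, 6])
```

In this sketch the reliability masks let either sensor be suppressed locally (for example, a rain-blurred image region), while the attention weights arbitrate between whatever remains; the paper's actual network, training losses, and self-supervision signal are not reproduced here.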
Publisher
Nature Machine Intelligence
Published On
Sep 08, 2022
Authors
Yasin Almalioglu, Mehmet Turan, Niki Trigoni, Andrew Markham
Tags
autonomous vehicles
ego-motion estimation
deep learning
sensor fusion
reliability masks
adverse weather
cross-domain generalizability