Deep learning-based robust positioning for all-weather autonomous driving

Engineering and Technology


Y. Almalioglu, M. Turan, et al.

Explore autonomous vehicle research by Yasin Almalioglu, Mehmet Turan, Niki Trigoni, and Andrew Markham. This study introduces a robust, deep learning-based method for ego-motion estimation under adverse weather conditions, fusing visual and radar data to enhance the safety and reliability of autonomous vehicles in all environments.

Abstract
Autonomous vehicles (AVs) require precise localization for safe operation, which is challenging in adverse weather. This paper proposes a self-supervised, deep learning-based approach for robust ego-motion estimation that fuses visual and radar data using an attention mechanism. The method predicts reliability masks to mitigate sensor deficiencies, demonstrating robust all-weather performance and cross-domain generalizability. A game-theoretic interpretability analysis reveals that the sensing modalities exhibit independent failure modes. This work advances AVs towards safe and reliable all-weather autonomous driving.
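
As a rough illustration of the fusion idea described in the abstract, the sketch below shows how per-sensor reliability masks could gate camera and radar feature maps in an attention-style weighting before pose regression. This is not the authors' implementation; the PyTorch module, layer sizes, and names (ReliabilityFusion, mask_net, pose_head) are illustrative assumptions.

# Minimal sketch of attention-based visual-radar fusion with reliability
# masks, assuming PyTorch; module names and dimensions are illustrative only.
import torch
import torch.nn as nn

class ReliabilityFusion(nn.Module):
    def __init__(self, feat_dim=256):
        super().__init__()
        # Per-sensor reliability masks in [0, 1], predicted from the
        # concatenated features (e.g. to down-weight a rain-degraded camera).
        self.mask_net = nn.Sequential(
            nn.Conv2d(2 * feat_dim, feat_dim, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_dim, 2, 1),
            nn.Sigmoid(),
        )
        # Fuse the reliability-weighted features into a single map.
        self.fuse = nn.Conv2d(2 * feat_dim, feat_dim, 1)
        # Pose regressor: 6-DoF ego-motion (3 translation + 3 rotation).
        self.pose_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(feat_dim, 6),
        )

    def forward(self, vis_feat, radar_feat):
        # vis_feat, radar_feat: (B, feat_dim, H, W) feature maps from
        # modality-specific encoders (not shown here).
        joint = torch.cat([vis_feat, radar_feat], dim=1)
        masks = self.mask_net(joint)                  # (B, 2, H, W)
        vis_w = vis_feat * masks[:, 0:1]              # reliability-weighted visual
        radar_w = radar_feat * masks[:, 1:2]          # reliability-weighted radar
        fused = self.fuse(torch.cat([vis_w, radar_w], dim=1))
        pose = self.pose_head(fused)                  # (B, 6) ego-motion
        return pose, masks

# Example usage with dummy encoder outputs:
#   model = ReliabilityFusion()
#   pose, masks = model(torch.randn(1, 256, 32, 32), torch.randn(1, 256, 32, 32))

In the paper's self-supervised setting, such a pose output would be trained indirectly, for example through a view-reconstruction loss, rather than with ground-truth poses.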
Publisher
Nature Machine Intelligence
Published On
Sep 08, 2022
Authors
Yasin Almalioglu, Mehmet Turan, Niki Trigoni, Andrew Markham
Tags
autonomous vehicles
ego-motion estimation
deep learning
sensor fusion
reliability masks
adverse weather
cross-domain generalizability