Non-line-of-sight imaging with arbitrary illumination and detection pattern

Engineering and Technology


X. Liu, J. Wang, et al.

This research by Xintong Liu, Jianyu Wang, Leping Xiao, Zuoqiang Shi, Xing Fu, and Lingyun Qiu introduces a Bayesian framework for non-line-of-sight imaging that delivers high-quality reconstructions even with irregular measurement patterns, substantially broadening its real-world applicability.

~3 min • Beginner • English
Abstract
Non-line-of-sight (NLOS) imaging aims to reconstruct targets obscured from the direct line of sight. Existing NLOS imaging algorithms require dense measurements at regular grid points over a large relay surface, which limits their applicability in practical scenarios such as robotic vision, autonomous driving, rescue operations, and remote sensing. This work proposes a Bayesian framework for NLOS imaging that places no constraints on the spatial pattern of illumination and detection points. By introducing virtual confocal signals, the authors develop a confocal complemented signal-object collaborative regularization (CC-SOCR) algorithm that enables high-quality reconstructions. The approach recovers both albedo and surface normals with fine detail under general relay settings. With a regular relay surface, coarse rather than dense measurements suffice, significantly reducing acquisition time. Experiments demonstrate that this framework substantially extends the application range of NLOS imaging.
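
To make the problem setting concrete, below is a minimal sketch in LaTeX of the standard NLOS transient forward model and a generic Bayesian maximum-a-posteriori (MAP) reconstruction objective of the kind the abstract alludes to. The abstract does not spell out the likelihood, the construction of the virtual confocal signals, or the CC-SOCR regularizers, so the notation here (τ for measured transients, ρ for the hidden albedo, A for a discretized forward operator, R for a prior term with weight λ) is an illustrative assumption rather than the authors' exact formulation.

% Standard NLOS transient forward model (illustrative notation):
% a pulse sent to relay point x_l and a detector focused on relay point x_d
% record the time-resolved third-bounce return from the hidden albedo rho.
\tau(x_l, x_d, t) = \int_{\Omega}
    \frac{\rho(x)}{\lVert x_l - x\rVert^{2}\,\lVert x_d - x\rVert^{2}}\,
    \delta\!\left(\lVert x_l - x\rVert + \lVert x_d - x\rVert - c\,t\right)\mathrm{d}x

% Generic Bayesian (MAP) reconstruction: with a Gaussian noise model,
% maximizing p(rho | tau) \propto p(tau | rho)\, p(rho) reduces to
% regularized least squares over the discretized operator A.
\hat{\rho} = \arg\min_{\rho}\ \tfrac{1}{2}\,\lVert A\rho - \tau\rVert_{2}^{2} + \lambda\, R(\rho)

In this generic form, dropping the dense regular-grid requirement amounts to letting the illumination points x_l and detection points x_d underlying A sit at arbitrary relay positions; the paper's CC-SOCR algorithm additionally couples the recovered object with complemented virtual confocal signals, which the placeholder regularizer R above does not capture.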
Publisher
Nature Communications
Published On
Jun 03, 2023
Authors
Xintong Liu, Jianyu Wang, Leping Xiao, Zuoqiang Shi, Xing Fu, Lingyun Qiu
Tags
Bayesian framework
non-line-of-sight imaging
CC-SOCR algorithm
virtual confocal signals
high-quality reconstructions
irregular measurement patterns