Neural space-time model for dynamic multi-shot imaging


R. Cao, N. S. Divekar, et al.

Discover how Ruiming Cao and colleagues developed a neural space-time model (NSTM) that enhances computational imaging by jointly estimating the scene and its motion dynamics, without data priors or pre-training. This approach removes motion artifacts and recovers motion dynamics directly from the raw measurements, particularly in advanced microscopy techniques.

~3 min • Beginner • English
Abstract
Computational imaging reconstructions from multiple measurements that are captured sequentially often suffer from motion artifacts if the scene is dynamic. We propose a neural space-time model (NSTM) that jointly estimates the scene and its motion dynamics, without data priors or pre-training. Hence, we can both remove motion artifacts and resolve sample dynamics from the same set of raw measurements used for the conventional reconstruction. We demonstrate NSTM in three computational imaging systems: differential phase-contrast microscopy, three-dimensional structured illumination microscopy and rolling-shutter DiffuserCam. We show that NSTM can recover subcellular motion dynamics and thus reduce the misinterpretation of living systems caused by motion artifacts.
Publisher
Nature Methods
Published On
Sep 24, 2024
Authors
Ruiming Cao, Nikita S. Divekar, James K. Nuñez, Srigokul Upadhyayula, Laura Waller
Tags
neural space-time model
computational imaging
motion dynamics
microscopy
motion artifacts
raw measurements
subcellular dynamics