Abstract
Capturing forceful interactions with deformable objects during manipulation benefits applications such as virtual reality, telemedicine, and robotics. This paper presents a visual-tactile recording and tracking system for manipulation, featuring a stretchable tactile glove with 1152 force-sensing channels and a visual-tactile joint learning framework that estimates dynamic hand-object states. An active suppression method based on symmetric response detection and adaptive calibration improves force measurement accuracy by 45.3% (to 97.6%). The learning framework processes visual-tactile sequences and reconstructs hand-object states, achieving an average reconstruction error of 1.8 cm across 24 objects from 6 categories.
Publisher
Nature Communications
Published On
Nov 04, 2024
Authors
Chunpeng Jiang, Wenqiang Xu, Yutong Li, Zhenjun Yu, Longchun Wang, Xiaotong Hu, Zhengyi Xie, Qingkun Liu, Bin Yang, Xiaolin Wang, Wenxin Du, Tutian Tang, Dongzhe Zheng, Siqiong Yao, Cewu Lu, Jingquan Liu
Tags
visual-tactile
manipulation
force measurement
tactile glove
joint learning framework
dynamic hand-object states
adaptive calibration