Multimodal Sensory Learning for Real-time, Adaptive Manipulation

Ahalya Prabhakar, Stanislas Furrer, Lorenzo Panchetti, Maxence Perret, Aude Billard
2021, arXiv preprint
Adaptive control for real-time manipulation requires quick estimation and prediction of object properties. While robot learning in this area primarily focuses on vision, many tasks cannot rely on vision due to object occlusion. Here, we formulate a learning framework that uses multimodal sensory fusion of tactile and audio data to quickly characterize and predict an object's properties. The predictions are used by a reactive controller that adapts the grip on the object to compensate for the predicted inertial forces experienced during motion. Drawing inspiration from how humans interact with objects, we propose an experimental setup from which we can understand how to best utilize different sensory signals and actively interact with and manipulate objects to quickly learn their properties for safe manipulation.
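The pipeline the abstract describes could be sketched as follows: fuse tactile and audio features into one vector, read out an object property (here, mass) with a pre-fit linear model, and set the grip force so that friction safely overcomes the predicted inertial load. This is a minimal illustrative sketch, not the authors' implementation; the feature values, model weights, friction coefficient, and safety margin are all assumed for the example.

```python
import math

def fuse_features(tactile, audio):
    """Concatenate per-modality feature vectors into one fused vector."""
    return list(tactile) + list(audio)

def predict_mass(features, weights, bias):
    """Linear read-out of object mass (kg) from the fused features.
    The weights are assumed to have been fit offline."""
    return sum(w * f for w, f in zip(weights, features)) + bias

def grip_force(mass, accel, mu=0.5, margin=1.2, g=9.81):
    """Normal (grip) force needed so Coulomb friction holds the object
    during motion: F_n >= margin * m * (|a| + g) / mu."""
    return margin * mass * (abs(accel) + g) / mu

# Illustrative inputs: 3 tactile and 2 audio features (values made up).
tactile = [0.8, 0.1, 0.3]            # e.g. contact-pressure statistics
audio = [0.5, 0.2]                   # e.g. impact-sound spectral features
weights = [0.4, 0.1, 0.2, 0.3, 0.1]  # hypothetical pre-fit model weights

mass = predict_mass(fuse_features(tactile, audio), weights, bias=0.05)
force = grip_force(mass, accel=2.0)  # commanded grip force during motion
```

In a real closed-loop setting, `mass` (and other predicted properties) would be re-estimated as new tactile/audio data arrive, and `force` would be updated at the controller's rate.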
arXiv:2110.04634v1