Abstract:
We address the challenge of how a robot can adapt its actions to successfully manipulate objects it has not previously encountered. We introduce Real-time Multisensory Affordance-based Control (RMAC), which enables a robot to adapt existing affordance models using multisensory inputs. We show that combining haptic, audio, and visual information with RMAC allows the robot to learn affordance models and adaptively manipulate two very different objects (a drawer and a lamp) in multiple novel configurations. Offline and real-time online evaluations show that RMAC allows the robot to accurately open different drawer configurations and turn on novel lamps with an average accuracy of 75%.
Date of Conference: 20-24 May 2019
Date Added to IEEE Xplore: 12 August 2019