ABSTRACT
In robot programming by demonstration, dealing with the high-dimensional data produced by human demonstrations often requires embedding prior knowledge about which variables should be retained and why. This paper proposes an approach for automating robot learning through the detection of causal relations among the variables recorded during demonstration. This allows us to infer a notion of coherence and coordination between multiple systems that apparently operate independently. We test the approach on a bimanual scooping task consisting of multiple phases. We detect the coordination between the two arms, between the arms and the hands, and between the fingers of each hand, and we observe how these coordination patterns change throughout the task.
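The causality detection described above is commonly implemented as pairwise Granger tests on the recorded time series: one signal "Granger-causes" another if its past values improve prediction of the other beyond that signal's own past. Below is a minimal NumPy sketch of such a test, assuming two 1-D trajectories (e.g. one joint angle per arm); the function name `granger_f` and the synthetic leader/follower data are illustrative, not the paper's implementation.

```python
import numpy as np

def granger_f(effect, cause, lag):
    """F-statistic testing whether past values of `cause` help predict
    `effect` beyond `effect`'s own past (pairwise Granger causality)."""
    n = len(effect)
    Y = effect[lag:]
    # Restricted model: intercept + `lag` past values of the effect itself.
    Xr = np.column_stack(
        [np.ones(n - lag)] + [effect[lag - k:n - k] for k in range(1, lag + 1)]
    )
    # Unrestricted model: additionally the past values of the candidate cause.
    Xu = np.column_stack(
        [Xr] + [cause[lag - k:n - k] for k in range(1, lag + 1)]
    )
    rss_r = np.sum((Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - Xu @ np.linalg.lstsq(Xu, Y, rcond=None)[0]) ** 2)
    df_num, df_den = lag, len(Y) - Xu.shape[1]
    return ((rss_r - rss_u) / df_num) / (rss_u / df_den)

# Synthetic "leader/follower" pair: y tracks x with a one-step delay.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

f_xy = granger_f(y, x, lag=2)  # x -> y: large F (strong coupling)
f_yx = granger_f(x, y, lag=2)  # y -> x: F near 1 (no influence)
```

A large F-statistic in one direction but not the other suggests a leader/follower relation, e.g. the dominant arm driving the supporting arm during one phase of the scooping task.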
Index Terms: Learning Bimanual Coordinated Tasks From Human Demonstrations