
Trial and Error: Using Previous Experiences as Simulation Models in Humanoid Motor Learning


Abstract:

Biological systems can efficiently reuse previous experiences to change their behavioral strategies, for example to avoid enemies or to find food, and this reuse greatly reduces the number of samples from the real environment needed to improve a behavioral policy. For real robotic systems as well, it is desirable to rely on only a limited number of real-environment samples, both because of the limited durability of the hardware and to reduce the time required to improve control performance. In this article, we use previous experiences as local models of the environment so that the movement policy of a humanoid robot can be efficiently improved with a limited number of samples from its real environment. We applied the proposed learning method to a real humanoid robot and successfully achieved two challenging control tasks. First, the robot acquired a policy for a cart-pole swing-up task in a real-virtual hybrid environment, in which it waves a PlayStation (PS) Move motion controller to drive a cart-pole in a virtual simulator. Second, we applied the method to a challenging basketball-shooting task in a real environment.
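To make the underlying idea concrete, the sketch below shows one generic way to reuse stored real-world transitions as a local simulation model for policy improvement. It is not the article's actual algorithm: the toy point-mass environment, the linear policy, the nearest-neighbor local model, and the random-search policy update are all illustrative assumptions.

# Hypothetical sketch: reuse stored real transitions as a local simulation
# model so that policy improvement needs only a few real-environment samples.
# The toy environment, linear policy, and nearest-neighbor model below are
# illustrative assumptions, not the article's actual implementation.
import numpy as np

rng = np.random.default_rng(0)

def real_step(x, u):
    """Toy 'real' dynamics (stand-in for the robot): a damped point mass."""
    pos, vel = x
    vel = 0.95 * vel + 0.1 * u
    pos = pos + 0.1 * vel
    return np.array([pos, vel])

def collect_real_rollout(policy, x0, horizon=50):
    """Gather (state, action, next_state) samples from the real system."""
    data, x = [], np.array(x0, dtype=float)
    for _ in range(horizon):
        u = float(policy @ np.append(x, 1.0))
        x_next = real_step(x, u)
        data.append((x.copy(), u, x_next.copy()))
        x = x_next
    return data

def local_model_step(x, u, data, k=5):
    """Predict the next state from the k nearest stored experiences
    (a simple locally weighted average of observed state changes)."""
    q = np.append(x, u)
    keys = np.array([np.append(s, a) for s, a, _ in data])
    deltas = np.array([s_next - s for s, _, s_next in data])
    d = np.linalg.norm(keys - q, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-6)
    return x + (w[:, None] * deltas[idx]).sum(axis=0) / w.sum()

def simulated_return(policy, data, x0, horizon=50):
    """Evaluate a candidate policy purely inside the learned local model."""
    x, ret = np.array(x0, dtype=float), 0.0
    for _ in range(horizon):
        u = float(policy @ np.append(x, 1.0))
        x = local_model_step(x, u, data)
        ret -= (x[0] - 1.0) ** 2 + 0.01 * u ** 2   # drive position toward 1.0
    return ret

# 1) Collect a small amount of real data with an initial (random) policy.
policy = rng.normal(scale=0.1, size=3)
data = collect_real_rollout(policy, x0=[0.0, 0.0])

# 2) Improve the policy with cheap simulated rollouts (random search here),
#    touching the real system only to append a little fresh data per iteration.
for it in range(10):
    best = simulated_return(policy, data, x0=[0.0, 0.0])
    for _ in range(30):
        cand = policy + rng.normal(scale=0.05, size=3)
        r = simulated_return(cand, data, x0=[0.0, 0.0])
        if r > best:
            policy, best = cand, r
    data += collect_real_rollout(policy, x0=[0.0, 0.0])  # few real samples
    print(f"iteration {it}: simulated return {best:.2f}")

The point of the sketch is the sample budget: most policy evaluations happen inside the model built from previously experienced transitions, while the real system is queried only to add a short rollout of fresh data per iteration.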
Published in: IEEE Robotics & Automation Magazine (Volume: 23, Issue: 1, March 2016)
Page(s): 96-105
Date of Publication: 24 February 2016
