
JRM Vol.19 No.2, pp. 212-222, 2007
doi: 10.20965/jrm.2007.p0212

Paper:

Plastic-Bottle-Based Robots in Educational Robotics Courses – Understanding Embodied Artificial Intelligence –

Kojiro Matsushita, Hiroshi Yokoi, and Tamio Arai

Dept. of Precision Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

Received:
November 2, 2006
Accepted:
February 6, 2007
Published:
April 20, 2007
Keywords:
edutainment, embodied artificial intelligence, locomotion, morphology, EMG
Abstract
In this paper, we introduce an educational robotics approach featuring a unique robotic development kit - hardware, software, and instructions - that concretely encourages student interest in and curiosity about science and technology. The kit was developed based on practical policies: easy construction, low cost, creative activity, and enjoyable education. It uses common materials such as plastic bottles, RC servomotors, and hot glue, and provides three different controllers with instructions - a sensor-motor controller, an electromyography (EMG) interface controller, and a teaching-playback controller. The kit thus offers more customizable access to both robot structure and control architecture than similar kits and encourages students to engage creatively. The three robotics courses for undergraduates and graduates we have conducted thus far to provide an understanding of robotics and embodied artificial intelligence (AI) have confirmed that some of the locomotive robots explicitly exploit their own dynamics - also known as "morpho-functionality" - an embodied AI concept. An evaluation of this approach for course hours, task achievement, student interest, and the influence of assistance confirmed that students worked creatively in such robotics courses.
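Of the three controllers the abstract names, the teaching-playback controller is the simplest to illustrate: the student moves the robot by hand while joint angles are sampled, and the recorded sequence is then replayed through the servos. The sketch below is a minimal, hypothetical illustration of that idea; the class and function names (`TeachingPlayback`, `read_angles`, `write_angles`) are assumptions for illustration, not the kit's actual API.

```python
import time

class TeachingPlayback:
    """Minimal teaching-playback sketch: sample servo angles at a fixed
    period during a demonstration, then replay them in the same order.
    Hardware access is abstracted behind caller-supplied functions."""

    def __init__(self, period_s=0.1):
        self.period_s = period_s   # sampling / replay interval in seconds
        self.trajectory = []       # recorded list of per-servo angle tuples

    def record(self, read_angles, n_samples):
        """Sample `n_samples` angle tuples from `read_angles()`."""
        for _ in range(n_samples):
            self.trajectory.append(tuple(read_angles()))
            time.sleep(self.period_s)

    def play(self, write_angles):
        """Replay the stored trajectory through `write_angles(angles)`."""
        for angles in self.trajectory:
            write_angles(angles)
            time.sleep(self.period_s)
```

In use, `read_angles` would query the RC servo feedback (or a potentiometer) and `write_angles` would command the servos; keeping both as plain callables lets students swap in whatever hardware layer their robot uses.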
Cite this article as:
K. Matsushita, H. Yokoi, and T. Arai, “Plastic-Bottle-Based Robots in Educational Robotics Courses – Understanding Embodied Artificial Intelligence –,” J. Robot. Mechatron., Vol.19 No.2, pp. 212-222, 2007.
References
[1] K-TEAM Corporation website, http://www.k-team.com/
[2] Sony website, http://www.sony.net/Products/aibo/index.html
[3] KONDO KAGAKU CO., LTD. website, http://www.kondo-robot.com/html/Info_En.html
[4] The LEGO Group website, http://mindstorms.lego.com/
[5] Center for Engineering Education Outreach at Tufts University website, http://www.ceeo.tufts.edu/
[6] C. Paul, V. Hafner, and J. C. Bongard, “Teaching New Artificial Intelligence using Constructionist Edutainment Robots,” Workshop on Edutainment Robots, 2000.
[7] J. Wakeman-Linn and A. Perry, “A proposal to incorporate LEGO® Mindstorms into an introduction to engineering course,” Proc. ASEE Annual Conference and Exposition: Vive L’ingenieur, pp. 9231-9238, 2002.
[8] P. S. Brian, J. J. Wood, and D. Hansen, “Teaching undergraduate kinetics using LEGO® Mindstorms racecar competition,” Proc. 2004 ASEE Annual Conference and Exposition, “Engineering Education Research New Heights,” pp. 13841-13848, 2004.
[9] E. Wang and R. Wang, “Using LEGOs and RoboLab (LabVIEW) with elementary school children,” Proc. Frontiers in Education Conference, Vol.1, T2E/11, 2001.
[10] H. H. Lund and L. Pagliarini, “RoboCup Jr. with LEGO MINDSTORMS,” Proc. IEEE International Conference on Robotics and Automation, pp. 813-819, 2000.
[11] U. Petersen, M. Müllerburg, and G. Theidig, “Girls and Robots – A Promising Alliance,” ERCIM News No.53, pp. 32-33, 2003.
[12] M. Müllerburg, U. Petersen, and G. Theidig, “Roboter in Bildung und Ausbildung” (“Robots in Education and Training”), in M. Calm (Ed.), FiNuT – 28. Kongress von Frauen in Naturwissenschaft und Technik, Darmstadt: FIT-Verlag, pp. 227-234, Dec. 2002, ISBN 3-933611-28-8.
[13] R. Pfeifer and C. Scheier, “Understanding Intelligence,” MIT Press, 1999.
[14] H. Yokoi, A. H. Arieta, R. Katoh, W. Yu, I. Watanabe, and M. Maruishi, “Mutual Adaptation in a Prosthetics Application,” LNAI 3139: Embodied Artificial Intelligence, Springer, ISBN 3-540-22484-X, pp. 146-159, 2004.
[15] H. Kazerooni, “Berkeley Lower Extremity Exoskeleton (BLEEX),” Mechanical Engineering Department, U.C. Berkeley, http://www.me.berkeley.edu, 2004.
[16] H. Kawamoto and Y. Sankai, “EMG-based hybrid assistive leg for walking aid using feedforward controller,” Int. Conference on Control, Automation and Systems, pp. 190-193, 2001.
[17] R. Brooks and L. Stein, “Building Brains for Bodies,” Autonomous Robots, Vol.1, Issue 1, pp. 7-25, 1994.
[18] K. Sims, “Evolving Virtual Creatures,” Computer Graphics Annual Conference Proceedings, pp. 43-50, 1994.
[19] M. Hirose, Y. Haikawa, T. Takenaka, and K. Hirai, “Development of Humanoid Robot ASIMO,” Proc. Int. Conference on Intelligent Robots and Systems, 2001.
[20] T. McGeer, “Passive dynamic walking,” Int. J. Robotics Research, Vol.9, No.2, pp. 62-82, 1990.
[21] S. Collins and A. Ruina, “A bipedal walking robot with efficient and human-like gait,” Proc. IEEE International Conference on Robotics and Automation, 2005.
[22] F. Hara and R. Pfeifer, “On the relation among morphology, material and control in morpho-functional machines,” in Meyer, Berthoz, Floreano, Roitblat, and Wilson (Eds.), “From Animals to Animats 6,” Proc. of the Sixth International Conference on Simulation of Adaptive Behavior, pp. 33-40, 2000.
[23] R. Pfeifer and J. Bongard, “How the Body Shapes the Way We Think: A New View of Intelligence,” MIT Press, 2006.
[24] K. Matsushita, M. Lungarella, C. Paul, and H. Yokoi, “Locomoting with less computation but more morphology,” Proc. 2005 International Conference on Robotics and Automation, pp. 2020-2025, 2005.
[25] R. Alexander, “Principles of Animal Locomotion,” Princeton University Press, 2002.
[26] S. Vogel, “Cat’s Paws and Catapults,” W. W. Norton & Company, 1998.

