
JRM Vol.24 No.1, pp. 235-243 (2012)
doi: 10.20965/jrm.2012.p0235

Development Report:

Design and Development of Human Interface System with 3D Measurement Functions (Concept and Basic Experiments)

Jianming Yang and Takashi Imura

Faculty of Science and Technology, Meijo University, 1-501 Shiogamaguchi, Tempaku-ku, Nagoya 468-8502, Japan

Received: February 7, 2011
Accepted: June 6, 2011
Published: February 20, 2012
Keywords: three-dimensional measurement, 3D joystick, human interface, coordinate transformation, presenting information
Abstract
In this paper, an intelligent interface system for a wheelchair robot with an arm is described. The human interface system consists of an interface device and a signal processing module. The interface device, an original idea of the authors, combines a joystick, a Position Sensitive Detector (PSD), and a CCD camera, and can perform three-dimensional measurement in addition to the basic functions of a joystick. A principle for measuring three-dimensional positions is proposed, together with a method of presenting information through a Graphical User Interface (GUI) so that the operator can verify and understand the instructions. A basic experiment is conducted to demonstrate the effectiveness of the intelligent operating system.
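As a rough illustration of the measurement idea summarized above, the following sketch combines a triangulation-style PSD range reading with a homogeneous coordinate transformation from the interface-device frame (oriented by the joystick's pan/tilt angles) into the wheelchair base frame. This is a minimal sketch only: the sensor geometry, parameter names, values, and frame conventions are assumptions for illustration and are not taken from the paper.

import numpy as np

# Hypothetical geometry of a PSD-based triangulation range sensor on the
# interface device (assumed values, not from the paper).
BASELINE_M = 0.03    # assumed emitter-to-lens baseline
FOCAL_LEN_M = 0.01   # assumed focal length of the PSD receiving lens

def psd_range(spot_offset_m: float) -> float:
    """Triangulation: the range to the target is inversely proportional to the
    displacement of the light spot on the PSD surface."""
    return BASELINE_M * FOCAL_LEN_M / spot_offset_m

def device_to_base(point_device: np.ndarray, pan_rad: float, tilt_rad: float,
                   offset_base: np.ndarray) -> np.ndarray:
    """Transform a point measured in the device frame into the wheelchair base
    frame via a homogeneous transformation built from the joystick pan/tilt
    angles and the device mounting offset (all assumed)."""
    cp, sp = np.cos(pan_rad), np.sin(pan_rad)
    ct, st = np.cos(tilt_rad), np.sin(tilt_rad)
    rot_pan = np.array([[cp, -sp, 0.0], [sp, cp, 0.0], [0.0, 0.0, 1.0]])
    rot_tilt = np.array([[ct, 0.0, st], [0.0, 1.0, 0.0], [-st, 0.0, ct]])
    T = np.eye(4)
    T[:3, :3] = rot_pan @ rot_tilt
    T[:3, 3] = offset_base
    return (T @ np.append(point_device, 1.0))[:3]

# Example: a target 0.5 m along the device axis, joystick panned 30 deg, tilted 10 deg.
z = psd_range(spot_offset_m=0.0006)   # 0.5 m with the assumed geometry
target = device_to_base(np.array([0.0, 0.0, z]),
                        pan_rad=np.radians(30.0), tilt_rad=np.radians(10.0),
                        offset_base=np.array([0.1, 0.0, 0.8]))
print(target)

In such a scheme, the measured target point expressed in the base frame could then be passed to the arm controller and echoed back to the operator through the GUI for confirmation.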
Cite this article as:
J. Yang and T. Imura, “Design and Development of Human Interface System with 3D Measurement Functions (Concept and Basic Experiments),” J. Robot. Mechatron., Vol.24 No.1, pp. 235-243, 2012.

