JACIII Vol.14 No.7 pp. 758-769
doi: 10.20965/jaciii.2010.p0758
(2010)

Paper:

Attentive Deskwork Support System

Yusuke Tamura, Masao Sugi, Tamio Arai, and Jun Ota

The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 Japan

Received: March 31, 2010
Accepted: August 8, 2010
Published: November 20, 2010
Keywords: human-robot interaction, attentive user interfaces, deskwork
Abstract
We propose an attentive deskwork support system that quickly delivers required objects to people working at desks. To this end, we propose methods for understanding a user’s request for support. To detect the presence of a request, the system exploits the characteristics of the user’s hand and eye movements and recognizes hand-reaching motions. To understand the content of the request, the system integrates sensory and contextual information using a probabilistic model. Finally, the system determines the point of delivery by predicting the user’s hand movement and delivers the required objects using self-moving trays. Experiments are conducted to evaluate the usefulness of the proposed system.
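As a rough illustration of the kind of probabilistic cue integration described in the abstract, the following Python sketch fuses a context-based prior with per-cue likelihoods (gaze direction and hand-reaching direction) via a naive-Bayes-style product. The object names and probability values are invented for illustration and do not reproduce the paper's actual model.

    # Illustrative sketch only: fusing sensory and contextual cues to estimate
    # which object a user is requesting. All names and numbers are hypothetical.

    OBJECTS = ["pen", "stapler", "scissors"]

    def posterior(prior, likelihoods):
        """Combine a context prior with per-cue likelihoods and normalize."""
        scores = {}
        for obj in OBJECTS:
            p = prior[obj]
            for cue in likelihoods:
                p *= cue[obj]
            scores[obj] = p
        total = sum(scores.values())
        return {obj: p / total for obj, p in scores.items()}

    # Hypothetical inputs: the recent task context gives the prior, while gaze
    # and hand-reaching direction each give a likelihood per candidate object.
    context_prior   = {"pen": 0.5, "stapler": 0.3, "scissors": 0.2}
    gaze_likelihood = {"pen": 0.2, "stapler": 0.7, "scissors": 0.1}
    hand_likelihood = {"pen": 0.3, "stapler": 0.6, "scissors": 0.1}

    print(posterior(context_prior, [gaze_likelihood, hand_likelihood]))

In this toy example the context favors the pen, but the gaze and hand cues together shift the posterior toward the stapler, which is the object the system would then deliver.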
Cite this article as:
Y. Tamura, M. Sugi, T. Arai, and J. Ota, “Attentive Deskwork Support System,” J. Adv. Comput. Intell. Intell. Inform., Vol.14 No.7, pp. 758-769, 2010.
