
JRM Vol.29 No.1 pp. 146-153 (2017)
doi: 10.20965/jrm.2017.p0146

Paper:

Development of a Robotic Pet Using Sound Source Localization with the HARK Robot Audition System

Ryo Suzuki, Takuto Takahashi, and Hiroshi G. Okuno

Waseda University
Lambdax Bldg 3F, 2-4-12 Okubo, Shinjuku, Tokyo 169-0072, Japan

Received: July 20, 2016
Accepted: December 15, 2016
Published: February 20, 2017
Keywords: robot audition, sound source localization, robotic pet
Abstract
We have developed a self-propelled robotic pet on which the robot audition software HARK (Honda Research Institute Japan Audition for Robots with Kyoto University) is installed to provide sound source localization, enabling the robot to move toward sound sources. The robot, which carries neither cameras nor speakers, communicates with humans using only its own movements and the surrounding audio information obtained through its microphone. Field experiments, in which participants gained hands-on experience with the robot, confirmed that participants behaved and felt as if they were touching a real pet. We also found that its high-precision sound source localization can promote and facilitate human-robot interaction.
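As a rough illustration of the behavior described above (the robot steering toward the most prominent sound source reported by its localization module), the sketch below shows one possible control loop. It is a minimal sketch under stated assumptions: localization_frames(), set_wheel_speeds(), and the threshold/gain constants are hypothetical names and values introduced here for illustration; they are not taken from the paper or from the HARK API.

```python
from typing import Iterable, List, Tuple

# Hypothetical stand-in for the localization stream: in the actual system,
# direction-of-arrival estimates would come from HARK's sound source
# localization output; here a few frames are hard-coded for illustration.
def localization_frames() -> Iterable[List[Tuple[float, float]]]:
    yield [(-30.0, 28.0), (45.0, 31.5)]  # two sources; the one at +45 deg is louder
    yield [(40.0, 33.0)]                 # a single source, off to one side
    yield []                             # silence: no sources detected


def set_wheel_speeds(left: float, right: float) -> None:
    # Hypothetical motor interface; print instead of driving real hardware.
    print(f"wheels: left={left:+.2f} right={right:+.2f}")


POWER_THRESHOLD = 25.0  # ignore weak or noisy detections (illustrative value)
TURN_GAIN = 0.02        # proportional gain mapping azimuth to wheel-speed difference
FORWARD_SPEED = 0.3     # nominal forward speed while approaching a source

for sources in localization_frames():
    # Keep only detections whose power exceeds the threshold.
    strong = [(azimuth, power) for azimuth, power in sources if power >= POWER_THRESHOLD]
    if not strong:
        set_wheel_speeds(0.0, 0.0)  # nothing heard loudly enough: stay put
        continue
    # Approach the loudest source by steering toward its azimuth
    # (the azimuth sign convention is an assumption of this sketch).
    azimuth, _ = max(strong, key=lambda s: s[1])
    turn = TURN_GAIN * azimuth
    set_wheel_speeds(FORWARD_SPEED + turn, FORWARD_SPEED - turn)
```

In practice, HARK supplies the direction-of-arrival estimates (e.g., from MUSIC-based localization) and the robot's drive system replaces the printing stub; the thresholding and proportional steering shown here are only one plausible way to turn localization output into motion.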
Children calling Cocoron to come closer


Cite this article as:
R. Suzuki, T. Takahashi, and H. G. Okuno, “Development of a Robotic Pet Using Sound Source Localization with the HARK Robot Audition System,” J. Robot. Mechatron., Vol.29 No.1, pp. 146-153, 2017.
