A multimodal language to communicate with life-supporting robots through a touch screen and a speech interface

Original Article · Artificial Life and Robotics

Abstract

This article proposes a multimodal language for communicating with life-supporting robots through a touch screen and a speech interface. The language is designed for untrained users who need support in their daily lives from cost-effective robots. It allows users to combine spoken and pointing messages interactively in order to convey their intentions to a robot. Spoken messages are verb and noun phrases that describe those intentions. Pointing messages are given when the user's finger touches a camera image, a picture of the robot's body, or a button on a touch screen at hand; they convey a location in the environment, a direction, a body part of the robot, a cue, a reply to a query, or other information that helps the robot. This work presents the philosophy and structure of the language.
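The page carries no technical detail beyond the abstract, so the following is only a rough illustration of the idea it describes. It is a minimal sketch, assuming a fused command pairs one spoken verb/noun-phrase message with zero or more pointing messages; every name in it (SpokenMessage, PointingMessage, MultimodalCommand, and so on) is hypothetical and does not come from the paper.

```python
# Hypothetical sketch only: one plausible way to model a command that fuses
# a spoken verb/noun phrase with pointing messages from a touch screen.
# None of these names or structures are taken from the paper.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional, Tuple


class PointingTarget(Enum):
    """Surfaces the abstract says a user may touch."""
    CAMERA_IMAGE = "camera_image"   # a point in the robot's camera view
    BODY_PICTURE = "body_picture"   # a part of a picture of the robot's body
    BUTTON = "button"               # an on-screen button (cue, reply, ...)


@dataclass
class PointingMessage:
    target: PointingTarget
    position: Optional[Tuple[int, int]] = None  # pixel coordinates, if any
    label: Optional[str] = None                 # e.g. button name or body part


@dataclass
class SpokenMessage:
    verb: str                                              # e.g. "bring"
    noun_phrases: List[str] = field(default_factory=list)  # e.g. ["the cup"]


@dataclass
class MultimodalCommand:
    """A spoken message interactively combined with pointing messages."""
    speech: SpokenMessage
    pointing: List[PointingMessage] = field(default_factory=list)

    def describe(self) -> str:
        points = ", ".join(
            f"{p.target.value}@{p.position or p.label}" for p in self.pointing
        )
        return f"{self.speech.verb} {' '.join(self.speech.noun_phrases)} [{points}]"


# Example: saying "bring the cup" while touching a spot in the camera image.
cmd = MultimodalCommand(
    speech=SpokenMessage(verb="bring", noun_phrases=["the cup"]),
    pointing=[PointingMessage(PointingTarget.CAMERA_IMAGE, position=(212, 148))],
)
print(cmd.describe())  # bring the cup [camera_image@(212, 148)]
```

In this sketch the pointing messages disambiguate the spoken phrase: a touch on the camera image grounds "the cup" to a location in the environment, while a button press could instead carry a cue or a reply to a query, as the abstract describes.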

Author information

Correspondence to T. Oka.

Additional information

This work was presented in part at the 16th International Symposium on Artificial Life and Robotics, Oita, Japan, January 27–29, 2011.

Cite this article

Oka, T., Matsumoto, H. & Kibayashi, R. A multimodal language to communicate with life-supporting robots through a touch screen and a speech interface. Artif Life Robotics 16, 292–296 (2011). https://doi.org/10.1007/s10015-011-0924-x
