Annotation of Utterances for Conversational Nonverbal Behaviors

  • Conference paper
Social Robotics (ICSR 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9979)

Abstract

Nonverbal behaviors play an important role in communication for both humans and social robots. However, adding contextually appropriate animations by hand is time consuming and does not scale well. Previous researchers have developed automated systems that insert animations based on utterance text alone, but these systems lack a human's understanding of social context and are still being improved. This work proposes a middle ground in which untrained human workers label semantic information, which is then input to an automatic system that produces appropriate gestures. To test this approach, untrained workers from Mechanical Turk labeled semantic information, specifically emotion and emphasis, for each utterance, and these labels were used to automatically add animations. Videos of a robot performing the animated dialogue were rated by a second set of participants. The results showed that untrained workers can provide reasonable labels of semantic information, and that emotional expressions derived from those labels were rated more highly than control videos. More study is needed to determine the effects of the emphasis labels.
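The abstract describes a two-stage pipeline: crowd workers annotate each utterance with emotion and emphasis labels, and an automatic system turns those labels into robot animations. The sketch below illustrates one way such a pipeline could work, assuming simple majority voting over worker labels and a BML-style inline markup; the emotion set, class names, and tag format are illustrative assumptions, not the authors' actual system.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class UtteranceLabels:
    """Crowd-sourced annotations for one utterance (field names are illustrative)."""
    text: str
    emotion_votes: list[str]   # one emotion label per Mechanical Turk worker
    emphasis_votes: list[int]  # index of the word each worker marked as emphasized


def majority(votes):
    """Aggregate worker labels by simple majority vote (an assumed scheme)."""
    return Counter(votes).most_common(1)[0][0]


def to_markup(utt: UtteranceLabels) -> str:
    """Turn aggregated labels into BML-style inline tags that a robot's behavior
    engine could map to an emotional expression plus a beat gesture on the
    emphasized word. The tag format here is hypothetical."""
    emotion = majority(utt.emotion_votes)
    emphasized = majority(utt.emphasis_votes)
    words = utt.text.split()
    words[emphasized] = f"<emphasis>{words[emphasized]}</emphasis>"
    return f'<emotion type="{emotion}">{" ".join(words)}</emotion>'


if __name__ == "__main__":
    utt = UtteranceLabels(
        text="I am so glad you came to visit",
        emotion_votes=["happy", "happy", "neutral"],
        emphasis_votes=[2, 2, 3],
    )
    print(to_markup(utt))
    # -> <emotion type="happy">I am <emphasis>so</emphasis> glad you came to visit</emotion>
```

Majority voting is only one plausible aggregation choice; the paper's actual method for reconciling disagreeing workers is not specified in this abstract.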

Acknowledgements

We are thankful to Disney Research and The Walt Disney Corporation for support of this research effort. This material is based upon research supported by (while Dr. Simmons was serving at) the National Science Foundation.

Author information

Corresponding author

Correspondence to Allison Funkhouser.

Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Funkhouser, A., Simmons, R. (2016). Annotation of Utterances for Conversational Nonverbal Behaviors. In: Agah, A., Cabibihan, JJ., Howard, A., Salichs, M., He, H. (eds) Social Robotics. ICSR 2016. Lecture Notes in Computer Science(), vol 9979. Springer, Cham. https://doi.org/10.1007/978-3-319-47437-3_51

  • DOI: https://doi.org/10.1007/978-3-319-47437-3_51

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-47436-6

  • Online ISBN: 978-3-319-47437-3

  • eBook Packages: Computer Science, Computer Science (R0)