Expressive interfaces

Original Article

Cognition, Technology & Work

Abstract

Analysis of expressiveness in human gesture can lead to new paradigms for the design of improved human-machine interfaces, thus enhancing users’ participation and experience in mixed reality applications and context-aware mediated environments. The development of expressive interfaces that decode the affective information gestures convey opens novel perspectives in the design of interactive multimedia systems in several application domains: performing arts, museum exhibits, edutainment, entertainment, therapy, and rehabilitation. This paper describes recent developments in our research on expressive interfaces by presenting computational models and algorithms for the real-time analysis of expressive gestures in human full-body movement. Such analysis is discussed both as an example and as a basic component for the development of effective expressive interfaces. As a concrete result of our research, we developed the software platform EyesWeb (http://www.eyesweb.org). Besides supporting research, EyesWeb has also been employed as an open platform for developing real-time interactive applications.
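
To make the kind of processing involved more tangible, the following is a minimal, illustrative sketch in Python with NumPy. It is not EyesWeb code; the function name quantity_of_motion and the threshold and window parameters are hypothetical choices of this exposition. The sketch estimates an overall motion-energy cue from differences between consecutive grayscale frames, one of the simplest low-level cues on which expressive-gesture analysis can build.

    import numpy as np

    def quantity_of_motion(frames, threshold=15, window=4):
        """Rough per-frame motion-energy cue from grayscale video.

        frames: sequence of 2-D uint8 arrays showing the mover against a
        mostly static background. Returns one value per consecutive frame
        pair: the fraction of pixels that changed in any of the last
        `window` frame differences (a crude motion-image idea).
        """
        masks, qom = [], []
        for prev, curr in zip(frames, frames[1:]):
            # Pixels whose intensity changed noticeably between frames.
            moved = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > threshold
            masks.append(moved)
            # Union of the most recent motion masks, then its area ratio.
            qom.append(np.logical_or.reduce(masks[-window:]).mean())
        return qom

    # Usage: a synthetic "mover" -- a bright square drifting to the right.
    frames = []
    for t in range(20):
        img = np.zeros((64, 64), dtype=np.uint8)
        img[20:30, 2 + 2 * t : 12 + 2 * t] = 255
        frames.append(img)
    print([round(v, 3) for v in quantity_of_motion(frames)])

In a real-time system, such a per-frame cue would typically be smoothed over time and combined with further descriptors (e.g., contraction or expansion of the body silhouette) before being mapped onto expressive qualities.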

Acknowledgments

We thank our colleagues at the DIST – InfoMus Lab, particularly Paolo Coletta, Massimiliano Peri, Matteo Ricchetti, Andrea Ricci, and Riccardo Trocca. We also thank Ingrid Lagerlöf and Marie Djerf from Uppsala University, who collaborated in the analysis of basic emotions in dance fragments.

This research is partially funded by the EU IST Project MEGA (Multisensory Expressive Gesture Applications), no. IST-1999-20410 (http://www.megaproject.org).

Author information

Correspondence to Antonio Camurri.

Cite this article

Camurri, A., Mazzarino, B. & Volpe, G. Expressive interfaces. Cogn Tech Work 6, 15–22 (2004). https://doi.org/10.1007/s10111-003-0138-7
