DOI: 10.1145/3136755.3136757

Low-intrusive recognition of expressive movement qualities

Published: 03 November 2017

Abstract

In this paper we present a low-intrusive approach to the detection of expressive full-body movement qualities. We focus on two qualities, Lightness and Fragility, and detect them using data captured by four wearable devices placed on the forearms: two Inertial Measurement Units (IMUs) and two electromyography (EMG) sensors. The work presented in the paper stems from a close collaboration with expressive movement experts (e.g., contemporary dance choreographers) aimed at defining a vocabulary of basic movement qualities. We recorded 13 dancers performing movements expressing the qualities under investigation. The recordings were then segmented, and the perceived level of each quality in each segment was rated by 5 experts on a 5-point Likert scale. We obtained a dataset of 150 movement segments expressing Fragility and/or Lightness. In the second part of the paper, we define a set of features on the IMU and EMG data, extract them from the recorded corpus, and apply a set of supervised machine learning techniques to classify the segments. The best results on the whole dataset were obtained with a Naive Bayes classifier for Lightness (F-score 0.77) and with a Support Vector Machine classifier for Fragility (F-score 0.77). Our approach can be used in ecological contexts, e.g., during artistic performances.
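
The following is a minimal sketch of the kind of supervised classification step the abstract describes: per-segment feature vectors derived from the IMU and EMG signals, a Naive Bayes classifier for Lightness and an SVM for Fragility, evaluated with the F-score. It assumes scikit-learn and uses synthetic placeholder features and labels; the feature definitions, cross-validation scheme, and hyperparameters are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch of the classification step described in the abstract.
# Placeholder data stands in for the real per-segment IMU/EMG features.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# 150 segments, each summarized by a few statistics over the IMU and EMG
# streams (e.g., mean/std of acceleration magnitude, EMG RMS); the exact
# feature set used in the paper is not reproduced here.
X = rng.normal(size=(150, 8))
y_lightness = rng.integers(0, 2, 150)  # segment expresses Lightness (1) or not (0)
y_fragility = rng.integers(0, 2, 150)  # segment expresses Fragility (1) or not (0)

# Naive Bayes for Lightness, SVM for Fragility, scored with the F-measure
# (the abstract reports F-scores of 0.77 for both qualities).
nb_f1 = cross_val_score(GaussianNB(), X, y_lightness, cv=5, scoring="f1")
svm_f1 = cross_val_score(SVC(kernel="rbf", C=1.0), X, y_fragility, cv=5, scoring="f1")

print(f"Lightness (Naive Bayes) mean F-score: {nb_f1.mean():.2f}")
print(f"Fragility (SVM)         mean F-score: {svm_f1.mean():.2f}")
```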



      Published In

      ICMI '17: Proceedings of the 19th ACM International Conference on Multimodal Interaction
      November 2017
      676 pages
      ISBN: 9781450355438
      DOI: 10.1145/3136755

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Author Tags

      1. EMG
      2. HCI
      3. IMU
      4. dance
      5. expressive qualities

      Qualifiers

      • Research-article

      Conference

      ICMI '17

      Acceptance Rates

      ICMI '17 Paper Acceptance Rate: 65 of 149 submissions, 44%
      Overall Acceptance Rate: 453 of 1,080 submissions, 42%

      Cited By

      • (2024) Diffusion-Based Unsupervised Pre-training for Automated Recognition of Vitality Forms. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3656650.3656689. Online publication date: 3-Jun-2024.
      • (2024) “It’s Like Being on Stage”: Staging an Improvisational Haptic-Installed Contemporary Dance Performance. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction, 507-518. DOI: 10.1007/978-3-031-70058-3_41. Online publication date: 30-Jun-2024.
      • (2023) Embracing the messy and situated practice of dance technology design. Proceedings of the 2023 ACM Designing Interactive Systems Conference, 1383-1397. DOI: 10.1145/3563657.3596078. Online publication date: 10-Jul-2023.
      • (2023) Affect Recognition in Hand-Object Interaction Using Object-Sensed Tactile and Kinematic Data. IEEE Transactions on Haptics 16(1), 112-117. DOI: 10.1109/TOH.2022.3230643. Online publication date: 1-Jan-2023.
      • (2023) Modeling Multiple Temporal Scales of Full-Body Movements for Emotion Classification. IEEE Transactions on Affective Computing 14(2), 1070-1081. DOI: 10.1109/TAFFC.2021.3095425. Online publication date: 1-Apr-2023.
      • (2022) Movement Analysis and Decomposition with the Continuous Wavelet Transform. Proceedings of the 8th International Conference on Movement and Computing, 1-13. DOI: 10.1145/3537972.3537998. Online publication date: 22-Jun-2022.
      • (2022) SpineCurer: An inertial measurement unit based scoliosis training system. Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility, 1-4. DOI: 10.1145/3517428.3550393. Online publication date: 23-Oct-2022.
      • (2022) CO/DA: Live-Coding Movement-Sound Interactions for Dance Improvisation. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3491102.3501916. Online publication date: 29-Apr-2022.
      • (2022) Graph Laplacian-Improved Convolutional Residual Autoencoder for Unsupervised Human Action and Emotion Recognition. IEEE Access 10, 131128-131143. DOI: 10.1109/ACCESS.2022.3229478. Online publication date: 2022.
      • (2021) The Body Beyond Movement: (Missed) Opportunities to Engage with Contemporary Dance in HCI. Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction, 1-9. DOI: 10.1145/3430524.3440624. Online publication date: 14-Feb-2021.
