DOI: 10.1145/3401956.3404236 · Research article · MOCO '20 Conference Proceedings

Physical Time: A Model for Generating Rhythmic Gestures Based on Time Metaphors

Published: 15 July 2020

ABSTRACT

Possibilities for cross-disciplinary interactive performance continue to grow as new tools are developed and adapted. Yet the qualitative aspects of cross-disciplinary interaction have not advanced at the same rate. We suggest that new models for understanding gesture in different media will support the development of nuanced interaction for interactive performance. We have explored this premise by considering models for generating musical rhythmic gestures that enable implicit interaction between the gestures of a dancer and the generated music. We present a model that understands rhythms as dynamic gestures flowing in, around, or out of goal points. Goal points can be layered and quantized to a meter, providing the rhythmic structure expected in music, while the figurations around them allow the generated rhythms to flow with the performer, responding to the more qualitative aspects of the performance. We have built a simple implementation of this model to test its conceptual and technical viability. We discuss both the model and our implementation, suggesting that the model, even with a simple implementation, affords a unique ability to reflect the dynamic flow of gestures in movement paradigms while still providing the sense of structured time characteristic of a musical paradigm.
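The abstract describes the model only at a conceptual level, but the idea of goal points quantized to a meter, with generated events flowing in, around, or out of them in response to movement, can be sketched compactly. The Python fragment below is a minimal, hypothetical illustration of one way such a goal-point model could be realized; the GoalPoint structure, the flow parameter, and the drift rule are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of a goal-point rhythm generator (not the paper's code).
    # Goal points sit on a metric grid; a movement-derived "flow" value decides
    # how tightly generated onsets cling to those goal points.
    import random
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class GoalPoint:
        beat: float    # position within the bar, quantized to the meter
        weight: float  # metric salience (downbeats weigh more than offbeats)

    def lay_out_goals(meter: int = 4, subdivisions: int = 2) -> List[GoalPoint]:
        """Quantize goal points to a metric grid, e.g. eighth notes in 4/4."""
        return [GoalPoint(beat=i / subdivisions,
                          weight=1.0 if i % subdivisions == 0 else 0.5)
                for i in range(meter * subdivisions)]

    def generate_onsets(goals: List[GoalPoint], flow: float,
                        density: float = 1.0) -> List[float]:
        """Turn goal points into onset times for one bar.
        flow (0..1) stands in for a quality derived from the dancer's movement:
        high flow keeps onsets on the goal points (structured, metric feel),
        low flow lets them drift around the goal points (looser, gestural feel).
        density thins out the weaker goal points.
        """
        onsets = []
        for g in goals:
            if g.weight * density < 0.5:   # skip weak goal points at low density
                continue
            drift = (1.0 - flow) * random.uniform(-0.25, 0.25)
            onsets.append(round(g.beat + drift, 3))
        return sorted(onsets)

    # One bar of 4/4 with eighth-note goal points and a fairly structured gesture.
    print(generate_onsets(lay_out_goals(meter=4, subdivisions=2), flow=0.8))

In this sketch, layering goal points with different weights gives the expected metric hierarchy, while continuously varying flow from movement input is what lets the generated rhythm follow the performer rather than a fixed grid.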


Published in

MOCO '20: Proceedings of the 7th International Conference on Movement and Computing
July 2020, 205 pages
ISBN: 9781450375054
DOI: 10.1145/3401956

Copyright © 2020 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


Qualifiers

• Research article
• Refereed limited

Acceptance Rates

Overall acceptance rate: 50 of 110 submissions, 45%
