ABSTRACT
Possibilities for cross-disciplinary interactive performance continue to grow as new tools are developed and adapted. Yet the qualitative aspects of cross-disciplinary interaction have not advanced at the same rate. We suggest that new models for understanding gesture in different media will support the development of nuanced interaction for interactive performance. We have explored this premise by considering models for generating musical rhythmic gestures that enable implicit interaction between the gestures of a dancer and the generated music. We propose a model that treats rhythms as dynamic gestures that flow in, around, or out of goal points. Goal points can be layered and quantized to a meter, providing the rhythmic structure expected in music, while the figurations enable the generated rhythms to flow with the performer, responding to the more qualitative aspects of the performance. We have made a simple implementation of this model to test its conceptual and technical viability. We discuss both the model and our implementation, suggesting that the model, even with a simple implementation, affords a unique ability to reflect the dynamic flow of gestures in movement paradigms while still providing the sense of structured time characteristic of a musical paradigm.
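The core idea of goal points quantized to a meter, with a variable degree of pull so that rhythms can flow into, around, or away from the beat, can be sketched in a few lines. The following is our own minimal illustration under stated assumptions, not the authors' implementation: the names `goal_points`, `quantize`, and the `pull` parameter are hypothetical, and real gesture onsets would come from motion tracking rather than a hard-coded list.

```python
# Minimal sketch (an assumption, not the paper's implementation) of metric
# "goal points": gesture-derived onset times are pulled toward the nearest
# grid point by a variable strength, so generated rhythms can either snap
# to strict musical time or retain the flow of the performer's movement.

def goal_points(tempo_bpm: float, beats_per_bar: int, bars: int) -> list[float]:
    """Metric goal points (in seconds) for a fixed tempo and meter."""
    beat = 60.0 / tempo_bpm
    return [i * beat for i in range(beats_per_bar * bars)]

def quantize(onset: float, goals: list[float], pull: float) -> float:
    """Move a gesture onset toward its nearest goal point.

    pull = 1.0 snaps hard to the metric grid (strict musical time);
    pull = 0.0 leaves the gesture's own timing untouched;
    values in between let the rhythm flow around the goal points.
    """
    nearest = min(goals, key=lambda g: abs(g - onset))
    return onset + pull * (nearest - onset)

if __name__ == "__main__":
    goals = goal_points(tempo_bpm=120, beats_per_bar=4, bars=2)  # grid every 0.5 s
    gesture_onsets = [0.12, 0.61, 1.48, 2.75]  # e.g. times derived from movement
    for p in (0.0, 0.5, 1.0):
        print(p, [round(quantize(t, goals, p), 3) for t in gesture_onsets])
```

Layering several grids (e.g. beats and subdivisions) and varying `pull` over time would be one way to realize the dynamic flow between movement and musical paradigms that the abstract describes.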
Physical Time: A Model for Generating Rhythmic Gestures Based on Time Metaphors