
A Virtual Reality Platform for Musical Creation: GENESIS-RT

  • Conference paper
  • Published in: Sound, Music, and Motion (CMMR 2013)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 8905)


Abstract

We present GENESIS-RT, a Virtual Reality platform conceived for musical creation. It allows the user to (1) interactively create physically-based musical instruments and sounding objects, and (2) play them in real time in a multisensory fashion, by way of haptics, 3D visualisation, and sound. The design of this platform aims for full physical coupling, or instrumental interaction, between the musician and the simulated instrument. In doing so, it differs from both traditional Digital Musical Instrument architectures and Virtual Reality system architectures. In presenting our environment, we discuss several underlying scientific questions: (1) possible ways to manage simultaneous audio-haptic-visual cooperation during real-time multisensory simulations; (2) the Computer-Aided Design functionalities for creating new physically-based musical instruments and sounding objects; and (3) the synchronous real-time features, in terms of software and hardware architecture. Finally, the article reviews a series of exemplary models and instrumental situations using the proposed platform.
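To give a concrete idea of the "physically-based" instruments mentioned above: GENESIS-family environments are built on mass-interaction physical modelling, in which an instrument is a network of point masses connected by visco-elastic links, stepped synchronously at audio rate, with the haptic device force injected at one of the masses. The following is a minimal, illustrative sketch of that general principle only, not the GENESIS-RT implementation; the integration scheme, constants, and names (`Mass`, `link_force`, `step_mass`, the sample rate, the "bridge" mass standing in for the haptic coupling point) are all assumptions made for this example.

```c
/*
 * Minimal sketch (NOT the GENESIS-RT code): a two-mass, one-link
 * mass-interaction model stepped at audio rate. All names and
 * constants are illustrative assumptions.
 */
#include <stdio.h>

#define N_STEPS 44100   /* one second at an assumed 44.1 kHz simulation rate */

typedef struct { double x, x_prev, inv_mass; } Mass;

/* Visco-elastic interaction: spring + damper force between two masses.
   The time step is folded into the stiffness/damping/mass parameters. */
static double link_force(const Mass *a, const Mass *b, double k, double z) {
    double dx = a->x - b->x;
    double dv = (a->x - a->x_prev) - (b->x - b->x_prev);
    return -k * dx - z * dv;   /* force on 'a'; 'b' receives the opposite */
}

/* One explicit position-based integration step (Stoermer-Verlet style). */
static void step_mass(Mass *m, double force) {
    double x_next = 2.0 * m->x - m->x_prev + force * m->inv_mass;
    m->x_prev = m->x;
    m->x = x_next;
}

int main(void) {
    /* A light "string-like" mass given an initial displacement, and a heavy
       mass standing in for the point where a haptic force would be injected. */
    Mass string = { .x = 0.001, .x_prev = 0.0, .inv_mass = 1.0  };
    Mass bridge = { .x = 0.0,   .x_prev = 0.0, .inv_mass = 0.01 };

    for (int n = 0; n < N_STEPS; ++n) {
        double f = link_force(&string, &bridge, 0.05, 0.0005);
        double haptic_force = 0.0;   /* real-time device input would enter here */
        step_mass(&string,  f);
        step_mass(&bridge, -f + haptic_force);
        double sample = string.x;    /* listening point: one audio sample per step */
        if (n < 5) printf("%d %f\n", n, sample);
    }
    return 0;
}
```

In a real-time setting such as the one the paper describes, the same loop body would run once per audio sample, with the haptic device supplying the input force and reading back the reaction force at the coupling point; here the output is simply printed for the first few steps.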



Acknowledgments

This research has been supported by: the French Ministry of Culture; the French Agence Nationale de la Recherche through the cooperative project DYNAMé - ANR-2009-CORD-007; the Grenoble Institute of Technology; and the Doctoral Studies College of Grenoble-Alpes University.

Author information

Corresponding author

Correspondence to James Leonard.



Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Leonard, J., Cadoz, C., Castagne, N., Florens, J.L., Luciani, A. (2014). A Virtual Reality Platform for Musical Creation: GENESIS-RT. In: Aramaki, M., Derrien, O., Kronland-Martinet, R., Ystad, S. (eds) Sound, Music, and Motion. CMMR 2013. Lecture Notes in Computer Science, vol 8905. Springer, Cham. https://doi.org/10.1007/978-3-319-12976-1_22

  • DOI: https://doi.org/10.1007/978-3-319-12976-1_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12975-4

  • Online ISBN: 978-3-319-12976-1

  • eBook Packages: Computer Science, Computer Science (R0)
