DOI: 10.1145/2254556.2254616

Embodied cooperation using mobile devices: presenting and evaluating the Sync4All application

Published: 21 May 2012

Abstract

Embodied cooperation "arises when two co-present individuals in motion coordinate their goal-directed actions". Adopting the embodied cooperation paradigm in the development of embodied and social multimedia systems opens new perspectives for future User Centric Media. Systems for embodied music listening, which enable users to influence music in real time through movement and gesture, can benefit greatly from this paradigm. This paper presents the design and evaluation of Sync4All, an application based on embodied cooperation that lets users experience social embodied music listening: each user rhythmically and freely moves a mobile phone, trying to synchronise her movements with those of the other users, and the level of synchronisation achieved influences the music experience. The evaluation of Sync4All aimed to determine the users' overall attitude towards the application and how participants perceived embodied cooperation and music embodiment.
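The abstract does not spell out how movement synchronisation is measured or how it drives the music, so the sketch below is only illustrative, assuming each phone streams an accelerometer-magnitude signal sampled at a fixed rate: pairwise synchronisation is estimated with a peak normalised cross-correlation, and the group average is mapped onto a single 0..1 music-control parameter. The function names (sync_score, group_sync, music_level) and the mapping are hypothetical, not taken from Sync4All.

```python
# Minimal sketch, NOT the authors' algorithm: estimates how well users'
# phone movements are synchronised, assuming each user contributes an
# accelerometer-magnitude window sampled at the same fixed rate.
import numpy as np


def sync_score(a: np.ndarray, b: np.ndarray) -> float:
    """Peak normalised cross-correlation of two equal-length, zero-mean
    movement windows; values near 1.0 mean closely synchronised motion."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))
    if denom == 0.0:  # one user is not moving at all
        return 0.0
    return float(np.max(np.correlate(a, b, mode="full")) / denom)


def group_sync(windows: list[np.ndarray]) -> float:
    """Mean pairwise synchronisation over all users in the session."""
    scores = [sync_score(w1, w2)
              for i, w1 in enumerate(windows)
              for w2 in windows[i + 1:]]
    return float(np.mean(scores)) if scores else 0.0


def music_level(windows: list[np.ndarray]) -> float:
    """Hypothetical mapping of group synchronisation onto a 0..1 music
    parameter (e.g. the prominence of a shared musical layer)."""
    return float(np.clip(group_sync(windows), 0.0, 1.0))
```

In a live setting this would run over short sliding windows, with the score smoothed before it drives the audio engine, so that momentary lapses in coordination do not cause abrupt changes in the music.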


Cited By

  • (2023) Mediating Interpersonal Synchronization in Children through a Full-Body Mixed Reality System: Analysis of the Pre-Interactive Mandala Experience. PRESENCE: Virtual and Augmented Reality, 32, 35-51. DOI: 10.1162/pres_a_00386. Online publication date: 1 Dec 2023.
  • (2021) Designing for interpersonal motor synchronization. Human–Computer Interaction, 1-48. DOI: 10.1080/07370024.2021.1912608. Online publication date: 6 Jun 2021.
  • (2015) The BeatHealth Project. Journal of Cases on Information Technology, 17(4), 29-52. DOI: 10.4018/JCIT.2015100103. Online publication date: Oct 2015.


    Published In

    AVI '12: Proceedings of the International Working Conference on Advanced Visual Interfaces
    May 2012
    846 pages
    ISBN:9781450312875
    DOI:10.1145/2254556

    Sponsors

    • Consulta Umbria SRL
    • University of Salerno

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. embodied cooperation
    2. mobile music applications
    3. nonverbal social behaviour

    Qualifiers

    • Research-article

    Conference

    AVI'12

    Acceptance Rates

    Overall Acceptance Rate 128 of 490 submissions, 26%

