Research Article
DOI: 10.1145/2702123.2702374

As Light as your Footsteps: Altering Walking Sounds to Change Perceived Body Weight, Emotional State and Gait

Published: 18 April 2015

Abstract

An ever more sedentary lifestyle is a serious problem in our society. Enhancing people's exercise adherence through technology remains an important research challenge. We propose a novel approach to a system for supporting walking that draws on basic findings from neuroscience research. Our shoe-based prototype senses a person's footsteps and alters, in real time, the frequency spectrum of the sound they produce while walking. The resulting sounds are consistent with those produced by either a lighter or a heavier body. Our user study showed that modified walking sounds change one's own perceived body weight and lead to a related gait pattern. In particular, augmenting the high frequencies of the sound leads to the perception of having a thinner body and enhances motivation for physical activity, inducing a more dynamic swing and a shorter heel strike. We discuss here the opportunities and questions that our findings open.
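To make the abstract's description concrete, the sketch below shows one plausible way to shift a footstep sound's spectral balance toward higher frequencies with a high-shelf biquad filter. This is an illustrative assumption, not the authors' implementation: the 1 kHz corner frequency, the +12 dB gain, the offline NumPy/SciPy processing, and the function names (high_shelf_coeffs, lighten_footstep) are hypothetical stand-ins for the real-time DSP chain the paper describes.

```python
# Illustrative sketch only (not the paper's implementation): emphasise the
# high-frequency content of a footstep sound with an RBJ audio-EQ-cookbook
# high-shelf biquad. Corner frequency and gain below are assumed values.
import numpy as np
from scipy.signal import lfilter

def high_shelf_coeffs(fs, f0=1000.0, gain_db=12.0):
    """High-shelf biquad coefficients (RBJ cookbook, shelf slope S = 1)."""
    A = 10.0 ** (gain_db / 40.0)
    w0 = 2.0 * np.pi * f0 / fs
    cosw, sinw = np.cos(w0), np.sin(w0)
    alpha = sinw / np.sqrt(2.0)          # sin(w0)/2 * sqrt((A+1/A)(1/S-1)+2) with S = 1
    sqrtA = np.sqrt(A)
    b = np.array([
        A * ((A + 1) + (A - 1) * cosw + 2 * sqrtA * alpha),
        -2 * A * ((A - 1) + (A + 1) * cosw),
        A * ((A + 1) + (A - 1) * cosw - 2 * sqrtA * alpha),
    ])
    a = np.array([
        (A + 1) - (A - 1) * cosw + 2 * sqrtA * alpha,
        2 * ((A - 1) - (A + 1) * cosw),
        (A + 1) - (A - 1) * cosw - 2 * sqrtA * alpha,
    ])
    return b / a[0], a / a[0]

def lighten_footstep(step_audio, fs=44100):
    """Return the footstep sound with boosted high frequencies (a 'lighter'-sounding step)."""
    b, a = high_shelf_coeffs(fs)
    return lfilter(b, a, step_audio)

if __name__ == "__main__":
    fs = 44100
    # Stand-in for a recorded footstep: a short, windowed burst of noise.
    n = int(0.2 * fs)
    step = np.random.randn(n) * np.hanning(n)
    lightened = lighten_footstep(step, fs)
    print(step.shape, lightened.shape)
```

A complementary "heavier" condition could, under the same assumptions, use a negative shelf gain or a low-frequency boost instead; the paper's actual filter parameters are not reproduced here.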

    Published In

    CHI '15: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems
    April 2015
    4290 pages
    ISBN:9781450331456
    DOI:10.1145/2702123
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 18 April 2015

    Author Tags

    1. auditory body perception
    2. emotion
    3. evaluation method
    4. interaction styles
    5. multimodal interfaces
    6. sonification

    Qualifiers

    • Research-article

    Conference

    CHI '15: CHI Conference on Human Factors in Computing Systems
    April 18-23, 2015
    Seoul, Republic of Korea

    Acceptance Rates

    CHI '15 paper acceptance rate: 486 of 2,120 submissions (23%)
    Overall acceptance rate: 6,199 of 26,314 submissions (24%)

    Article Metrics

    • Downloads (last 12 months): 214
    • Downloads (last 6 weeks): 14
    Reflects downloads up to 20 Jan 2025.

    Cited By

    • (2024) Exploring the Alteration and Masking of Everyday Noise Sounds using Auditory Augmented Reality. Proceedings of the 26th International Conference on Multimodal Interaction, 154-163. DOI: 10.1145/3678957.3685750. Online publication date: 4-Nov-2024.
    • (2024) Articulating body experiences in reaction to movement sonifications: A workshop strategy for early research inquiries. Proceedings of the 19th International Audio Mostly Conference: Explorations in Sonic Cultures, 487-491. DOI: 10.1145/3678299.3678349. Online publication date: 18-Sep-2024.
    • (2024) Pushed by Sound: Effects of Sound and Movement Direction on Body Perception, Experience Quality, and Exercise Support. ACM Transactions on Computer-Human Interaction 31(4), 1-36. DOI: 10.1145/3648616. Online publication date: 19-Sep-2024.
    • (2024) Hicclip: Sonification of Augmented Eating Sounds to Intervene Snacking Behaviors. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1372-1384. DOI: 10.1145/3643834.3661532. Online publication date: 1-Jul-2024.
    • (2024) Body Sensations as Design Material: An Approach to Design Sensory Technology for Altering Body Perception. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 2545-2561. DOI: 10.1145/3643834.3660701. Online publication date: 1-Jul-2024.
    • (2024) Co-Designing Sensory Feedback for Wearables to Support Physical Activity through Body Sensations. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(1), 1-31. DOI: 10.1145/3643499. Online publication date: 6-Mar-2024.
    • (2024) Body Transformation: An Experiential Quality of Sensory Feedback Wearables for Altering Body Perception. Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, 1-19. DOI: 10.1145/3623509.3633373. Online publication date: 11-Feb-2024.
    • (2024) SoniWeight Shoes: Investigating Effects and Personalization of a Wearable Sound Device for Altering Body Perception and Behavior. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3613904.3642651. Online publication date: 11-May-2024.
    • (2024) Investigating Effect of Altered Auditory Feedback on Self-Representation, Subjective Operator Experience, and Task Performance in Teleoperation of a Social Robot. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642561. Online publication date: 11-May-2024.
    • (2024) Motionless Movement: Towards Vibrotactile Kinesthetic Displays. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3613904.3642499. Online publication date: 11-May-2024.