DOI: 10.1145/3206505.3206511

Effect of temporality, physical activity and cognitive load on spatiotemporal vibrotactile pattern recognition

Published: 29 May 2018

Abstract

Previous research has demonstrated that users can accurately recognize Spatiotemporal Vibrotactile Patterns (SVPs): sequences of vibrations delivered by different motors either sequentially or simultaneously. However, these experiments were run only in lab settings, and users' ability to recognize SVPs in real-world environments remains unclear. In this paper, we investigate how several factors may affect recognition: (1) physical activity (running), (2) a cognitive task (i.e., a primary task, typing), (3) the distribution of the vibration motors across body parts, and (4) the temporality of the patterns. Our results suggest that physical activity has very little impact, especially compared to the cognitive task, the location of the vibrations, or temporality. We discuss these results and propose a set of guidelines for the design of SVPs.
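The abstract's definition of an SVP — vibrations on different motors occurring either sequentially or simultaneously — can be illustrated with a small data-structure sketch. This is a hypothetical illustration, not code from the paper: the `Frame` type, the `play` function, and the callback-based motor driver are all assumptions made for the example.

```python
import time
from dataclasses import dataclass

@dataclass(frozen=True)
class Frame:
    motors: frozenset   # motor IDs vibrating during this frame (empty = pause)
    duration_s: float   # how long the frame lasts

# A spatiotemporal vibrotactile pattern (SVP): motors listed in the same
# frame vibrate simultaneously; the frames themselves play sequentially.
svp = [
    Frame(frozenset({0}), 0.2),      # motor 0 alone
    Frame(frozenset({1, 2}), 0.2),   # motors 1 and 2 together
    Frame(frozenset(), 0.1),         # pause with no vibration
    Frame(frozenset({3}), 0.2),      # motor 3 alone
]

def play(pattern, activate, deactivate):
    """Drive a pattern through caller-supplied motor callbacks."""
    for frame in pattern:
        for m in frame.motors:
            activate(m)          # start all motors of the frame together
        time.sleep(frame.duration_s)
        for m in frame.motors:
            deactivate(m)        # stop them before the next frame

# Stub driver that records events, so the sketch runs without hardware.
events = []
play(svp,
     activate=lambda m: events.append(("on", m)),
     deactivate=lambda m: events.append(("off", m)))
print(len(events))  # 8 on/off events: 2 + 4 + 0 + 2
```

In a real wearable, `activate`/`deactivate` would map to the vibration-motor driver; separating the pattern data from the playback loop makes the same SVP reusable across different motor placements, which is one of the factors the paper varies.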



    Published In

    AVI '18: Proceedings of the 2018 International Conference on Advanced Visual Interfaces
    May 2018
    430 pages
    ISBN:9781450356169
    DOI:10.1145/3206505

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. cognitive load
    2. physical activity
    3. spatiotemporal vibrotactile pattern
    4. tactile feedback
    5. wearable computing

    Qualifiers

    • Research-article

    Conference

    AVI '18: 2018 International Conference on Advanced Visual Interfaces
    May 29 - June 1, 2018
    Grosseto, Castiglione della Pescaia, Italy

    Acceptance Rates

    AVI '18 Paper Acceptance Rate: 19 of 77 submissions (25%)
    Overall Acceptance Rate: 128 of 490 submissions (26%)

    Article Metrics

    • Downloads (Last 12 months)24
    • Downloads (Last 6 weeks)2
    Reflects downloads up to 17 Jan 2025

    Cited By

    • (2024) Enhancing Touch Circular Knob with Haptic Feedback when Performing Another Saturating Attention Primary Task. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, 1-9. https://doi.org/10.1145/3656650.3656656. Online publication date: 3-Jun-2024.
    • (2023) The Effect of Attention Saturating Task on Eyes-Free Gesture Production on Mobile Devices. Companion Proceedings of the 2023 Conference on Interactive Surfaces and Spaces, 27-31. https://doi.org/10.1145/3626485.3626535. Online publication date: 5-Nov-2023.
    • (2023) Exploring Recognition Accuracy of Vibrotactile Stimuli in Sternoclavicular Area. Proceedings of the 2023 ACM International Symposium on Wearable Computers, 98-103. https://doi.org/10.1145/3594738.3611372. Online publication date: 8-Oct-2023.
    • (2023) HaptiCollar: Investigating Tactile Acuity Towards Vibrotactile Stimuli on the Neck. Proceedings of the Seventeenth International Conference on Tangible, Embedded, and Embodied Interaction, 1-7. https://doi.org/10.1145/3569009.3573121. Online publication date: 26-Feb-2023.
    • (2023) Hap2Gest: An Eyes-Free Interaction Concept with Smartphones Using Gestures and Haptic Feedback. Human-Computer Interaction – INTERACT 2023, 479-500. https://doi.org/10.1007/978-3-031-42280-5_31. Online publication date: 25-Aug-2023.
    • (2023) Effects of Moving Speed and Phone Location on Eyes-Free Gesture Input with Mobile Devices. Human-Computer Interaction – INTERACT 2023, 469-478. https://doi.org/10.1007/978-3-031-42280-5_30. Online publication date: 25-Aug-2023.
    • (2022) Vibrotactile Spatiotemporal Pattern Recognition in Two-Dimensional Space Around Hand. IEEE Transactions on Haptics 15(4), 718-728. https://doi.org/10.1109/TOH.2022.3213313. Online publication date: 1-Oct-2022.
    • (2021) Multi-channel Tactile Feedback Based on User Finger Speed. Proceedings of the ACM on Human-Computer Interaction 5(ISS), 1-17. https://doi.org/10.1145/3488549. Online publication date: 5-Nov-2021.
    • (2021) Heterogeneous Stroke: Using Unique Vibration Cues to Improve the Wrist-Worn Spatiotemporal Tactile Display. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-12. https://doi.org/10.1145/3411764.3445448. Online publication date: 6-May-2021.
    • (2021) Effect of Attention Saturating and Cognitive Load on Tactile Texture Recognition for Mobile Surface. Human-Computer Interaction – INTERACT 2021, 557-579. https://doi.org/10.1007/978-3-030-85610-6_31. Online publication date: 26-Aug-2021.
