DOI: 10.1145/3173574.3174000

Jetto: Using Lateral Force Feedback for Smartwatch Interactions

Published: 21 April 2018

Abstract

Interacting with media and games is a challenging user experience on smartwatches due to their small screens. We propose using lateral force feedback to enhance these experiences. When virtual objects on the smartwatch display visually collide with or push against the edge of the screen, we add haptic feedback so that the user also feels the impact. This creates the illusion of a virtual object physically hitting or pushing the smartwatch from within the device itself. Using this approach, we extend virtual space and scenes into a 2D physical space. To create realistic lateral force feedback, we first examined the minimum change in force magnitude that users can detect under different force directions and weight levels, finding an average JND of 49% across all tested conditions, with no significant effect of weight or force direction. We then developed a proof-of-concept hardware prototype called Jetto and demonstrated its unique capabilities through a set of impact-enhanced videos and games. Our preliminary user evaluations indicated that the concept was welcomed and regarded as a worthwhile addition to smartwatch output and media experiences.
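As a rough illustration of what the reported ~49% JND implies for force rendering, the sketch below spaces successive force levels one JND apart and snaps a virtual collision impulse to the nearest perceptually distinct level. This is a hypothetical example, not the authors' implementation; the 0.1-1.0 N actuator range and the impulse-to-force mapping are assumptions made only for illustration.

# Hypothetical sketch: force levels spaced one JND apart, using the paper's
# reported average JND of ~49% for lateral force magnitude.

def discriminable_levels(f_min, f_max, jnd=0.49):
    """Return force magnitudes (newtons) from f_min up to f_max, each one JND
    above the previous, so adjacent levels are just distinguishable."""
    levels = []
    f = f_min
    while f <= f_max:
        levels.append(round(f, 3))
        f *= 1.0 + jnd
    return levels


def impulse_to_force(impulse, max_impulse, levels):
    """Map a virtual collision impulse onto the nearest discriminable force level."""
    target = (impulse / max_impulse) * levels[-1]
    return min(levels, key=lambda lv: abs(lv - target))


if __name__ == "__main__":
    # Assumed actuator range of 0.1-1.0 N, chosen purely for illustration.
    levels = discriminable_levels(0.1, 1.0)
    print(levels)                              # [0.1, 0.149, 0.222, 0.331, 0.493, 0.734]
    print(impulse_to_force(0.5, 1.0, levels))  # 0.331 (nearest distinguishable level)

Under these assumptions, the 49% JND leaves only about six distinguishable force magnitudes across a tenfold actuator range, which is why collision impulses would be quantized rather than rendered continuously.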

Supplementary Material

Supplemental video (pn3573-file3.mp4)
Supplemental video (pn3573-file5.mp4)
MP4 File (pn3573.mp4)




Information

    Published In

    CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
    April 2018
    8489 pages
    ISBN:9781450356206
    DOI:10.1145/3173574

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 21 April 2018


    Author Tags

    1. haptic display
    2. smartwatch
    3. wearable haptics

    Qualifiers

    • Research-article

    Conference

    CHI '18

    Acceptance Rates

CHI '18 Paper Acceptance Rate: 666 of 2,590 submissions, 26%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


    Bibliometrics & Citations

    Article Metrics

• Downloads (last 12 months): 71
• Downloads (last 6 weeks): 4
Reflects downloads up to 05 Mar 2025

    Cited By

• (2024) Shock Me The Way: Directional Electrotactile Feedback under the Smartwatch as a Navigation Aid for Cyclists. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1-25. https://doi.org/10.1145/3676521. Online publication date: 24-Sep-2024.
• (2024) Expressive, Scalable, Mid-air Haptics with Synthetic Jets. ACM Transactions on Computer-Human Interaction 31(2), 1-28. https://doi.org/10.1145/3635150. Online publication date: 29-Jan-2024.
• (2024) AirPush: A Pneumatic Wearable Haptic Device Providing Multi-Dimensional Force Feedback on a Fingertip. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-13. https://doi.org/10.1145/3613904.3642536. Online publication date: 11-May-2024.
• (2023) transPAF: Rendering Omnidirectional Impact Feedback with Dynamic Point of Application of Force All Round a Controller. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. https://doi.org/10.1145/3544548.3581092. Online publication date: 19-Apr-2023.
• (2022) HaptiDrag. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(3), 1-26. https://doi.org/10.1145/3550310. Online publication date: 7-Sep-2022.
• (2022) Outpace Reality: A Novel Augmented-Walking Technique for Virtual Reality Games. Proceedings of the ACM on Human-Computer Interaction 6(CHI PLAY), 1-24. https://doi.org/10.1145/3549509. Online publication date: 31-Oct-2022.
• (2022) Feeling Good and In Control: In-game Tools to Support Targets of Toxicity. Proceedings of the ACM on Human-Computer Interaction 6(CHI PLAY), 1-27. https://doi.org/10.1145/3549498. Online publication date: 31-Oct-2022.
• (2022) Exploring the Player Experiences of Wearable Gaming Interfaces: A User Elicitation Study. Proceedings of the ACM on Human-Computer Interaction 6(CHI PLAY), 1-26. https://doi.org/10.1145/3549497. Online publication date: 31-Oct-2022.
• (2022) Don't Break my Flow: Effects of Switching Latency in Shooting Video Games. Proceedings of the ACM on Human-Computer Interaction 6(CHI PLAY), 1-20. https://doi.org/10.1145/3549492. Online publication date: 31-Oct-2022.
• (2022) Do Hexad User Types Matter? Effects of (Non-) Personalized Gamification on Task Performance and User Experience in an Image Tagging Task. Proceedings of the ACM on Human-Computer Interaction 6(CHI PLAY), 1-27. https://doi.org/10.1145/3549491. Online publication date: 31-Oct-2022.
