DOI: 10.1145/3458709.3458980

Independent Control of Supernumerary Appendages Exploiting Upper Limb Redundancy

Published: 11 July 2021

Abstract

In the field of physical augmentation, researchers have attempted to extend human capabilities by expanding the number of human appendages. To fully realize the potential of having an additional appendage, supernumerary appendages should be independently controllable without interfering with the functionality of existing appendages. Herein, we propose a novel approach for controlling supernumerary appendages by exploiting upper limb redundancy. We present a headphone-style visual sensing device and a recognition system to estimate shoulder movement. Through a set of user experiments, we evaluate the feasibility of our system and reveal the potential of independent control using upper limb redundancy. Our results indicate that participants are able to intentionally give commands through their shoulder motions. Finally, we demonstrate the wide range of supernumerary appendage control applications that our novel approach enables and discuss future prospects for our work.
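
The abstract describes a head-worn ("headphone-style") camera paired with a recognition system that maps shoulder movement to commands for a supernumerary appendage. As a rough, hypothetical illustration of how such a command pipeline could be wired together (not the authors' implementation, which the abstract does not detail), the Python sketch below classifies camera frames into a handful of assumed shoulder-motion commands using an off-the-shelf CNN. The command labels, camera index, and ResNet-18 backbone are all illustrative assumptions.

# Hypothetical sketch (not the authors' implementation): classify discrete
# shoulder-motion commands from a head-mounted camera and forward them to a
# supernumerary-appendage controller. Class names, camera index, and the
# ResNet-18 backbone are illustrative assumptions.
import cv2
import torch
import torch.nn as nn
from torchvision import models

# Assumed set of discrete shoulder-motion commands.
SHOULDER_COMMANDS = ["neutral", "raise_left", "raise_right", "roll_forward"]

def build_classifier(num_classes: int) -> nn.Module:
    """Small image classifier; the paper's actual recognizer may differ."""
    net = models.resnet18()  # backbone chosen only for illustration
    net.fc = nn.Linear(net.fc.in_features, num_classes)
    return net

def preprocess(frame_bgr) -> torch.Tensor:
    """Resize a BGR camera frame and convert it to a normalized CHW tensor."""
    rgb = cv2.cvtColor(cv2.resize(frame_bgr, (224, 224)), cv2.COLOR_BGR2RGB)
    x = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    mean = torch.tensor([0.485, 0.456, 0.406]).view(3, 1, 1)
    std = torch.tensor([0.229, 0.224, 0.225]).view(3, 1, 1)
    return ((x - mean) / std).unsqueeze(0)  # add batch dimension

def main() -> None:
    model = build_classifier(len(SHOULDER_COMMANDS))
    model.eval()  # weights would come from training on recorded shoulder motions
    cap = cv2.VideoCapture(0)  # shoulder-facing camera (index assumed)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            with torch.no_grad():
                logits = model(preprocess(frame))
            command = SHOULDER_COMMANDS[int(logits.argmax(dim=1))]
            print("command:", command)  # in a real system: send to the appendage controller
    finally:
        cap.release()

if __name__ == "__main__":
    main()

In a working system the classifier would first be trained on recorded shoulder-motion data, and the predicted command would drive the supernumerary appendage rather than being printed.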

    Information

    Published In

    cover image ACM Other conferences
    AHs '21: Proceedings of the Augmented Humans International Conference 2021
    February 2021
    321 pages

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Human Body Redundancy
    2. Independent Control
    3. Supernumerary Appendages
    4. Supernumerary Robotic Limbs
    5. Wearable Sensing

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    AHs '21: Augmented Humans International Conference 2021
    February 22 - 24, 2021
    Rovaniemi, Finland

    Article Metrics

    • Downloads (last 12 months): 56
    • Downloads (last 6 weeks): 5
    Reflects downloads up to 13 Feb 2025

    Cited By
    • (2025) Shared Control of Supernumerary Robotic Limbs Using Mixed Reality and Mouth-and-Tongue Interfaces. Biosensors 15(2), 70. https://doi.org/10.3390/bios15020070. Online publication date: 23-Jan-2025.
    • (2024) SplitBody: Reducing Mental Workload while Multitasking via Muscle Stimulation. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-11. https://doi.org/10.1145/3613904.3642629. Online publication date: 11-May-2024.
    • (2023) MI-Poser. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1-24. https://doi.org/10.1145/3610891. Online publication date: 27-Sep-2023.
    • (2021) Introduction to JIZAI BODY and Our Future Visions (自在化身体とその展望). Journal of the Robotics Society of Japan 39(8), 685-692. https://doi.org/10.7210/jrsj.39.685. Online publication date: 2021.
