ABSTRACT
We propose a new way of eyes-free interaction for wearables. It is based on the user's proprioceptive sense, i.e., rather than seeing, hearing, or feeling an outside stimulus, users feel the pose of their own body. We have implemented a wearable device called Pose-IO that offers input and output based on proprioception. Users communicate with Pose-IO through the pose of their wrists. Users enter information by performing an input gesture, flexing their wrist, which the device senses using a 3-axis accelerometer. Users receive output from Pose-IO by finding their wrist posed in an output gesture, which Pose-IO actuates using electrical muscle stimulation. This mechanism allows users to interact with Pose-IO without visual or auditory senses, through the proprioceptive sense alone. We developed three simple applications that demonstrate symmetric proprioceptive interaction, where input and output occur through the same limb, as well as asymmetric interaction, where input and output occur through different limbs. In a first user study, participants using a symmetric proprioceptive interface re-entered poses received from Pose-IO with an average accuracy of 5.8° despite the minimal bandwidth offered by the device. In a second, exploratory study, we investigated participants' emotional response to asymmetric proprioceptive interaction and the concept of the user's body serving as the interface. Participants reported enjoying the experience (4.6 out of 5).
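The abstract states that wrist flexion is sensed with a 3-axis accelerometer. One common way to recover a flexion angle from such a sensor is to treat gravity as the reference vector and compute the tilt of the sensing axis. The sketch below illustrates this idea; the axis naming and sensor mounting are hypothetical, since the paper only states that a 3-axis accelerometer is used.

```python
import math

def wrist_flexion_deg(ax: float, ay: float, az: float) -> float:
    """Estimate a wrist flexion angle (degrees) from one 3-axis
    accelerometer reading, using gravity as the reference.
    Axis assignment is an assumption, not the paper's design:
    here x points along the back of the hand and z is 'up'
    when the wrist is held flat."""
    # Tilt of the x axis relative to the gravity vector.
    return math.degrees(math.atan2(ax, math.hypot(ay, az)))

# A wrist held flat (gravity entirely on z) reads 0 degrees;
# a fully vertical hand (gravity entirely on x) reads 90 degrees.
print(wrist_flexion_deg(0.0, 0.0, 1.0))  # 0.0
print(wrist_flexion_deg(1.0, 0.0, 0.0))  # 90.0
```

Note that a single static accelerometer reading cannot distinguish tilt from linear acceleration during movement; a practical implementation would sample once the wrist settles, or fuse with a gyroscope.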
Index Terms
- Proprioceptive Interaction