ABSTRACT
Computer vision and inertial measurement have made it possible for people to interact with computers using whole-body gestures. Although the uses and applications of these systems have grown rapidly, their ubiquity has been limited by the high cost of heavily instrumenting either the environment or the user. In this paper, we use the human body as an antenna for sensing whole-body gestures. This approach requires no instrumentation of the environment and only minimal instrumentation of the user, and thus enables truly mobile applications. We show robust gesture recognition with an average accuracy of 93% across 12 whole-body gestures, and promising results for robust location classification within a building. In addition, we demonstrate a real-time interactive system that allows a user to interact with a computer using whole-body gestures.
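The abstract's core idea, picking up ambient electromagnetic noise through the body and classifying the resulting signal into gestures, can be sketched in miniature. The snippet below is an illustrative assumption, not the authors' actual pipeline: it fabricates synthetic "body-antenna" voltage traces (a power-line-like carrier amplitude-modulated by a gesture-specific envelope), extracts coarse amplitude-spectrum features with a naive DFT, and classifies with a nearest-centroid model. The gesture labels, signal model, and feature choices are all hypothetical.

```python
import math
import random

def spectrum_features(signal, n_bins=8):
    """Coarse amplitude-spectrum features via a naive DFT (stdlib only)."""
    n = len(signal)
    feats = []
    for k in range(1, n_bins + 1):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        feats.append(math.hypot(re, im) / n)
    return feats

def make_signal(gesture_freq, rng, n=128):
    """Hypothetical stand-in for a body-antenna voltage trace: a power-line-like
    carrier, amplitude-modulated by a gesture-specific envelope, plus noise."""
    return [math.sin(2 * math.pi * 8 * t / n) *                 # ambient pickup
            (1 + 0.5 * math.sin(2 * math.pi * gesture_freq * t / n)) +
            0.1 * rng.gauss(0, 1)                               # sensor noise
            for t in range(n)]

def train_centroids(examples):
    """Nearest-centroid model: average feature vector per gesture label."""
    centroids = {}
    for label, feats_list in examples.items():
        dim = len(feats_list[0])
        centroids[label] = [sum(f[i] for f in feats_list) / len(feats_list)
                            for i in range(dim)]
    return centroids

def classify(centroids, feats):
    """Return the label whose centroid is nearest in squared distance."""
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(centroids[lbl], feats)))

rng = random.Random(42)
gesture_freqs = {"wave": 1, "punch": 2, "kick": 3}  # hypothetical gestures
train = {g: [spectrum_features(make_signal(f, rng)) for _ in range(10)]
         for g, f in gesture_freqs.items()}
model = train_centroids(train)
correct = sum(classify(model, spectrum_features(make_signal(f, rng))) == g
              for g, f in gesture_freqs.items() for _ in range(10))
print(correct, "/ 30 synthetic test traces classified correctly")
```

Because each gesture's envelope places spectral sidebands in distinct DFT bins, even this toy classifier separates the synthetic gestures cleanly; the paper's actual system distinguishes 12 real whole-body gestures from signals the body picks up from ambient electrical noise.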
Humantenna: using the body as an antenna for real-time whole-body interaction