ABSTRACT
Recently, guessability studies have become a popular means among researchers to elicit user-defined interaction sets involving gesture, speech and multimodal input. However, tool support for capturing and analysing interaction proposals is lacking and the method itself is still evolving. This paper presents Kinect Analysis---a system designed for interaction elicitation studies with support for record-and-replay, visualisation and analysis based on Kinect's depth, audio and video streams. Kinect Analysis enables post-hoc analysis during playback and live analysis with real-time feedback while recording. In particular, new visualisations such as skeletal joint traces and heatmaps can be superimposed for analysis and comparison of multiple recordings. It also introduces KinectScript---a simple scripting language to query recordings and automate analysis tasks based on skeleton, distance, audio and gesture scripts. The paper discusses Kinect Analysis both as a tool and a method that could enable researchers to more easily collect, study and share interaction proposals. Using data from a previous guessability study with 25 users, we show that Kinect Analysis in combination with KinectScript is useful and effective for a range of analysis tasks.
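One of the visualisations mentioned above, the joint-position heatmap, can be sketched in a few lines. The following Python fragment is an illustrative assumption of how such a heatmap might be accumulated from tracked skeleton frames; it is not Kinect Analysis's actual implementation, and the joint coordinates, grid size and sample trace are all hypothetical.

```python
GRID_W, GRID_H = 8, 6  # coarse grid over the sensor's field of view

def build_heatmap(frames, grid_w=GRID_W, grid_h=GRID_H):
    """Count how often a tracked joint falls into each grid cell.

    frames: iterable of (x, y) joint positions normalised to [0, 1).
    Returns a grid_h x grid_w list of lists of visit counts.
    """
    heatmap = [[0] * grid_w for _ in range(grid_h)]
    for x, y in frames:
        # Clamp to the last cell so x == 1.0 or y == 1.0 stays in range.
        col = min(int(x * grid_w), grid_w - 1)
        row = min(int(y * grid_h), grid_h - 1)
        heatmap[row][col] += 1
    return heatmap

# Hypothetical right-hand trace: three samples in the upper-right
# region and one stray sample in the lower-left corner.
trace = [(0.8, 0.2), (0.82, 0.22), (0.79, 0.18), (0.1, 0.9)]
hm = build_heatmap(trace)
```

A density grid like this can then be rendered as a colour overlay on the video stream, and comparing the grids of two recordings gives a simple measure of how similar two users' gesture proposals are.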
Index Terms
- Kinect Analysis: A System for Recording, Analysing and Sharing Multimodal Interaction Elicitation Studies