DOI: 10.1145/2525194.2525206
Research Article

Interacting with a self-portrait camera using motion-based hand gestures

Published: 24 September 2013

Abstract

Taking self-portraits with a digital camera is a popular way to present oneself through photography. Traditional techniques for taking self-portraits, such as self-timers or face detection, allow only a modest degree of interaction between the user and the camera. In this paper, we present an interaction technique that makes novel use of an image-processing algorithm to recognize hand-motion gestures, giving the user a natural way to interact with the camera when taking self-portraits. The user can perform natural gestures to control essential camera functions and take self-portraits effectively. Three types of gesture (i.e., waving, eight-direction selection, and circling) were identified and used to build a gesture user interface for controlling a Digital Single-Lens Reflex (DSLR) camera. Two experiments were conducted to evaluate the usability and performance of the gesture interface. The results confirmed that the usability of the gesture interface is superior to that of a self-timer, and that the proposed technique recognizes motion gestures with about 80% accuracy.
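The abstract names three gesture classes: waving, eight-direction selection, and circling. As a rough illustration only (not the paper's actual algorithm), a hand-centroid trajectory from any motion tracker could be sorted into these three classes with simple heuristics; the function names and thresholds below are assumptions made for this sketch:

```python
import math

def direction_sector(dx, dy):
    """Quantize a displacement vector into one of 8 sectors (0 = east,
    counting counter-clockwise), mirroring an eight-direction menu."""
    angle = math.atan2(dy, dx)  # -pi .. pi
    return int(round(angle / (math.pi / 4))) % 8

def classify_gesture(points):
    """Classify a hand-centroid trajectory (list of (x, y) positions over
    frames) as 'waving', 'circling', or ('select', sector).
    Thresholds are illustrative, not taken from the paper."""
    dxs = [b[0] - a[0] for a, b in zip(points, points[1:])]
    dys = [b[1] - a[1] for a, b in zip(points, points[1:])]

    # Waving: horizontal velocity repeatedly flips sign.
    flips = sum(1 for a, b in zip(dxs, dxs[1:]) if a * b < 0)
    if flips >= 3:
        return "waving"

    # Circling: the heading angle accumulates roughly a full turn.
    headings = [math.atan2(dy, dx) for dx, dy in zip(dxs, dys) if dx or dy]
    total = 0.0
    for a, b in zip(headings, headings[1:]):
        d = b - a
        while d > math.pi:
            d -= 2 * math.pi
        while d < -math.pi:
            d += 2 * math.pi
        total += d
    if abs(total) > 1.5 * math.pi:
        return "circling"

    # Otherwise: a straight stroke selecting one of eight directions.
    net_dx = points[-1][0] - points[0][0]
    net_dy = points[-1][1] - points[0][1]
    return ("select", direction_sector(net_dx, net_dy))
```

In a real pipeline the trajectory would come from per-frame hand detection or optical flow on the camera's live preview; this sketch only shows how the three gesture classes could be told apart once such a trajectory exists.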


Cited By

  • (2020) User-Defined Gestures for Taking Self-portraits with Smartphone Based on Consistency. Advances in Usability, User Experience, Wearable and Assistive Technology, 316–326. DOI: 10.1007/978-3-030-51828-8_42. Online publication date: 1 July 2020.
  • (2017) Pan-and-tilt self-portrait system using gesture interface. 2017 IEEE/ACIS 16th International Conference on Computer and Information Science (ICIS), 599–605. DOI: 10.1109/ICIS.2017.7960063. Online publication date: May 2017.
  • (2015) Design of a motion-based gestural menu-selection interface for a self-portrait camera. Personal and Ubiquitous Computing 19(2), 415–424. DOI: 10.1007/s00779-014-0776-1. Online publication date: 1 February 2015.

    Published In

    APCHI '13: Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction
    September 2013
    420 pages
    ISBN:9781450322539
    DOI:10.1145/2525194
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. digital camera
    2. gesture user interface
    3. human computer interaction
    4. image processing
    5. motion gestures

    Qualifiers

    • Research-article

    Conference

    APCHI '13



