DOI: 10.1145/2425836.2425880

Physically interactive tabletop augmented reality using the Kinect

Published: 26 November 2012

Abstract

In this paper we present a method for allowing arbitrary objects to interact physically in an augmented reality (AR) environment. A Microsoft Kinect is used to track objects in six degrees of freedom, enabling realistic interaction between them and virtual content in a tabletop AR context. We propose a point-cloud-based method for achieving such interaction. An adaptive per-pixel depth threshold is used to extract foreground objects, which are grouped using connected-component analysis. Objects are tracked with a variant of the Iterative Closest Point (ICP) algorithm that uses randomised projective correspondences. Our algorithm tracks objects moving at typical tabletop speeds with median drifts of 8.5% (rotational) and 4.8% (translational). The point cloud representation of each foreground object is refined as additional views of the object become visible to the Kinect. Physics-based AR interaction is achieved by fitting a collection of spheres to the point cloud model and passing them to the Bullet physics engine as a physics proxy for the object. Our method is demonstrated in an AR application in which the user interacts with a virtual tennis ball, illustrating the proposed method's potential for physics-based AR interaction.
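As a rough illustration of the first two steps described above, the Python sketch below thresholds a depth frame against a stored background depth map and groups the resulting foreground pixels into connected components. It is a simplified stand-in for the paper's pipeline: a fixed noise margin replaces the adaptive per-pixel threshold, and a plain flood fill replaces a production component labeller; the function names and the 0.02 m margin are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def extract_foreground(depth, background, threshold=0.02):
    """A pixel is foreground if it is measurably closer to the camera
    than the stored background depth. `threshold` (metres) is a
    hypothetical noise margin; the paper adapts it per pixel."""
    valid = (depth > 0) & (background > 0)   # depth 0 = missing Kinect reading
    return valid & (depth < background - threshold)

def label_components(mask):
    """4-connected component labelling by iterative flood fill.
    Returns a label image (0 = background) and the component count."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue                          # already claimed by a blob
        current += 1
        stack = [(sy, sx)]
        labels[sy, sx] = current
        while stack:
            y, x = stack.pop()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    stack.append((ny, nx))
    return labels, current

# Synthetic example: a flat background at 1.0 m with two closer blobs.
background = np.full((6, 8), 1.0)
depth = background.copy()
depth[1:3, 1:3] = 0.5                         # first foreground object
depth[4:6, 5:7] = 0.6                         # second foreground object
mask = extract_foreground(depth, background)
labels, n = label_components(mask)            # n == 2 separate objects
```

Each labelled component would then be lifted into a point cloud (via the Kinect's intrinsics) and handed to the ICP-based tracker.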


Cited By

  • (2020) An augmented reality-based training system with a natural user interface for manual milling operations. Virtual Reality 24(3): 527-539. DOI: 10.1007/s10055-019-00415-8
  • (2017) Inertial Navigation algorithms. 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops): 14-17. DOI: 10.1109/PERCOMW.2017.7917510
  • (2015) First-person view animation editing utilizing video see-through augmented reality. ACM SIGGRAPH 2015 Posters: 1-1. DOI: 10.1145/2787626.2787656
  • (2014) A Motion-Detection Biology-Based Learning Game for Children. IEEE Potentials 33(6): 31-36. DOI: 10.1109/MPOT.2013.2295693
  • (2013) Free-hands interaction in augmented reality. Proceedings of the 1st Symposium on Spatial User Interaction: 33-40. DOI: 10.1145/2491367.2491370


Published In

IVCNZ '12: Proceedings of the 27th Conference on Image and Vision Computing New Zealand
November 2012
547 pages
ISBN:9781450314732
DOI:10.1145/2425836

Sponsors

  • Hoare Research Software Ltd.
  • Google Inc.
  • Department of Information Science, University of Otago, Dunedin, New Zealand

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. interaction
  3. physical simulation

Qualifiers

  • Poster

Conference

IVCNZ '12: Image and Vision Computing New Zealand
November 26 - 28, 2012
Dunedin, New Zealand

Acceptance Rates

Overall acceptance rate: 55 of 74 submissions (74%)

