
ARToken: A Tangible Device for Dynamically Binding Real-world Objects with Virtual Representation

Published: 24 September 2021

Abstract

With the development of AR/VR, users are increasingly eager to interact with and control virtual proxies in reconstructed virtual worlds through physical objects that provide haptic feedback. We therefore propose ARToken, a device that virtualizes physical objects through AR tracking and links the relative positions of the token device and its virtual representation, generating interactive virtual environments grounded in physical ones. In addition to tracking position through AR images, the system falls back on an IMU sensor to detect rotation and displacement when the object leaves the recognition range of the AR image. ARToken transmits this information over Wi-Fi, providing real-time feedback of the real object's state to its corresponding virtual representation, and the system can track multiple devices simultaneously. To validate our design, we built two applications that demonstrate how ARToken helps generate virtual environments based on physical environments and binds virtual and physical objects as one.
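The paper does not publish code, but the tracking handoff the abstract describes can be illustrated with a minimal sketch: use the absolute pose from the AR image while it is tracked, and switch to IMU dead reckoning from the last known pose once the image leaves the recognition range. The Python below is entirely hypothetical (class names, method names, and the 2D pose simplification are our assumptions, not ARToken's API):

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """Simplified 2D pose; the real system would track full 6-DoF."""
    x: float
    y: float
    yaw: float


class TokenTracker:
    """Hypothetical sketch of ARToken-style tracking:
    absolute AR-image poses when available, IMU integration otherwise."""

    def __init__(self) -> None:
        self.pose = Pose(0.0, 0.0, 0.0)

    def on_marker(self, pose: Pose) -> None:
        # AR image recognized: adopt the absolute pose, discarding drift.
        self.pose = pose

    def on_imu(self, dx: float, dy: float, dyaw: float) -> None:
        # AR image lost: integrate relative IMU motion from the last pose.
        self.pose = Pose(self.pose.x + dx,
                         self.pose.y + dy,
                         self.pose.yaw + dyaw)
```

In a full system each update would be serialized and sent over Wi-Fi to the host, which applies the pose to the corresponding virtual representation; re-acquiring the AR image resets any drift accumulated during IMU-only tracking.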



Published In

UbiComp/ISWC '21 Adjunct: Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers
September 2021
711 pages
ISBN:9781450384612
DOI:10.1145/3460418
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Augmented Reality
  2. Immersive experience
  3. User interface toolkits

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

UbiComp '21

Acceptance Rates

Overall Acceptance Rate 764 of 2,912 submissions, 26%
