
Augmented reality displaying scheme in a smart glass based on relative object positions and orientation sensors


Abstract

Interactively browsing augmented reality (AR) content and displaying the proper 3D visual content on a smart glass are challenging research issues. In this paper, we propose to use a depth camera to detect a human subject in a real 3D space, while orientation sensors on a smart glass reveal the attitude and orientation of the user's head for pose estimation in an AR application. By implementing a prototype that detects a user's head and measures its orientation, the proposed method provides three contributions: (i) a top-view depth camera is used to detect the user's head position, (ii) the orientation sensors on a smart glass are used to reveal the attitude and orientation of the head, and (iii) the AR content displayed in the virtual space is properly mapped from the real 3D space. The experimental results demonstrate the spatial display accuracy in three testing spaces: a research lab, an office, and the center for art and technology. In addition, the proposed method is applied in a tech-art installation to allow the audience to reliably view the AR content on a smart glass.
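As a rough illustration of the pipeline summarized above (head position from a top-view depth camera, head attitude from the smart glass's orientation sensors, and a mapping from the real 3D space to the virtual display), the Python sketch below combines the two measurements into a virtual-camera pose and projects a world-anchored AR point onto the display. All frame conventions, function names, extrinsic and intrinsic parameters, and the Euler-angle order are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def rotation_from_euler(yaw, pitch, roll):
    """Build a 3x3 rotation matrix from yaw (Z), pitch (Y), roll (X) in radians.
    The Z-Y-X order and axis assignment are assumed conventions, not the paper's."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def virtual_camera_pose(head_pos_depth, R_depth_to_world, t_depth_to_world,
                        yaw, pitch, roll):
    """Combine the head position detected by the top-view depth camera with the
    smart-glass orientation to obtain a virtual-camera pose (R, t) in world space.
    R_depth_to_world, t_depth_to_world: depth-camera extrinsics (assumed calibrated)."""
    # Map the detected head position from the depth-camera frame to the world frame.
    head_world = R_depth_to_world @ np.asarray(head_pos_depth) + t_depth_to_world
    # The glass's orientation sensors give the viewing attitude of the head.
    R_head = rotation_from_euler(yaw, pitch, roll)
    return R_head, head_world

def project_content(point_world, R_head, head_world, f=800.0, cx=480.0, cy=270.0):
    """Project a world-space AR anchor into the glass display with a simple
    pinhole model; the intrinsics f, cx, cy are placeholder values."""
    # World frame -> head/view frame (z taken as the viewing depth).
    p_cam = R_head.T @ (np.asarray(point_world) - head_world)
    if p_cam[2] <= 0:
        return None  # behind the viewer, not displayed
    u = f * p_cam[0] / p_cam[2] + cx
    v = f * p_cam[1] / p_cam[2] + cy
    return u, v

if __name__ == "__main__":
    # Example: head detected 1.2 m below a ceiling-mounted depth camera,
    # glass reporting a slight yaw; AR anchor placed about 2 m ahead.
    R_dw = np.eye(3)                   # assume depth frame aligned with world frame
    t_dw = np.array([0.0, 0.0, 2.5])   # camera mounted 2.5 m above the world origin
    R_head, head_world = virtual_camera_pose(
        [0.1, 0.0, -1.2], R_dw, t_dw,
        yaw=np.radians(10), pitch=0.0, roll=0.0)
    print(project_content([0.0, 0.0, 3.3], R_head, head_world))
```

In a real deployment, the depth-to-world extrinsics would come from calibrating the ceiling-mounted camera and the display intrinsics from the smart glass SDK; the numbers above are placeholders chosen only to make the sketch runnable.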




Acknowledgments

This work was supported in part by the Ministry of Science and Technology, Taiwan, under Grant MOST 106-2221-E-119-002.

Author information


Corresponding author

Correspondence to Shih-Wei Sun.

Additional information

This article belongs to the Topical Collection: Special Issue on Social Media and Interactive Technologies

Guest Editors: Timothy K. Shih, Lin Hui, Somchoke Ruengittinun, and Qing Li


About this article


Cite this article

Sun, SW., Lan, YS. Augmented reality displaying scheme in a smart glass based on relative object positions and orientation sensors. World Wide Web 22, 1221–1239 (2019). https://doi.org/10.1007/s11280-018-0592-z

