A Motion Tutoring System by Using Virtual-Robot and Sensors

  • Conference paper

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 212))

Abstract

This paper describes how a user can assess the accuracy of his or her motion, without assistance from others, using the proposed tutoring system based on a virtual robot. Sensors embedded in the user's clothing measure the positions of the joints. The main PC gathers these data and displays the user's motion through the virtual robot. The tutoring system on the PC compares the user's motion against a database of exemplary motions and gives feedback to the user.
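The comparison step described above can be illustrated with a minimal sketch. The abstract does not specify the comparison algorithm, so everything below is an assumption for illustration: motions are taken as per-frame lists of joint angles in degrees, the error metric is the mean absolute per-joint deviation from the exemplary motion, and `tol_deg` is a hypothetical tolerance for flagging joints that need correction.

```python
def motion_feedback(user_frames, exemplar_frames, tol_deg=15.0):
    """Compare a user's motion against an exemplary motion.

    Each motion is a list of frames; each frame is a list of joint
    angles in degrees. Returns {joint_index: mean_error} for every
    joint whose average absolute deviation exceeds tol_deg.
    """
    n_frames = len(user_frames)
    n_joints = len(user_frames[0])
    feedback = {}
    for j in range(n_joints):
        # Average absolute angular error for joint j across all frames.
        err = sum(abs(u[j] - r[j])
                  for u, r in zip(user_frames, exemplar_frames)) / n_frames
        if err > tol_deg:
            feedback[j] = round(err, 1)
    return feedback

# Two frames, three joints (e.g. shoulder, elbow, wrist angles).
user = [[90.0, 45.0, 10.0], [95.0, 50.0, 12.0]]
exemplar = [[70.0, 44.0, 10.0], [72.0, 49.0, 11.0]]
print(motion_feedback(user, exemplar))  # → {0: 21.5}
```

In this toy run only joint 0 deviates beyond the tolerance, so the system would direct its feedback at that joint while leaving the others uncorrected.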





Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

Cite this paper

Kim, TJ., Lee, KT., Kim, NH. (2011). A Motion Tutoring System by Using Virtual-Robot and Sensors. In: Li, TH.S., et al. Next Wave in Robotics. FIRA 2011. Communications in Computer and Information Science, vol 212. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23147-6_1

  • DOI: https://doi.org/10.1007/978-3-642-23147-6_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-23146-9

  • Online ISBN: 978-3-642-23147-6

  • eBook Packages: Computer Science (R0)
