
Motion reconstruction using sparse accelerometer data

Published: 19 May 2011

Abstract

The development of methods and tools for generating visually appealing motion sequences from prerecorded motion capture data has become an important research area in computer animation. In particular, data-driven approaches have been used to reconstruct high-dimensional motion sequences from low-dimensional control signals. In this article, we contribute to this strand of research by introducing a novel framework for generating full-body animations controlled by only four 3D accelerometers attached to the extremities of a human actor. Our approach relies on a knowledge base consisting of a large number of motion clips obtained from marker-based motion capture. Based on the sparse accelerometer input, a cross-domain retrieval procedure builds up a lazy neighborhood graph in an online fashion. This graph structure points to suitable motion fragments in the knowledge base, which are then used in the reconstruction step. Supported by a kd-tree index structure, our procedure scales even to large datasets containing millions of frames. Our combined approach allows for reconstructing visually plausible continuous motion streams, even in the presence of moderate tempo variations that may not be directly reflected in the given knowledge base.
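To make the retrieval step concrete, here is a minimal sketch. It is not the authors' implementation: the feature layout (one 12-D vector per frame from four 3-axis accelerometers), the database size, and the window length are all invented for illustration, and a brute-force nearest-neighbor search stands in for the kd-tree index that the paper uses to reach millions of frames.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge base: N frames, each described by the readings of
# four 3-axis accelerometers (a 12-D feature vector per frame).
N, DIM = 1000, 4 * 3
database = rng.normal(size=(N, DIM))

def nearest_frames(query, k=8):
    """Return indices of the k database frames closest to `query`.

    Brute force stands in for the kd-tree index used in the paper;
    with a kd-tree (e.g. scipy.spatial.cKDTree) the same lookup
    scales to datasets with millions of frames.
    """
    d2 = np.sum((database - query) ** 2, axis=1)  # squared distances
    return np.argsort(d2)[:k]

# Simulate a short stream of accelerometer queries and collect, per
# time step, the candidate frames that a lazy neighborhood graph
# would then link into continuous motion fragments.
stream = rng.normal(size=(5, DIM))
candidates = [nearest_frames(q, k=8) for q in stream]
```

In the actual system, consecutive candidate sets are connected over time, so that only candidates forming temporally coherent paths through the knowledge base survive into the reconstruction step.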

Supplementary Material

ZIP File (tautges.zip)
Supplemental movie and image files for "Motion Reconstruction Using Sparse Accelerometer Data"
MP4 File (tp003_11.mp4)




Published In

ACM Transactions on Graphics, Volume 30, Issue 3
May 2011
127 pages
ISSN:0730-0301
EISSN:1557-7368
DOI:10.1145/1966394
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 19 May 2011
Accepted: 01 February 2011
Received: 01 December 2010
Published in TOG Volume 30, Issue 3


Author Tags

  1. Motion capture
  2. acceleration data
  3. motion reconstruction
  4. motion retrieval
  5. online control

Qualifiers

  • Research-article
  • Research
  • Refereed


Cited By

  • (2024) Spatial-related sensors matters. Proceedings of the Thirty-Eighth AAAI Conference on Artificial Intelligence and Thirty-Sixth Conference on Innovative Applications of Artificial Intelligence and Fourteenth Symposium on Educational Advances in Artificial Intelligence, 10225-10233. DOI: 10.1609/aaai.v38i9.28888. Online publication date: 20-Feb-2024.
  • (2024) EgoHDM: A Real-time Egocentric-Inertial Human Motion Capture, Localization, and Dense Mapping System. ACM Transactions on Graphics 43:6, 1-12. DOI: 10.1145/3687907. Online publication date: 19-Dec-2024.
  • (2024) WheelPoser: Sparse-IMU Based Body Pose Estimation for Wheelchair Users. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-17. DOI: 10.1145/3663548.3675638. Online publication date: 27-Oct-2024.
  • (2024) Head Pose Estimation Using a Chest-Mounted Camera and Its Evaluation Based on CG Images. 2024 Joint 13th International Conference on Soft Computing and Intelligent Systems and 25th International Symposium on Advanced Intelligent Systems (SCIS&ISIS), 1-4. DOI: 10.1109/SCISISIS61014.2024.10760076. Online publication date: 9-Nov-2024.
  • (2024) DiffusionPoser: Real-Time Human Motion Reconstruction From Arbitrary Sparse Sensors Using Autoregressive Diffusion. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2513-2523. DOI: 10.1109/CVPR52733.2024.00243. Online publication date: 16-Jun-2024.
  • (2024) Dynamic Inertial Poser (DynaIP): Part-Based Motion Dynamics Learning for Enhanced Human Pose Estimation with Sparse Inertial Sensors. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 1889-1899. DOI: 10.1109/CVPR52733.2024.00185. Online publication date: 16-Jun-2024.
  • (2024) Fast Human Motion reconstruction from sparse inertial measurement units considering the human shape. Nature Communications 15:1. DOI: 10.1038/s41467-024-46662-5. Online publication date: 18-Mar-2024.
  • (2024) Digitizing traditional dances under extreme clothing: The case study of Eyo. Journal of Cultural Heritage 67, 145-157. DOI: 10.1016/j.culher.2024.02.011. Online publication date: May-2024.
  • (2024) DTP: learning to estimate full-body pose in real-time from sparse VR sensor measurements. Virtual Reality 28:2. DOI: 10.1007/s10055-024-01011-1. Online publication date: 23-May-2024.
  • (2024) A U-Shaped Spatio-Temporal Transformer as Solver for Motion Capture. Computational Visual Media, 274-294. DOI: 10.1007/978-981-97-2095-8_15. Online publication date: 10-Apr-2024.
