
Efficient Marker Matching Using Pair-Wise Constraints in Physical Therapy

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 6454)

Abstract

In this paper, we report a robust, efficient, and automatic method for matching infrared-tracked markers for human motion analysis in computer-aided physical therapy applications. The challenges of this task stem from non-rigid marker motion, occlusion, and timing requirements. To overcome these difficulties, we use pair-wise distance constraints for marker identification. To meet the timing requirements, we first reduce the candidate marker labels by proximity constraints before enforcing the pair-wise constraints. In experiments on 38 real motion sequences, our method shows superior accuracy and a significant speedup over a semi-automatic proprietary method and the Iterative Closest Point (ICP) approach.
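The full text is not reproduced here, but as a rough illustration of the two-stage idea described in the abstract (proximity pruning of candidate correspondences, followed by a pair-wise distance consistency check), the following Python sketch may help. It is not the authors' algorithm: all names (match_markers, proximity_radius, ref_dists), the greedy per-label selection strategy, and the example data are assumptions introduced for illustration only.

```python
import numpy as np


def match_markers(prev_labeled, detections, ref_dists, proximity_radius=0.1):
    """Greedy two-stage matcher (illustrative only, not the paper's method).

    prev_labeled : dict label -> (3,) position from the previous frame.
    detections   : (M, 3) array of unlabeled marker positions, current frame.
    ref_dists    : dict (label_a, label_b) -> expected inter-marker distance,
                   with keys stored as sorted tuples.
    Returns dict label -> detection index, or None when every candidate
    was pruned (e.g. the marker is occluded).
    """
    assignment = {}
    for lab, prev_pos in prev_labeled.items():
        # Stage 1: proximity pruning -- keep only detections near the
        # marker's last known position, closest first.
        d = np.linalg.norm(detections - prev_pos, axis=1)
        candidates = [i for i in np.argsort(d) if d[i] < proximity_radius]

        # Stage 2: among surviving candidates, pick the one whose pair-wise
        # distances to already-labeled markers deviate least from the
        # reference distances.
        best, best_err = None, np.inf
        for i in candidates:
            if i in assignment.values():
                continue  # detection already claimed by another label
            errs = []
            for other, j in assignment.items():
                if j is None:
                    continue
                pair = tuple(sorted((lab, other)))
                errs.append(abs(np.linalg.norm(detections[i] - detections[j])
                                - ref_dists[pair]))
            err = max(errs) if errs else 0.0
            if err < best_err:
                best, best_err = i, err
        assignment[lab] = best
    return assignment


if __name__ == "__main__":
    # Three leg markers, slightly displaced and reported in scrambled order.
    prev = {"hip":   np.array([0.0, 0.0, 1.0]),
            "knee":  np.array([0.0, 0.0, 0.5]),
            "ankle": np.array([0.0, 0.0, 0.0])}
    ref = {("hip", "knee"): 0.5, ("ankle", "knee"): 0.5, ("ankle", "hip"): 1.0}
    det = np.array([[0.02, 0.00, 0.01],
                    [0.01, 0.00, 0.51],
                    [0.00, 0.01, 1.01]])
    print(match_markers(prev, det, ref))   # {'hip': 2, 'knee': 1, 'ankle': 0}
```

In this sketch, a marker whose candidates are all pruned is reported as None, which is one simple way to handle the occlusions mentioned in the abstract; the paper's own occlusion handling and timing optimizations are not reproduced here.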




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Johnson, G., Xie, N., Slaboda, J., Shi, Y.J., Keshner, E., Ling, H. (2010). Efficient Marker Matching Using Pair-Wise Constraints in Physical Therapy. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2010. Lecture Notes in Computer Science, vol 6454. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17274-8_22

  • DOI: https://doi.org/10.1007/978-3-642-17274-8_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-17273-1

  • Online ISBN: 978-3-642-17274-8

  • eBook Packages: Computer Science (R0)
