Four Metamorphosis States in a Distributed Virtual (TV) Studio: Human, Cyborg, Avatar, and Bot – Markerless Tracking and Feedback for Realtime Animation Control

Chapter in: Virtual Realities. Lecture Notes in Computer Science, vol. 8844

Abstract

The major challenge in virtual studio technology is the interaction between actors and virtual objects. Virtual studios differ from other virtual environments because there are always two concurrent views: the view of the TV consumer and the view of the talent in front of the camera. This paper illustrates interaction and feedback in front of the camera and compares different markerless person tracking systems used for realtime animation. Entertaining animations are required, but sensors usually provide only a limited number of parameters; additional context information allows the generation of appealing animations, which may be partly prefabricated. As the main example, we use a distributed live production in a virtual studio with two locally separated markerless tracking systems. The production featured a fully tracked actor, a cyborg (half actor, half graphics), an avatar, and a bot. All participants could interact and throw a virtual disc. This setup is mapped onto Milgram’s continuum, and the technical challenges are described in detail.
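The chapter itself contains no code. As a purely illustrative sketch of how one site's tracked joint positions might be streamed to the remote site in such a distributed production, the snippet below sends joints as OSC messages using python-osc; note 4 below links to an OSC calibration tool used around the production, but the host, port, OSC addresses, and joint names here are assumptions, not the production's actual protocol.

    # Illustrative only: stream one frame of tracked joint positions to the
    # second studio site as OSC messages. Host, port, OSC addresses and joint
    # names are hypothetical and not taken from the paper.
    from pythonosc.udp_client import SimpleUDPClient

    REMOTE_HOST = "192.0.2.10"   # assumed address of the second site
    REMOTE_PORT = 9000           # assumed OSC port

    client = SimpleUDPClient(REMOTE_HOST, REMOTE_PORT)

    def send_skeleton(actor_id, joints):
        """Send each joint as an /actor/<id>/<joint> message with x, y, z."""
        for name, (x, y, z) in joints.items():
            client.send_message(f"/actor/{actor_id}/{name}", [x, y, z])

    # Example frame from a markerless tracker (metres, studio coordinates).
    send_skeleton(1, {"head": (0.1, 1.7, 2.3),
                      "hand_r": (0.4, 1.2, 2.1),
                      "hand_l": (-0.3, 1.1, 2.2)})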

Notes

  1. Display virtual elements in front of or behind the actor, depending on his or her tracked position (a minimal code sketch of this decision follows these notes).

  2. A comparison of 3D imaging technologies published by Texas Instruments can be found at http://www.ti.com/ww/en/analog/3dtof/.

  3. http://vsvr.medien.fh-duesseldorf.de/productions/vron2013/

  4. https://github.com/fubyo/osccalibrator
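The decision described in note 1 can be written down concretely. The following minimal sketch is ours, not the paper's; it assumes the renderer exposes camera-space depths for the tracked actor and for each virtual object and simply picks a compositing layer by comparing them.

    # Hypothetical sketch of the occlusion decision from note 1 (not taken
    # from the paper): compare the tracked actor's camera-space depth with a
    # virtual object's depth and pick the compositing layer accordingly.
    def occlusion_layer(actor_depth_m: float, object_depth_m: float) -> str:
        """Return the layer on which the virtual object should be rendered.

        actor_depth_m  -- distance of the tracked actor from the studio camera
        object_depth_m -- distance of the virtual object from the same camera
        """
        # An object closer to the camera than the actor occludes the actor;
        # an object farther away is hidden behind the keyed actor layer.
        return "foreground" if object_depth_m < actor_depth_m else "background"

    # Example: the actor stands 2.5 m from the camera, the virtual disc is at 1.8 m.
    print(occlusion_layer(2.5, 1.8))   # -> foreground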

References

  1. Herder, J.: Interactive content creation with virtual set environments. J. 3D-Forum Soc. 15(4), 53–56 (2001)
  2. Gibbs, S., Arapis, C., Breiteneder, C., Lalioti, V., Mostafawy, S., Speier, J.: Virtual studios: an overview. IEEE Multimedia 5(1), 18–35 (1998)
  3. Bunsen, O.: Verteilte Virtuelle TV-Produktion im Gigabit-Testbed West. Final report, Laboratory for Mixed Realities, Institut an der Kunsthochschule für Medien Köln, GMD Forschungszentrum Informationstechnik GmbH, Institut für Medienkommunikation, February 2000
  4. Grau, O., Pullen, T., Thomas, G.: A combined studio production system for 3-D capturing of live action and immersive actor feedback. IEEE Trans. Circuits Syst. Video Technol. 14(3), 370–380 (2004)
  5. Smith, A.R., Blinn, J.F.: Blue screen matting. In: SIGGRAPH ’96 Conference Proceedings, pp. 259–268 (1996)
  6. Corazza, S., Mündermann, L., Chaudhari, A., Demattio, T., Cobelli, C., Andriacchi, T.: A markerless motion capture system to study musculoskeletal biomechanics: visual hull and simulated annealing approach. Ann. Biomed. Eng. 34(6), 1019–1029 (2006)
  7. Vizrt: Kenziko and Mammoth Graphics at IBC 2012: Kinetrak (2012). http://www.vizrt.com/news/newsgrid/35347/Kenziko_and_Mammoth_Graphics_at_IBC_2012
  8. Mammoth Graphics and Kenziko Ltd.: Kinetrak (2014). http://www.kinetrak.tv
  9. Price, M., Thomas, G.A.: 3D virtual production and delivery using MPEG-4. In: International Broadcasting Convention (IBC). IEEE (2000)
  10. Gibbs, S., Baudisch, P.: Interaction in the virtual studio. SIGGRAPH Computer Graphics, vol. 30, pp. 29–32. ACM Press, New York, November 1996. ISSN 0097-8930
  11. Kim, N., Woo, W., Kim, G., Park, C.M.: 3-D virtual studio for natural inter-“acting”. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 36(4), 758–773 (2006)
  12. Wöldecke, B., Marinos, D., Pogscheba, P., Geiger, C., Herder, J., Schwirten, T.: radarTHEREMIN – creating musical expressions in a virtual studio environment. In: Proceedings of ISVRI 2011 (International Symposium on VR Innovation), Singapore, pp. 345–346 (2011)
  13. Marinos, D., Geiger, C., Herder, J.: Large-area moderator tracking and demonstrational configuration of position based interactions for virtual studios. In: 10th European Interactive TV Conference, Berlin (2012)
  14. Herder, J., Wilke, M., Heimbach, J., Göbel, S., Marinos, D.: Simple actor tracking for virtual TV studios using a photonic mixing device. In: 12th International Conference on Human and Computer, Hamamatsu/Aizu-Wakamatsu/Düsseldorf, University of Aizu (2009)
  15. Flasko, M., Pogscheba, P., Herder, J., Vonolfen, W.: Heterogeneous binocular camera-tracking in a virtual studio. In: 8. Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, Wedel (2011)
  16. Hough, G., Athwal, C., Williams, I.: Advanced occlusion handling for virtual studios. In: Lee, G., Howard, D., Kang, J.J., Ślęzak, D. (eds.) ICHIT 2012. LNCS, vol. 7425, pp. 287–294. Springer, Heidelberg (2012)
  17. Carranza, J., Theobalt, C., Magnor, M.A., Seidel, H.-P.: Free-viewpoint video of human actors. ACM Trans. Graph. 22(3), 569–577 (2003)
  18. Brooks, A., Czarowicz, A.: Markerless motion tracking: MS Kinect & Organic Motion OpenStage. In: 9th International Conference on Disability, Virtual Reality and Associated Technologies, pp. 435–437. ICDVRAT and The University of Reading (2012)
  19. Organic Motion Inc.: OpenStage 2.0 technical overview, June 2014. http://www.organicmotion.com/openstage-2-0-technical-overview/
  20. Livingston, M., Sebastian, J., Ai, Z., Decker, J.: Performance measurements for the Microsoft Kinect skeleton. In: Virtual Reality Short Papers and Posters (VRW), pp. 119–120. IEEE (2012)
  21. Microsoft: Kinect for Windows SDK documentation, July 2014
  22. Daemen, J., Haufs-Brusberg, P., Herder, J.: Markerless actor tracking for virtual (TV) studio applications. In: International Joint Conference on Awareness Science and Technology & Ubi-Media Computing, iCAST 2013 & UMEDIA 2013. IEEE (2013)
  23. Milgram, P., Takemura, H., Utsumi, A., Kishino, F.: Augmented reality: a class of displays on the reality-virtuality continuum. Proc. SPIE 2351, 282–292 (1995)
  24. Holz, T., Dragone, M., O’Hare, G.: Where robots and virtual agents meet. Int. J. Soc. Robot. 1(1), 83–93 (2009)
  25. Herder, J., Cohen, M.: Enhancing perspicuity of objects in virtual reality environments. In: CT ’97 – Second International Conference on Cognitive Technology, pp. 228–237. IEEE Press, August 1997. ISBN 0-8186-8084-9
  26. Vierjahn, T., Wöldecke, B., Geiger, C., Herder, J.: Improved direction signalization technique employing vibrotactile feedback. In: 11th Virtual Reality International Conference (VRIC 2009) (2009)
  27. Wöldecke, B., Vierjahn, T., Flasko, M., Herder, J., Geiger, C.: Steering actors through a virtual set employing vibro-tactile feedback. In: TEI 2009: Proceedings of the 3rd International Conference on Tangible and Embedded Interaction, pp. 169–174. ACM, New York (2009)
  28. Ludwig, P., Büchel, J., Herder, J., Vonolfen, W.: InEarGuide – a navigation and interaction feedback system using in-ear headphones for virtual TV studio productions. In: 9. Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, Düsseldorf (2012)

Acknowledgments

The authors thank Jose Burga, Sascha Charlie Djuderija, Maren Gnehr, Sven Hartz, Mohammed Ibrahim, Nikolas Koop, Laurid Meyer, Antje Müller, Björn Salgert, Richard Schroeder, and Simon Thiele, who helped to implement the “VRON” example production. The music was composed by Lars Goossens. Christophe Leske contributed as an actor. Christoph Postertz, Tobias Mönninger, and Julian Thiede ran the example production with the green touch screen shown in Fig. 8. Some of this work was carried out within the “IVO [at] hiTV – Interaction with virtual objects in iTV productions” project, supported by the “FHprofUnt” program of the Federal Ministry of Education and Research (BMBF), Germany (grant no. 17010X10).

Author information

Correspondence to Jens Herder.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (MP4, 140438 KB)

Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Herder, J., Daemen, J., Haufs-Brusberg, P., Abdel Aziz, I. (2015). Four Metamorphosis States in a Distributed Virtual (TV) Studio: Human, Cyborg, Avatar, and Bot – Markerless Tracking and Feedback for Realtime Animation Control. In: Brunnett, G., Coquillart, S., van Liere, R., Welch, G., Váša, L. (eds) Virtual Realities. Lecture Notes in Computer Science, vol. 8844. Springer, Cham. https://doi.org/10.1007/978-3-319-17043-5_2

  • DOI: https://doi.org/10.1007/978-3-319-17043-5_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-17042-8

  • Online ISBN: 978-3-319-17043-5
