
Realtime Dynamic Multimedia Storyline Based on Online Audience Biometric Information

  • Chapter
New Directions in Intelligent Interactive Multimedia

Part of the book series: Studies in Computational Intelligence (SCI, volume 142)

Abstract

Complete audience immersion in the action remains the ultimate goal of the multimedia industry. Despite significant technical audiovisual advances that enable increasingly realistic content, coping with individual audience needs and desires is still an incomplete achievement. The proposed project intends to contribute to solving this issue by enabling real-time dynamic multimedia storylines driven by subconscious emotional audience interaction. Individual emotional states are assessed through direct access to online biometric information. Recent technological breakthroughs have enabled the use of minimally invasive biometric hardware devices that no longer interfere with the audience's sense of immersion. Another key module of the project is the conceptualization of a dynamic-storyline multimedia content system with emotional metadata, responsible for enabling discrete or continuous storyline route options. The unifying component is the definition of a full-duplex communication protocol. The current stage of research has already produced a spin-off product capable of providing computer mouse control through electromyography, and experiments conducted on the developed system architecture have identified key factors in human emotions, enabling semi-automatic emotion assessment.
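
The chapter describes the architecture only at a high level; its metadata schema and routing policy are not given here. As a loose illustration of the storyline-routing idea, the sketch below assumes a simple valence/arousal emotion model and hypothetical structures (EmotionalState, SceneOption, choose_next_scene): each candidate branch carries emotional metadata, and the engine routes toward the branch closest to the audience's currently assessed state.

```python
from dataclasses import dataclass

# Illustrative sketch only: the chapter does not publish its metadata schema
# or routing algorithm. A two-dimensional valence/arousal state and a
# nearest-neighbour branch choice are assumed purely for demonstration.

@dataclass
class EmotionalState:
    valence: float  # -1.0 (negative) .. +1.0 (positive)
    arousal: float  #  0.0 (calm)     ..  1.0 (excited)

@dataclass
class SceneOption:
    scene_id: str          # hypothetical identifier of a storyline branch
    target_valence: float  # emotional metadata attached to the branch
    target_arousal: float

def choose_next_scene(state: EmotionalState, options: list[SceneOption]) -> SceneOption:
    """Pick the branch whose emotional metadata lies closest to the
    audience's current state in valence/arousal space."""
    return min(
        options,
        key=lambda o: (o.target_valence - state.valence) ** 2
                      + (o.target_arousal - state.arousal) ** 2,
    )

# Example: a calm, mildly positive viewer is routed to the quieter branch.
options = [
    SceneOption("chase_sequence", target_valence=-0.2, target_arousal=0.9),
    SceneOption("quiet_dialogue", target_valence=0.4, target_arousal=0.2),
]
print(choose_next_scene(EmotionalState(valence=0.3, arousal=0.25), options).scene_id)
```

In the system outlined in the abstract, the state estimate would come from the biometric acquisition module over the full-duplex protocol, and the nearest-neighbour rule would be replaced by whatever discrete or continuous routing policy the storyline authors define.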




Editor information

George A. Tsihrintzis, Maria Virvou, Robert J. Howlett, Lakhmi C. Jain


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Vinhas, V., Oliveira, E., Reis, L.P. (2008). Realtime Dynamic Multimedia Storyline Based on Online Audience Biometric Information. In: Tsihrintzis, G.A., Virvou, M., Howlett, R.J., Jain, L.C. (eds) New Directions in Intelligent Interactive Multimedia. Studies in Computational Intelligence, vol 142. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-68127-4_56

  • DOI: https://doi.org/10.1007/978-3-540-68127-4_56

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-68126-7

  • Online ISBN: 978-3-540-68127-4

  • eBook Packages: Engineering, Engineering (R0)
