DOI: 10.1145/3210825.3213556
Work in Progress

Dynamic Subtitles in Cinematic Virtual Reality

Published: 25 June 2018

ABSTRACT

Cinematic Virtual Reality has been increasing in popularity in recent years. Watching 360° movies with a head-mounted display, the viewer can freely choose the direction of view, and thus the visible section of the movie. Therefore, a new approach for the placement of subtitles is needed. There are three main issues which have to be considered: the position of the subtitles, the identification of the speaker, and the influence on the VR experience. In our study we compared a static method, where the subtitles are placed at the bottom of the field of view, with dynamic subtitles [1], where the position of the subtitles depends on the scene and is close to the speaking person. This work-in-progress describes first results of the study, which indicate that dynamic subtitles can lead to a higher presence score, less sickness, and a lower workload.
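The two placement conditions compared in the study can be sketched roughly as follows. This is a minimal illustration only; the function names, the degree-based yaw/pitch convention, and the specific offsets are hypothetical assumptions for the sketch, not taken from the paper:

```python
def static_subtitle_position(view_yaw, view_pitch, vertical_fov=90.0, margin=10.0):
    """Static condition: subtitles sit at the bottom of the current
    field of view, so they follow the viewer's head orientation.
    Angles are in degrees; margin lifts the text off the FOV edge."""
    return (view_yaw, view_pitch - vertical_fov / 2 + margin)

def dynamic_subtitle_position(speaker_yaw, speaker_pitch, offset=-15.0):
    """Dynamic condition: subtitles are anchored in the scene,
    slightly below the speaking person, independent of head pose."""
    return (speaker_yaw, speaker_pitch + offset)

# A speaker 120 degrees to the viewer's right, at eye level:
print(dynamic_subtitle_position(120.0, 0.0))   # placed near the speaker
print(static_subtitle_position(0.0, 0.0))      # placed at bottom of view
```

With the static method the subtitle position is a function of head pose alone, while the dynamic method ties it to the scene content, which is what allows it to also serve speaker identification.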

References

  1. M. Brooks and M. Armstrong. 2014. Enhancing Subtitles. TVX2014 Conference, Brussels (2014), 25--27.
  2. Andy Brown, Rhia Jones, Mike Crabb, James Sandford, Matthew Brooks, Mike Armstrong, and Caroline Jay. 2015. Dynamic subtitles: the user experience. In Proceedings of the ACM International Conference on Interactive Experiences for TV and Online Video. ACM, 103--112.
  3. Andy Brown, Jayson Turner, Jake Patterson, Anastasia Schmitz, Mike Armstrong, and Maxine Glancy. 2017. Subtitles in 360-degree Video. In Adjunct Publication of the 2017 ACM International Conference on Interactive Experiences for TV and Online Video. ACM, 3--8.
  4. Sandra G. Hart and Lowell E. Staveland. 1988. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Advances in Psychology 52 (1988), 139--183.
  5. Yongtao Hu, Jan Kautz, Yizhou Yu, and Wenping Wang. 2015. Speaker-following video subtitles. ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM) 11, 2 (2015), 32.
  6. Robert S. Kennedy, Norman E. Lane, Kevin S. Berbaum, and Michael G. Lilienthal. 1993. Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. The International Journal of Aviation Psychology 3, 3 (1993), 203--220.
  7. Kuno Kurzhals, Emine Cetinkaya, Yongtao Hu, Wenping Wang, and Daniel Weiskopf. 2017. Close to the Action: Eye-Tracking Evaluation of Speaker-Following Subtitles. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 6559--6568.
  8. Quoc V. Vy and Deborah I. Fels. 2010. Using placement and name for speaker identification in captioning. In International Conference on Computers for Handicapped Persons. Springer, 247--254.
  9. Bob G. Witmer and Michael J. Singer. 1998. Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and Virtual Environments 7, 3 (1998), 225--240.
