DOI: 10.1145/3206505.3206561

Spatial statistics for analyzing data in cinematic virtual reality

Published: 29 May 2018

Abstract

Cinematic Virtual Reality has grown in popularity in recent years. Watching 360° movies with head-mounted displays, viewers can freely choose their viewing direction and thus the visible section of the movie. Exploring this viewing behavior requires methods for collecting and analyzing data. In our experiments we compared the viewing behavior for movies with spatial and non-spatial sound and tracked the participants' head movements. This work in progress describes two spatial-statistics approaches for analyzing the head-tracking data: Space Time Cube analysis and the Getis-Ord Gi* statistic.
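
The Getis-Ord Gi* statistic mentioned above assigns each location a z-score indicating whether it and its neighbors together hold unusually high (or low) values. Below is a minimal sketch of how it could be applied to head-tracking data; the 18 x 36 grid, the binary neighborhood weights, the gi_star helper, and the synthetic yaw/pitch samples are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gi_star(counts, radius=1):
    """Getis-Ord Gi* z-score for every cell of a 2-D count grid.

    Binary neighborhood weights: each cell within `radius` (Chebyshev
    distance), including the cell itself, gets weight 1.
    """
    x = counts.astype(float)
    n = x.size
    x_bar = x.mean()
    s = np.sqrt((x ** 2).mean() - x_bar ** 2)

    rows, cols = x.shape
    z = np.zeros_like(x)
    for i in range(rows):
        for j in range(cols):
            r0, r1 = max(0, i - radius), min(rows, i + radius + 1)
            c0, c1 = max(0, j - radius), min(cols, j + radius + 1)
            neigh = x[r0:r1, c0:c1]
            w_sum = neigh.size        # sum of binary weights w_ij (= sum of w_ij^2)
            lag = neigh.sum()         # sum over j of w_ij * x_j
            num = lag - x_bar * w_sum
            den = s * np.sqrt((n * w_sum - w_sum ** 2) / (n - 1))
            z[i, j] = num / den
    return z

# Illustrative use: bin yaw/pitch head orientations (degrees) into an
# 18 x 36 equirectangular grid and flag viewing hot spots (z > 1.96 ~ p < .05).
rng = np.random.default_rng(0)
yaw = rng.uniform(-180.0, 180.0, 5000)    # placeholder tracking samples
pitch = rng.uniform(-90.0, 90.0, 5000)
counts, _, _ = np.histogram2d(pitch, yaw, bins=(18, 36),
                              range=[[-90, 90], [-180, 180]])
hot = gi_star(counts) > 1.96
print(f"{hot.sum()} of {hot.size} cells flagged as hot spots")
```

A Space Time Cube view of the same data would simply add time as a third binning dimension (yaw x pitch x time); cells with a z-score above roughly 1.96 correspond to hot spots at about the 5% significance level.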





Published In

AVI '18: Proceedings of the 2018 International Conference on Advanced Visual Interfaces
May 2018
430 pages
ISBN: 9781450356169
DOI: 10.1145/3206505
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 29 May 2018


Author Tags

  1. 360° movie
  2. cinematic virtual reality
  3. getis ord gi*
  4. space time cube
  5. spatial sound
  6. spatial statistics

Qualifiers

  • Poster

Conference

AVI '18: 2018 International Conference on Advanced Visual Interfaces
May 29 - June 1, 2018
Castiglione della Pescaia, Grosseto, Italy

Acceptance Rates

AVI '18 Paper Acceptance Rate: 19 of 77 submissions, 25%
Overall Acceptance Rate: 128 of 490 submissions, 26%



Article Metrics

  • Downloads (Last 12 months): 10
  • Downloads (Last 6 weeks): 1
Reflects downloads up to 27 Jan 2025


Cited By

  • (2021) Dynamic Field of View Restriction in 360° Video: Aligning Optical Flow and Visual SLAM to Mitigate VIMS. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3411764.3445499. Online publication date: 6-May-2021.
  • (2020) Staying on Track: a Comparative Study on the Use of Optical Flow in 360° Video to Mitigate VIMS. Proceedings of the 2020 ACM International Conference on Interactive Media Experiences, 82-93. DOI: 10.1145/3391614.3393658. Online publication date: 17-Jun-2020.
  • (2020) Exploring the impact of 360° movie cuts in users' attention. 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 73-82. DOI: 10.1109/VR46266.2020.00025. Online publication date: Mar-2020.
  • (2019) Exploring Visual Guidance in 360-degree Videos. Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video, 1-12. DOI: 10.1145/3317697.3323350. Online publication date: 4-Jun-2019.
  • (2019) "When the Elephant Trumps". Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3290605.3300925. Online publication date: 2-May-2019.
