DOI: 10.1145/2993369.2993389

Interactive motion effects design for a moving object in 4D films

Published: 02 November 2016

Abstract

This paper presents an algorithm for the rapid design of motion effects for 4D films. The algorithm is based on a viewer-centered rendering strategy that matches chair motion to the movement of a viewer's visual attention. An object-tracking algorithm is used to estimate the movement of visual attention, under the assumption that visual attention follows an object of interest. We performed several experiments to find optimal parameters for implementation, such as the required accuracy of object tracking. Our algorithm enables motion effects to be designed at least 10 times faster than with the current practice of manual authoring. We also assessed the subjective quality of the motion effects generated by our algorithm, and the results indicated that it can provide perceptually plausible motion effects.
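The viewer-centered strategy described above can be illustrated with a minimal sketch: take the tracked object's per-frame screen position (a stand-in for the viewer's visual attention), differentiate it to get attention velocity, smooth out tracking jitter, and map the result to chair tilt commands. The function name, parameters, and the velocity-to-tilt mapping below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def motion_effects_from_track(track_xy, fps=24.0, max_angle_deg=8.0, smooth_win=5):
    """Sketch: map a tracked object's screen-space trajectory to chair tilt.

    track_xy: (N, 2) array of the object's (x, y) position per frame,
    normalized to [0, 1]. Horizontal attention movement drives roll,
    vertical movement drives pitch.
    """
    track = np.asarray(track_xy, dtype=float)
    # Screen-space velocity of the attention point (units per second).
    vel = np.gradient(track, 1.0 / fps, axis=0)
    # Moving-average smoothing to suppress tracking jitter.
    kernel = np.ones(smooth_win) / smooth_win
    vel = np.column_stack(
        [np.convolve(vel[:, i], kernel, mode="same") for i in range(2)]
    )
    # Scale velocity to tilt angles and clamp to the chair's workspace.
    angles = np.clip(vel * max_angle_deg, -max_angle_deg, max_angle_deg)
    roll, pitch = angles[:, 0], angles[:, 1]
    return roll, pitch
```

For example, an object panning left to right at constant speed yields a steady positive roll command, while a stationary object produces no chair motion; the clamp models the limited workspace of a motion chair.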

Supplementary Material

MP4 File (p219-lee.mp4)




Published In

VRST '16: Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology
November 2016
363 pages
ISBN:9781450344913
DOI:10.1145/2993369

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. 4D film
  2. motion effects
  3. multi-sensory theater

Qualifiers

  • Research-article

Conference

VRST '16

Acceptance Rates

Overall Acceptance Rate 66 of 254 submissions, 26%

Article Metrics

  • Downloads (Last 12 months)53
  • Downloads (Last 6 weeks)2
Reflects downloads up to 17 Jan 2025


Cited By

  • (2024) Video2Haptics: Converting Video Motion to Dynamic Haptic Feedback with Bio-Inspired Event Processing. IEEE Transactions on Visualization and Computer Graphics 30(12), 7717-7735. DOI: 10.1109/TVCG.2024.3360468
  • (2024) Telemetry-Based Haptic Rendering for Racing Game Experience Improvement. IEEE Transactions on Haptics 17(1), 72-79. DOI: 10.1109/TOH.2024.3357885
  • (2023) Semi-automatic mulsemedia authoring analysis from the user's perspective. Proceedings of the 14th Conference on ACM Multimedia Systems, 249-256. DOI: 10.1145/3587819.3590979
  • (2023) Generating Haptic Motion Effects for Multiple Articulated Bodies for Improved 4D Experiences: A Camera Space Approach. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580727
  • (2023) Merging Camera and Object Haptic Motion Effects for Improved 4D Experiences. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 1036-1044. DOI: 10.1109/ISMAR59233.2023.00120
  • (2023) Sensory cue integration of visual and vestibular stimuli: a case study for 4D rides. Virtual Reality 27(3), 1671-1683. DOI: 10.1007/s10055-023-00762-7
  • (2022) Motion Effects: Perceptual Space and Synthesis for Specific Perceptual Properties. IEEE Transactions on Haptics 15(3), 626-637. DOI: 10.1109/TOH.2022.3196950
  • (2022) Data-Driven Rendering of Motion Effects for Walking Sensations in Different Gaits. IEEE Transactions on Haptics 15(3), 547-559. DOI: 10.1109/TOH.2022.3176964
  • (2021) Absolute and Differential Thresholds of Motion Effects in Cardinal Directions. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 1-10. DOI: 10.1145/3489849.3489870
  • (2021) Image-Based Texture Styling for Motion Effect Rendering. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 1-10. DOI: 10.1145/3489849.3489854
