Image-Based Texture Styling for Motion Effect Rendering

Published: 08 December 2021

Abstract

A motion platform provides vestibular stimuli that elicit the sensation of self-motion, thereby improving immersion. A representative example is a 4D ride, which presents a video of point-of-view (POV) shots along with motion effects synchronized with the camera motion in the video. Previous research has produced several automatic motion-effect synthesis algorithms for POV shots. Although effective at generating gross motion effects, they do not consider fine features of the ground, such as a rough or bumpy road. In this paper, we propose an algorithm for styling gross motion effects using a texture image. Our algorithm transforms the texture image into a high-frequency style motion and merges it with the original motion while respecting both perceptual and device constraints. A user study demonstrated that texture styling can increase immersiveness, realism, and harmony.
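The merging step described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, gains, displacement limit, and the moving-average high-pass filter are all assumptions. The idea shown is only the abstract's pipeline in miniature: sample the texture's intensity profile along the camera path, high-pass it to keep just the fine surface detail (the "style"), scale it, and add it to the gross motion under a device displacement limit.

```python
import numpy as np

def style_motion_with_texture(gross_motion, texture_row, fs=60.0,
                              style_gain=0.002, limit=0.05):
    """Hypothetical sketch: derive a high-frequency 'style' signal from a
    texture-image scanline and merge it with a gross motion trajectory.

    gross_motion : 1-D array of platform displacements (m), low-frequency.
    texture_row  : 1-D array of pixel intensities (0..255) sampled along
                   the camera path in the texture image.
    """
    # Resample the texture intensity profile to the motion signal's length.
    t_src = np.linspace(0.0, 1.0, len(texture_row))
    t_dst = np.linspace(0.0, 1.0, len(gross_motion))
    profile = np.interp(t_dst, t_src, np.asarray(texture_row, dtype=float))

    # High-pass by subtracting a moving-average baseline, keeping only
    # fine surface detail (bumps, roughness) as the style motion.
    win = max(3, int(fs * 0.25))           # ~250 ms averaging window
    baseline = np.convolve(profile, np.ones(win) / win, mode="same")
    style = (profile - baseline) * style_gain

    # Merge and clip to the device's displacement limit.
    return np.clip(gross_motion + style, -limit, limit)
```

A real system would use a proper filter design and the paper's perceptual constraints; the clipping here stands in only for the device-limit part of the merge.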

Supplementary Material

Supplemental file (Supplementary_Material__VRST_2021__Texture_Styling.pdf)


Cited By

  • (2023) DrivingVibe: Enhancing VR Driving Experience using Inertia-based Vibrotactile Feedback around the Head. Proceedings of the ACM on Human-Computer Interaction 7, MHCI, 1–22. DOI: 10.1145/3604253. Online publication date: 13-Sep-2023.
  • (2023) Generating Haptic Motion Effects for Multiple Articulated Bodies for Improved 4D Experiences: A Camera Space Approach. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–17. DOI: 10.1145/3544548.3580727. Online publication date: 19-Apr-2023.
  • (2023) Merging Camera and Object Haptic Motion Effects for Improved 4D Experiences. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 1036–1044. DOI: 10.1109/ISMAR59233.2023.00120. Online publication date: 16-Oct-2023.
  • (2023) Sensory cue integration of visual and vestibular stimuli: a case study for 4D rides. Virtual Reality 27, 3, 1671–1683. DOI: 10.1007/s10055-023-00762-7. Online publication date: 9-Feb-2023.
  • (2022) Motion Effects: Perceptual Space and Synthesis for Specific Perceptual Properties. IEEE Transactions on Haptics 15, 3, 626–637. DOI: 10.1109/TOH.2022.3196950. Online publication date: 1-Jul-2022.


Published In

VRST '21: Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology
December 2021, 563 pages
ISBN: 9781450390927
DOI: 10.1145/3489849

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. 4D
  2. automatic generation
  3. image-based
  4. motion effect
  5. vestibular

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • Samsung Research Funding & Incubation Center of Samsung Electronics

Conference

VRST '21

Acceptance Rates

Overall Acceptance Rate 66 of 254 submissions, 26%

