
Real-time haptic interaction with RGBD video streams

  • Original Article
  • Published in: The Visual Computer

Abstract

Video interaction is a common form of communication in cyberspace, and it can be made more immersive by incorporating the haptic modality. Commonly available depth-sensing devices such as the Microsoft Kinect can capture the depth of a scene in real time together with the video. In this paper, we present a method for real-time haptic interaction with videos containing depth data. Forces are computed from the depth information, and spatial and temporal filtering of the depth stream stabilizes the force feedback delivered to the haptic device. Fast collision detection allows the proposed approach to run in real time. We analyze the various factors that affect the algorithm's performance and illustrate the usefulness of the approach by highlighting possible application scenarios.
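The pipeline outlined in the abstract (filter each incoming depth frame, then detect collisions against the filtered depth surface and compute a force) can be sketched roughly as follows. This is an illustrative sketch under simple assumptions, not the paper's exact algorithm: it uses a 3x3 spatial median, exponential temporal smoothing, and a penalty force directed along the viewing axis; the function names and the stiffness value `k` are hypothetical.

```python
import numpy as np

def spatial_median(depth):
    """3x3 median filter: suppresses the salt-and-pepper noise typical
    of consumer depth sensors while preserving depth edges better than
    a mean filter would."""
    h, w = depth.shape
    padded = np.pad(depth, 1, mode="edge")
    windows = np.stack([padded[r:r + h, c:c + w]
                        for r in range(3) for c in range(3)])
    return np.median(windows, axis=0)

def temporal_smooth(prev, curr, alpha=0.7):
    """Exponential moving average across frames; a higher alpha trusts
    the history more, trading responsiveness for force stability."""
    return alpha * prev + (1.0 - alpha) * curr

def sample_depth(depth, x, y):
    """Bilinear depth lookup at a sub-pixel position (x, y).
    Constant time per query, which is what keeps collision detection
    cheap enough for the ~1 kHz update rate a haptic loop needs."""
    h, w = depth.shape
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * depth[y0, x0] + fx * depth[y0, x1]
    bot = (1 - fx) * depth[y1, x0] + fx * depth[y1, x1]
    return (1 - fy) * top + fy * bot

def contact_force(depth, hip, k=400.0):
    """Penalty force for a haptic interaction point hip = (x, y, z),
    with z growing away from the camera: when the HIP goes behind the
    depth surface, push it back toward the viewer with a spring force
    proportional to the penetration depth."""
    x, y, z = hip
    penetration = z - sample_depth(depth, x, y)
    if penetration <= 0.0:
        return np.zeros(3)          # free-space motion: no force
    return np.array([0.0, 0.0, -k * penetration])
```

In a real system the filtered frame would be refreshed at the video rate (~30 Hz) while `contact_force` runs in the high-rate haptic loop against the most recent filtered frame, and the force direction would normally follow the local surface normal rather than the fixed view axis used here.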




Acknowledgments

This research was supported by the National Research Foundation, Prime Minister’s Office, Singapore under its International Research Centers in Singapore Funding Initiative, and research grant MOE T1 RG 17/15 “Haptic Interaction with Images and Videos”.

Author information

Corresponding author

Correspondence to Shahzad Rasool.


About this article


Cite this article

Rasool, S., Sourin, A. Real-time haptic interaction with RGBD video streams. Vis Comput 32, 1311–1321 (2016). https://doi.org/10.1007/s00371-016-1224-1

