
Haptic interaction with objects in a picture based on pose estimation

Published in: Multimedia Tools and Applications

Abstract

In a picture, every object is displayed in 2D space, yet from that 2D image people can perceptually reconstruct and understand the scene. To enable users to haptically interact with an object that appears in an image, the present study proposes a geometry-based haptic rendering method. More specifically, our approach estimates haptic information from the object’s structure as it appears in the image while preserving the two-dimensional visual information. Of the many types of objects found in everyday pictures, this paper mainly deals with polyhedral objects composed of rectangular faces, some of which may appear slanted in the picture. To obtain the geometric layout of the object as viewed from the image plane, we first estimate a homography that maps object coordinates to target image coordinates. We then transform the surface normals of the object’s faces using the extrinsic part of the homography, which locates the face being viewed. Because the transformed normals are used to calculate force in image space, we call this process normal vector perturbation in the 2D image space. To physically render the estimated normal vector without distorting the visual information, we employ a lateral haptic rendering scheme, as it fits naturally with interaction styles on 2D images. The force at a given position on a slanted face is calculated during the interaction phase. To evaluate our approach, we conducted an experiment with different stimulus conditions and found that participants could reliably estimate the geometric layout shown in the picture. We conclude with explorations of applications and a discussion of future work.
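The pipeline sketched in the abstract (recover the extrinsic rotation that locates a face, rotate the face's canonical normal into the image frame, then map the slanted normal to a lateral force) can be illustrated with a minimal NumPy sketch. The function names and the unit force gain here are hypothetical, and the extrinsic rotation `R` is assumed to have been recovered already, e.g. by decomposing an estimated homography:

```python
import numpy as np

def face_normal_in_camera(R):
    """Rotate the face's canonical normal (the z-axis of the object
    frame) into the camera/image frame using the extrinsic rotation R."""
    return R @ np.array([0.0, 0.0, 1.0])

def lateral_force(normal_cam, k=1.0):
    """Map the slanted normal to a 2D lateral force on the image plane:
    keep the in-plane (x, y) components of the normal, scaled by a
    gain k. A face viewed head-on yields zero lateral force."""
    return k * normal_cam[:2]

# Example: a face tilted 30 degrees about the image y-axis.
theta = np.deg2rad(30.0)
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
n_cam = face_normal_in_camera(R)       # normal now leans toward +x
f = lateral_force(n_cam, k=1.0)        # nonzero x-force signals the slant
```

In practice the homography itself could be estimated from point correspondences (e.g. with OpenCV's `findHomography`) and decomposed into rotation and translation before this step; the sketch only covers the normal-perturbation and lateral-force stage.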




Acknowledgements

This work was supported by the IT R&D program of MKE/KEIT, [2009-S-035-01, Contact-free Multipoint Realistic Interaction Technology Development].

Author information


Corresponding author

Correspondence to Dong-Soo Kwon.


Cite this article

Kim, SC., Kwon, DS. Haptic interaction with objects in a picture based on pose estimation. Multimed Tools Appl 72, 2041–2062 (2014). https://doi.org/10.1007/s11042-013-1471-3
