
Haptic discrimination of force direction and the influence of visual information

Published: 01 April 2006

Abstract

Despite a wealth of literature on discrimination thresholds for displacement, force magnitude, stiffness, and viscosity, there is currently a lack of data on our ability to discriminate force directions. Such data are needed in designing haptic rendering algorithms where force direction, as well as force magnitude, is used to encode information such as surface topography. Given that haptic information is typically presented in addition to visual information in a data perceptualization system, it is also important to investigate the extent to which the congruency of visual information affects force-direction discrimination. In this article, the authors report an experiment measuring force-direction discrimination thresholds under three display conditions: haptics alone (H), haptics plus congruent vision (HVcong), and haptics plus incongruent vision (HVincong). Average force-direction discrimination thresholds were found to be 18.4°, 25.6°, and 31.9° for the HVcong, H, and HVincong conditions, respectively. The results show that the congruency of visual information significantly affected haptic discrimination of force directions, and that force-direction discrimination thresholds did not seem to depend on the reference force direction. The implications of the results for designing haptic virtual environments, especially when the numbers of sensors and actuators in a haptic display do not match, are discussed.
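As a rough illustration of how such thresholds might be used when designing a haptic rendering algorithm, the sketch below checks whether the angular error between a rendered force vector and a reference force vector exceeds the discrimination threshold reported for a given display condition. This is a minimal sketch under the editor's assumptions: the function names and the NumPy-based implementation are illustrative and do not come from the article.

```python
import numpy as np

# Average force-direction discrimination thresholds (degrees) reported
# in the abstract, indexed by display condition.
THRESHOLDS_DEG = {
    "HVcong": 18.4,    # haptics plus congruent vision
    "H": 25.6,         # haptics alone
    "HVincong": 31.9,  # haptics plus incongruent vision
}

def angle_between_deg(f1, f2):
    """Angle in degrees between two 3D force vectors."""
    f1, f2 = np.asarray(f1, float), np.asarray(f2, float)
    cos = np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def is_discriminable(f_rendered, f_reference, condition="H"):
    """True if the directional error exceeds the threshold for the given
    display condition, i.e. the change in force direction would likely
    be perceptible to the user."""
    return angle_between_deg(f_rendered, f_reference) > THRESHOLDS_DEG[condition]

# Example: a 20-degree directional error is above threshold with
# congruent vision but below threshold with incongruent vision.
ref = np.array([0.0, 0.0, 1.0])
err = np.array([np.sin(np.radians(20)), 0.0, np.cos(np.radians(20))])
print(is_discriminable(err, ref, "HVcong"))    # True
print(is_discriminable(err, ref, "HVincong"))  # False
```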

