Abstract
Research on multi-modal feedback cues and their potential to enhance the performance of human operators in teleoperation tasks is growing. Yet it is still unclear how the stimulus intensities of cues in different modalities should be matched when investigating the effect of feedback modality on task performance. Previous work has shown high within- and between-subject variability in multi-modal intensity adjustments, and the source of this variability is not yet clear. In this study, we account for the individual perception of the cues by setting individual reference intensities for cross-modal matching. In addition to traditional frequentist models, Bayesian models are used for data analysis. The results suggest that neither haptic nor auditory cue intensity adjustments are influenced by modality when the cue is combined with a visual reference. It therefore seems sufficient to let subjects adjust the intensity of haptic and auditory cues individually before investigating the effectiveness of haptic-auditory cues. In contrast, with 99% probability, subjects adjust visual cues to lower intensities when these are combined with haptic or auditory cues, which underscores the importance of individual cross-modal matching when investigating the effects of visual-haptic or visual-auditory feedback cues.
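The "99% probability" above refers to a Bayesian posterior probability that the mean adjustment is negative. As a minimal illustration of how such a quantity can be computed, the sketch below uses a conjugate normal model with a flat prior and an assumed known measurement SD; all numbers (sample size, SD, the simulated adjustments) are hypothetical and not taken from the study.

```python
import math
import random

random.seed(0)
# Hypothetical intensity adjustments of a visual cue when paired with a
# haptic/auditory cue, relative to each subject's individual reference
# (0 = no change, negative = adjusted to a lower intensity).
adjustments = [random.gauss(-0.15, 0.1) for _ in range(24)]

n = len(adjustments)
xbar = sum(adjustments) / n
sigma = 0.1  # assumed known measurement SD for this sketch

# With a flat prior on the mean mu and known sigma, the posterior is
# mu ~ Normal(xbar, sigma / sqrt(n)).
post_sd = sigma / math.sqrt(n)

def normal_cdf(x: float, mu: float, sd: float) -> float:
    """CDF of a Normal(mu, sd) distribution at x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

# Posterior probability that visual cues are adjusted to *lower* intensities.
p_lower = normal_cdf(0.0, xbar, post_sd)
print(f"P(mean adjustment < 0 | data) = {p_lower:.3f}")
```

The paper itself fits richer (mixed) Bayesian models, but the reported probability is of this kind: the posterior mass of the effect lying below zero.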
Acknowledgment
This work was supported by the German Research Foundation (DFG).
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
About this paper
Cite this paper
Benz, T.M., Nitsch, V. (2018). Is Cross-Modal Matching Necessary? A Bayesian Analysis of Individual Reference Cues. In: Prattichizzo, D., Shinoda, H., Tan, H., Ruffaldi, E., Frisoli, A. (eds) Haptics: Science, Technology, and Applications. EuroHaptics 2018. Lecture Notes in Computer Science(), vol 10893. Springer, Cham. https://doi.org/10.1007/978-3-319-93445-7_2
DOI: https://doi.org/10.1007/978-3-319-93445-7_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-93444-0
Online ISBN: 978-3-319-93445-7