
Effect of stimulus intensity on response time distribution in multisensory integration

  • Original Paper
  • Published in: Journal on Multimodal User Interfaces

Abstract

To increase the efficiency of multimodal user interfaces, designers should take into account how multimodal features appear in the real world. Although spatial coincidence and matching intensity levels are important for perception, these factors have received little attention in human–computer interaction studies. In the present study, we aimed to map how spatial coincidence and different intensity levels influence response times. Sixteen participants performed a simple auditory localization task in which sounds were presented either alone or together with visual non-targets. We found that medium-intensity visual stimuli facilitated responses to low-intensity sounds. Analyses of response time distributions showed that the intensities of target and non-target stimuli affected different parameters of the ex-Gaussian distribution. Our results suggest that multisensory integration and response facilitation may occur even when the non-target has low predictive power regarding the location of the target. Furthermore, we show that the parameters of the ex-Gaussian distribution can be related to distinct cognitive processes. The present results are potentially applicable to the design of an intelligent warning system that uses the user's reaction time to adapt the warning signal for optimal results.
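For readers unfamiliar with the ex-Gaussian model: it describes a response time distribution as the convolution of a Gaussian component (parameters mu and sigma) with an exponential component (parameter tau); the Gaussian part is commonly associated with perceptual and motor stages, and the exponential tail with slower, decision-related processes. The sketch below is purely illustrative and is not the analysis code used in the study; it shows how such a fit could be obtained with SciPy's exponnorm distribution, using hypothetical response times.

    # Illustrative sketch only: fit an ex-Gaussian to hypothetical response times.
    # SciPy's exponnorm is the exponentially modified Gaussian, parameterized as
    # K = tau / sigma, loc = mu, scale = sigma.
    import numpy as np
    from scipy import stats

    # Hypothetical response times in seconds (not data from the study)
    rts = np.array([0.31, 0.35, 0.38, 0.40, 0.42, 0.47, 0.52, 0.61, 0.74, 0.95])

    K, mu, sigma = stats.exponnorm.fit(rts)
    tau = K * sigma  # mean of the exponential tail

    print(f"mu = {mu:.3f} s, sigma = {sigma:.3f} s, tau = {tau:.3f} s")

In this parameterization, a change in mu or sigma shifts or widens the whole distribution, whereas a change in tau selectively lengthens the slow tail, which is why the two kinds of parameters can index different underlying processes.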



Acknowledgments

The publication was supported by the KTIA_AIK_12-1-2013-0037 project. The project is supported by the Hungarian Government, managed by the National Development Agency, and financed by the Research and Technology Innovation Fund.

Author information


Corresponding author

Correspondence to Ágoston Török.



About this article

Cite this article

Török, Á., Kolozsvári, O., Virágh, T. et al. Effect of stimulus intensity on response time distribution in multisensory integration. J Multimodal User Interfaces 8, 209–216 (2014). https://doi.org/10.1007/s12193-013-0135-y

