
Lost in space: multisensory conflict yields adaptation in spatial representations across frames of reference

  • Research Report
  • Published in: Cognitive Processing

Abstract

According to embodied cognition, bodily interactions with our environment shape how we perceive and represent our body and the surrounding space, that is, peripersonal space. To investigate the adaptive nature of these spatial representations, we introduced a multisensory conflict between vision and proprioception in an immersive virtual reality. During individual bimanual interaction trials, we gradually shifted the visual representation of the hands. As a result, participants unknowingly shifted their actual hands to compensate for the visual shift. We then measured adaptation to the induced multisensory conflict by means of a self-localization task and an external localization task. While effects of the conflict were observed in both tasks, the effects interacted systematically with the type of localization task and with the visual information available while performing it (i.e., the visibility of the virtual hands). The results imply that the localization of one's own hands is based on a multisensory integration process, which is modulated by the saliency of the currently most relevant sensory modality and the frame of reference involved. Moreover, the results suggest that the brain strives for consistency between its body and spatial estimates, adapting multiple, related frames of reference, and the spatial estimates within them, in response to a sensory conflict in one of them.
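As a conceptual illustration only (a standard formalization from the cue-combination literature, not a model fitted in this study), the integration claim can be made concrete via maximum-likelihood cue combination in the sense of Ernst and Banks (2002), where the visual and proprioceptive hand position estimates, here written \(\hat{x}_V\) and \(\hat{x}_P\) (notation ours), are weighted by their reliabilities (inverse variances):

```latex
\hat{x} = w_V\,\hat{x}_V + w_P\,\hat{x}_P, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_P^2}, \qquad
w_P = 1 - w_V
```

Under such a scheme, a gradual visual shift pulls the combined estimate toward vision whenever vision is the more reliable or more salient modality, which is consistent with the compensatory hand drift reported here.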



Notes

  1. Due to the odd number of participants, there is one additional data sample for the visible–invisible condition.

  2. Data is available at the igroup website: http://www.igroup.org/pq/ipq/data.php.

References

  • Barr DJ, Levy R, Scheepers C, Tily HJ (2013) Random effects structure for confirmatory hypothesis testing: keep it maximal. J Mem Lang 68(3):255–278

  • Barsalou LW (2008) Grounded cognition. Annu Rev Psychol 59:617–645

  • Bates D, Mächler M, Bolker B, Walker S (2015) Fitting linear mixed-effects models using lme4. J Stat Softw 67(1):1–48

  • Botvinick M, Cohen J (1998) Rubber hands 'feel' touch that eyes see. Nature 391(6669):756

  • Butz MV, Herbort O, Hoffmann J (2007) Exploiting redundancy for flexible behavior: unsupervised learning in a modular sensorimotor control architecture. Psychol Rev 114:1015–1046

  • Butz MV, Kutter EF, Lorenz C (2014) Rubber hand illusion affects joint angle perception. PLoS ONE 9(3):e92854

  • Canzoneri E, Ubaldi S, Rastelli V, Finisguerra A, Bassolino M, Serino A (2013) Tool-use reshapes the boundaries of body and peripersonal space representations. Exp Brain Res 228(1):25–42

  • Coello Y, Bartolo A, Amiri B, Devanne H, Houdayer E, Derambure P (2008) Perceiving what is reachable depends on motor representations: evidence from a transcranial magnetic stimulation study. PLoS ONE 3(8):e2862

  • Cohen YE, Andersen RA (2002) A common reference frame for movement plans in the posterior parietal cortex. Nat Rev Neurosci 3(7):553–562

  • Ehrenfeld S, Butz MV (2013) The modular modality frame model: continuous body state estimation and plausibility-weighted information fusion. Biol Cybern 107(1):61–82

  • Ehrenfeld S, Herbort O, Butz MV (2013) Modular neuron-based body estimation: maintaining consistency over different limbs, modalities, and frames of reference. Front Comput Neurosci 7:148

  • Ernst MO, Banks MS (2002) Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415(6870):429–433

  • Ernst MO, Bülthoff HH (2004) Merging the senses into a robust percept. Trends Cogn Sci 8(4):162–169

  • Farnè A, Làdavas E (2000) Dynamic size-change of hand peripersonal space following tool use. NeuroReport 11(8):1645–1649

  • Fogassi L, Gallese V, Fadiga L, Luppino G, Matelli M, Rizzolatti G (1996) Coding of peripersonal space in inferior premotor cortex (area F4). J Neurophysiol 76(1):141–157

  • Gamberini L, Seraglia B, Priftis K (2008) Processing of peripersonal and extrapersonal space using tools: evidence from visual line bisection in real and virtual environments. Neuropsychologia 46(5):1298–1304

  • Gentilucci M, Jeannerod M, Tadary B, Decety J (1994) Dissociating visual and kinesthetic coordinates during pointing movements. Exp Brain Res 102(2):359–366

  • Glenberg AM, Witt JK, Metcalfe J (2013) From the revolution to embodiment: 25 years of cognitive psychology. Perspect Psychol Sci 8(5):573–585

  • Guna J, Jakus G, Pogačnik M, Tomažič S, Sodnik J (2014) An analysis of the precision and reliability of the Leap Motion sensor and its suitability for static and dynamic tracking. Sensors 14(2):3702–3720

  • Holm S (1979) A simple sequentially rejective multiple test procedure. Scand J Stat 6(2):65–70

  • Holmes NP, Spence C (2004) The body schema and multisensory representation(s) of peripersonal space. Cogn Process 5(2):94–105

  • Holmes NP, Spence C (2005) Visual bias of unseen hand position with a mirror: spatial and temporal factors. Exp Brain Res 166(3–4):489–497

  • Holmes NP, Snijders HJ, Spence C (2006) Reaching with alien limbs: visual exposure to prosthetic hands in a mirror biases proprioception without accompanying illusions of ownership. Percept Psychophys 68(4):685–701

  • Iriki A, Tanaka M, Iwamura Y (1996) Coding of modified body schema during tool use by macaque postcentral neurones. NeuroReport 7(14):2325–2330

  • Jewell G, McCourt ME (2000) Pseudoneglect: a review and meta-analysis of performance factors in line bisection tasks. Neuropsychologia 38(1):93–110

  • Kirsch W, Kunde W (2013) Visual near space is scaled to parameters of current action plans. J Exp Psychol Hum Percept Perform 39(5):1313–1325

  • Kirsch W, Herbort O, Butz MV, Kunde W (2012) Influence of motor planning on distance perception within the peripersonal space. PLoS ONE 7(4):e34880

  • Kuznetsova A, Brockhoff PB, Christensen RHB (2016) lmerTest: tests in linear mixed effects models. R package version 2.0-32. https://CRAN.R-project.org/package=lmerTest

  • Lawrence MA (2015) ez: easy analysis and visualization of factorial experiments. R package version 4.3. https://CRAN.R-project.org/package=ez

  • Linkenauger SA, Witt JK, Proffitt DR (2011) Taking a hands-on approach: apparent grasping ability scales the perception of object size. J Exp Psychol Hum Percept Perform 37(5):1432–1441

  • Linkenauger SA, Bülthoff HH, Mohler BJ (2015) Virtual arm's reach influences perceived distances but only after experience reaching. Neuropsychologia 70:393–401

  • Longo MR, Lourenco SF (2006) On the nature of near space: effects of tool use and the transition to far space. Neuropsychologia 44(6):977–981

  • Longo MR, Lourenco SF (2007) Space perception and body morphology: extent of near space scales with arm length. Exp Brain Res 177(2):285–290

  • Lourenco SF, Longo MR (2009) The plasticity of near space: evidence for contraction. Cognition 112(3):451–456

  • Macaluso E, Maravita A (2010) The representation of space near the body through touch and vision. Neuropsychologia 48(3):782–795

  • McGuire LM, Sabes PN (2009) Sensory transformations and the use of multiple reference frames for reach planning. Nat Neurosci 12(8):1056–1061

  • Mohler BJ, Bülthoff HH, Thompson WB, Creem-Regehr SH (2008) A full-body avatar improves egocentric distance judgments in an immersive virtual environment. In: Proceedings of the 5th symposium on applied perception in graphics and visualization. ACM

  • R Core Team (2016) R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna. https://www.R-project.org/

  • Regenbrecht H, Schubert T (2002) Real and illusory interactions enhance presence in virtual environments. Presence Teleoper Virtual Environ 11(4):425–434

  • Rossetti Y, Desmurget M, Prablanc C (1995) Vectorial coding of movement: vision, proprioception, or both? J Neurophysiol 74(1):457–463

  • Schroeder PA, Lohmann J, Butz MV, Plewnia C (2016) Behavioral bias for food reflected in hand movements: a preliminary study with healthy subjects. Cyberpsychol Behav Soc Netw 19:120–126. https://doi.org/10.1089/cyber.2015.0311

  • Schubert T, Friedmann F, Regenbrecht H (2001) The experience of presence: factor analytic insights. Presence Teleoper Virtual Environ 10(3):266–281

  • Simons DJ, Chabris CF (1999) Gorillas in our midst: sustained inattentional blindness for dynamic events. Perception 28(9):1059–1074

  • Simons DJ, Levin DT (1997) Change blindness. Trends Cogn Sci 1(7):261–267

  • Wood G, Willmes K, Nuerk HC, Fischer MH (2008) On the cognitive link between space and number: a meta-analysis of the SNARC effect. Psychol Sci Q 50(4):489–525


Author information


Corresponding author

Correspondence to Johannes Lohmann.

Additional information

Handling editor: Sergei Gepshtein (Salk Institute for Biological Studies, La Jolla); Reviewers: Joseph Snider (University of California San Diego), Loes van Dam (University of Essex).

Appendices

Appendix 1: IPQ evaluation

The igroup Presence Questionnaire (IPQ) assesses presence in virtual environments on three scales and quantifies the degree of immersion participants experience within the VR. The igroup consortium provides reference data from different VR setups. We compared our data to setups that, like ours, used a head-mounted display. The reference data set comprised 24 mean values for the three scales.

Due to a software issue, only 21 of the 33 participants completed the IPQ in our study. We tested whether the observed scores exceeded those of the reference data. The results of the respective t-tests are shown in Table 6; the data are shown in Fig. 10.

Table 6 t-test results for the different IPQ scales
Fig. 10

Scores for the different IPQ scales in the observed (light gray) and the reference (dark gray) data. Significant differences (indicated by an asterisk) were found for the spatial presence scale. The scores show that the experimental setup yielded immersion and realism judgments comparable to those of the reference setups, while spatial presence was improved

With respect to involvement and realism, the data are comparable to the reference data. For spatial presence, the results are significantly improved relative to the reference data [t(33.4) = 1.76, p < 0.05]. Together, the results indicate a sufficient degree of immersion. The improvement in spatial presence dovetails with other results showing enhanced spatial perception in VR when participants were equipped with a body model (Mohler et al. 2008) or when they could interact with the VR via bodily motion (Schroeder et al. 2016).
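A comparison of this kind can be sketched as a one-sided Welch two-sample t-test in R; the vector names below are hypothetical, not taken from the study's actual analysis code:

```r
# One-sided Welch two-sample t-test: do the observed IPQ spatial presence
# scores exceed the reference data from http://www.igroup.org/pq/ipq/data.php?
# observed_spatial_presence:  hypothetical vector of 21 participant scores
# reference_spatial_presence: hypothetical vector of 24 reference study means
t.test(observed_spatial_presence,
       reference_spatial_presence,
       alternative = "greater")  # Welch correction is R's default
```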

Appendix 2: Collected petals

To check whether the participants complied with the manual task, the number of petals collected was subjected to a separate analysis. In general, participants complied with the task, collecting 4.5 petals per trial on average. However, there were considerable individual differences, resulting in a rather high standard deviation of 1.4 petals. To further check for learning effects and for effects of the induced drift, the number of collected petals was analyzed with a 2 × 2 repeated-measures ANOVA using R (R Core Team 2016) and the ez package (Lawrence 2015). We considered the factors block and offset condition. The experiment was divided into two blocks; the corresponding factor thus had two levels. To allow a straightforward analysis of the different offset conditions, we aggregated over the variations along the depth axis (see Fig. 5b), such that the resulting factor also had only two levels (visual offsets to the left or to the right).
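A minimal sketch of such an analysis with ezANOVA is given below; the data frame and column names are assumptions for illustration, not the study's actual code:

```r
library(ez)  # Lawrence (2015)

# Hypothetical data frame 'petals_df' with one row per participant x block x
# offset cell, holding the mean number of petals collected per trial:
#   participant (factor), block (factor: 1, 2),
#   offset (factor: left, right), petals (numeric)
aov_res <- ezANOVA(
  data     = petals_df,
  dv       = petals,
  wid      = participant,
  within   = .(block, offset),  # 2 x 2 repeated-measures design
  detailed = TRUE
)
print(aov_res)
```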

The results of the analysis are shown in Table 7. There was a considerable learning effect: in the second block, participants collected significantly more petals than in the first block (M = 5.1 vs. M = 3.9). Furthermore, participants collected more petals when the hands were visually shifted to the right (M = 4.8) than when they were shifted to the left (M = 4.2). This unwanted effect is most likely due to the asymmetric layout of the scene (see Fig. 3). Visual offsets to the right were compensated by moving the hands to the left, which yields a more convenient trajectory through the task space: the hands then operate near the center of the tracking range, with the flower physically slightly to the left and the basket slightly to the right of that center. For visual offsets to the left, the trajectory through the task space is less convenient: these shifts were compensated by placing the hands to the right of the center of the tracking range, so that the hands had to be moved even further to the right in order to reach the basket.

Table 7 ANOVA table for the number of collected petals


Cite this article

Lohmann, J., Butz, M.V. Lost in space: multisensory conflict yields adaptation in spatial representations across frames of reference. Cogn Process 18, 211–228 (2017). https://doi.org/10.1007/s10339-017-0798-5
