
NeuroImage

Volume 24, Issue 3, 1 February 2005, Pages 928-935

Distinct neural substrates for the perception of real and virtual visual worlds

https://doi.org/10.1016/j.neuroimage.2004.09.046

Abstract

Virtual environments are frequently used for training and skill improvement. However, do real and virtual worlds engage the same brain states in human perceivers? We measured brain activity using functional magnetic resonance imaging (fMRI) while adults watched movie and cartoon clips, simulating real and virtual visual worlds, respectively. Relative to baselines of static images presented in random order, the medial prefrontal cortex (MPFC) and the cerebellum were activated only by movie clips of other humans. In contrast, the superior parietal lobes were activated both by cartoon clips of human and non-human agents and by movie clips of non-human animals. Our fMRI findings suggest that the perception of real-world humans is characterised by the involvement of the MPFC and the cerebellum, most likely for on-line representation of the mental states of others, whereas the perception of virtual-world agents engages the parietal cortex in attention to actions.

Introduction

Virtual reality is increasingly used for training in a wide range of contexts. For example, virtual human agents simulated using cartoons have been used to help students learn to perform physical, procedural tasks (Rickel and Johnson, 1999). Animated agents in virtual environments have also been used for training skills that require a high level of flexible, interpersonal interaction, such as psychotherapy (Beutler and Harwood, 2004). However, whether the human brain differentially perceives and interacts with agents in the real and virtual worlds remains poorly understood. Recent functional magnetic resonance imaging (fMRI) studies have shown that specific brain regions, such as the medial prefrontal cortex (MPFC), are more strongly activated when we deal with actions attributed to real human agents than when the same actions are attributed to computer-simulated animated agents (Gallagher et al., 2002, Ramnani and Miall, 2004), suggesting that specific neural substrates may be involved in discriminating between human and non-human agents.

The current study assessed whether different brain regions are engaged when we simply perceive human agents in the real world compared with when we perceive agents in virtual worlds. To investigate this, we used fMRI to measure brain activations while participants observed movie and cartoon clips, which presented brief sequences of actions involving humans in real-life situations (movie clips) or actions involving either human or non-human agents in virtual worlds (cartoon clips). Movies present real images (photographs of a physical environment), whereas cartoons present virtual images (a simulation of such an environment based on physical principles). Brain activity while watching the clips was compared with activity while viewing static images drawn from the movie and cartoon clips and presented in random order, to control for any differences in low-level visual feature processing. Relative to this baseline of random static images, the movie and cartoon clips presented continuous and coherent visual events that induced explanatory predictions of behaviour. We aimed to identify whether distinct neural substrates differentiate the perception of human agents in the real visual world (in movie clips) from the perception of human or non-human characters in a virtual world (in cartoons).


Subjects

Twelve adults (6 male; 21–41 years of age, mean 25.5 years) with no neurological or psychiatric history participated in this study. All participants were right-handed, had normal or corrected-to-normal vision, and were not colour blind. Informed consent was obtained from all participants prior to scanning. The study was approved by the Academic Committee of the Department of Psychology, Peking University.

Stimuli and procedure

The stimuli were presented through an LCD projector onto a rear-projection screen located at a

Results

In Condition A, we recorded fMRI signals while subjects freely viewed silent movie clips depicting real-life situations, such as human activities at a subway station or in a classroom (Fig. 1a). The contrast of movie clips versus random static images revealed activation in the bilateral middle temporal cortex (MT) and the posterior superior temporal sulcus (STS) (centred at −51, −68, 5, Z = 4.65, P < 0.03, corrected; and 51, −68, 3, Z = 4.62, P < 0.001, corrected; see Fig. 3a), and the occipital cortex
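
The contrast reported here follows the standard condition-minus-baseline logic of a first-level general linear model (GLM) analysis. As a rough, hedged illustration only, the sketch below shows how such a movie-versus-static-baseline contrast could be computed with the open-source nilearn toolbox; this excerpt does not name the study's actual analysis software, and the file name, TR, block timings, and threshold used below are hypothetical placeholders, not the study's parameters.

# Illustrative sketch only (not the original analysis pipeline): a
# condition-versus-baseline contrast of the kind reported above, computed
# with nilearn's first-level GLM tools. File name, TR, block timings, and
# threshold are hypothetical.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel
from nilearn.glm import threshold_stats_img

# Hypothetical block design: alternating movie-clip and static-image blocks.
events = pd.DataFrame({
    "onset":      [0, 30, 60, 90, 120, 150],   # seconds
    "duration":   [30] * 6,
    "trial_type": ["movie", "static"] * 3,
})

# Fit a first-level GLM to one (hypothetical) preprocessed run.
model = FirstLevelModel(t_r=3.0, hrf_model="spm", smoothing_fwhm=8.0)
model = model.fit("sub01_run1_preproc.nii.gz", events=events)

# Contrast: movie clips minus the random static-image baseline, as a Z map.
z_map = model.compute_contrast("movie - static", output_type="z_score")

# Correct for multiple comparisons, analogous to the corrected P values
# reported above (alpha and correction method here are illustrative).
thresholded_map, z_threshold = threshold_stats_img(
    z_map, alpha=0.05, height_control="bonferroni"
)
print("Corrected Z threshold:", z_threshold)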

Discussion

Our functional neuroimaging findings provide important clues about the way we perceive characters within coherent successive events in real and virtual worlds. A number of common areas were activated by all of the movie and cartoon clip conditions, relative to their static-image baselines. The medial occipital cortex and MT are likely engaged by the processing of low-level visual features of the moving images, such as changes in shape, colour (Livingstone and Hubel, 1998), and motion

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Projects 30225026 and 30328016), the Ministry of Science and Technology of China (Project 2002CCA01000), the Ministry of Education of China (02170), the Medical Research Council (UK), and the Royal Society (UK).
