Abstract
This paper reports empirical findings on human performance in an experiment comprising a perceptual task and a motor task. Such findings should be considered in the design of robots: drawing inspiration from natural solutions should not only benefit artificial systems but also make human-robot interaction more efficient and safe. Humans have developed various mechanisms to optimize the way actions are performed and the effects they induce. Optimizing action planning (e.g., grasping, reaching, or lifting objects) requires efficient selection of action-relevant features, and this selection might also depend on the environmental context in which an action takes place. The present study investigated how action context influences perceptual processing during action planning. The experimental paradigm comprised two independent tasks: (1) a perceptual visual search task and (2) a grasping or a pointing movement. Reaction times in the visual search task were measured as a function of movement type (grasping vs. pointing) and context complexity (a context varying along one dimension vs. a context varying along two dimensions). Results showed that action context influenced reaction times, suggesting a close bidirectional link between action and perception as well as an impact of the environmental action context on perceptual selection in the course of action planning. These findings are discussed with respect to their application to robotics and the design of user interfaces.
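To make the experimental design concrete, the following is a minimal, hypothetical sketch (not the authors' analysis code) of the 2 × 2 design described above: movement type (grasping vs. pointing) crossed with context complexity (one- vs. two-dimensional context), with reaction time (RT) as the dependent measure. All condition means, trial counts, and noise levels are invented placeholders, used only to illustrate how the interaction of interest (does context complexity affect RTs differently for grasping and pointing?) would be computed.

```python
# Hypothetical sketch of the 2x2 design: movement type x context complexity.
# All numbers are illustrative placeholders, not data from the study.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean RTs (in ms) for each cell of the design.
conditions = {
    ("grasping", "1D-context"): 520,
    ("grasping", "2D-context"): 555,
    ("pointing", "1D-context"): 530,
    ("pointing", "2D-context"): 540,
}

# Simulate 20 trials per cell around the hypothetical means.
data = {
    cell: rng.normal(loc=mean_rt, scale=30, size=20)
    for cell, mean_rt in conditions.items()
}

# Cell means and the interaction contrast (difference of differences):
# does context complexity slow grasping more than pointing?
means = {cell: rts.mean() for cell, rts in data.items()}
interaction = (
    (means[("grasping", "2D-context")] - means[("grasping", "1D-context")])
    - (means[("pointing", "2D-context")] - means[("pointing", "1D-context")])
)

for cell, m in means.items():
    print(f"{cell}: mean RT = {m:.1f} ms")
print(f"interaction contrast = {interaction:.1f} ms")
```

In the actual study, such condition means would of course come from measured reaction times rather than simulated values.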
Cite this article
Wykowska, A., Maldonado, A., Beetz, M. et al. How Humans Optimize Their Interaction with the Environment: The Impact of Action Context on Human Perception. Int J of Soc Robotics 3, 223–231 (2011). https://doi.org/10.1007/s12369-010-0078-3