Abstract:
Infants can adaptively associate auditory stimuli with visual stimuli even in their first year of life, as demonstrated by multimodal habituation studies. Unlike language acquisition at later developmental stages, this adaptive learning in young infants is temporary and still largely stimulus-driven. Hence, the temporal aspects of environmental and social factors figure crucially in the formation of prelexical multimodal associations. The study of these associations can offer important clues as to how semantics are bootstrapped in real-world, embodied infants. In this paper, we present a neuroanatomically based embodied computational model of multimodal habituation to explore the temporal and social constraints on the learning observed in very young infants. In particular, the model explains empirical results showing that auditory word stimuli must be presented synchronously with visual stimulus movement for the two to be associated.
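As a purely illustrative sketch (not the authors' model; the Python code and all parameters below are hypothetical), the synchrony constraint can be read as a Hebbian association whose weight updates are gated by the temporal overlap of the auditory and visual signals, with a habituation term that damps the response to repeated presentations:

import numpy as np

# Hypothetical parameters, not taken from the paper.
eta = 0.1        # Hebbian learning rate
tau_hab = 0.05   # habituation rate per presentation
steps = 200      # number of stimulus presentations

def trial(synchronous):
    """One presentation: a word envelope that is either synchronous
    with, or temporally offset from, a visual-movement burst."""
    t = np.arange(50)
    visual = np.exp(-0.5 * ((t - 25) / 4) ** 2)        # movement burst
    shift = 0 if synchronous else 15                   # asynchronous lag
    audio = np.exp(-0.5 * ((t - 25 - shift) / 4) ** 2) # word envelope
    return audio, visual

for synchronous in (True, False):
    w, hab = 0.0, 1.0                  # association weight, habituation gain
    for _ in range(steps):
        audio, visual = trial(synchronous)
        coactivation = float(np.mean(audio * visual))  # temporal overlap
        w += eta * hab * coactivation                  # synchrony-gated Hebb
        hab *= 1.0 - tau_hab                           # response habituates
    print(f"synchronous={synchronous}: learned weight w={w:.3f}")

Under this toy assumption, the co-activation term is large only when the word envelope overlaps the movement burst in time, so the auditory-visual association forms in the synchronous condition and remains weak in the asynchronous one.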
Published in: IEEE Transactions on Autonomous Mental Development (Volume: 3, Issue: 2, June 2011)