Abstract:
Our ability to perceive motion information on the skin is key to manipulating dynamic objects in the environment. Previous studies show that the brain derives tactile motion representations by integrating local cues of the object that impinge on the skin (e.g., speed, intensity, direction), a mechanism known as the Full Vector Average model [1]. This model was derived from studies that placed the hand in the same posture. Yet, object perception and manipulation with the hand (i.e., haptics) is a highly dynamic and goal-directed function. Thus, it is key to study whether tactile motion perception is transformed by hand position, and whether these transformations depend on the reference frame in which the motion judgement is made. Here, we asked human participants to discriminate motion stimuli on the index finger in two reference frames (hand-centric vs. sternum-centric), with the hand placed in different positions. We found that human observers can systematically represent tactile motion under explicitly instructed reference frames. We further showed that tactile motion discriminations can be accurately decoded using a Bayesian generative model.
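As a rough illustration only (not the authors' implementation), the Full Vector Average idea of combining local motion cues into a single perceived motion can be sketched as an intensity-weighted average of local motion vectors. The field names and the intensity-as-weight scheme below are assumptions for the sketch:

```python
import math

def full_vector_average(cues):
    """Combine local motion cues into one perceived motion vector.

    Each cue is a dict with 'speed' (magnitude), 'direction' (radians),
    and 'intensity' (weight) -- these field names are illustrative
    assumptions, not the paper's notation. Returns (speed, direction)
    of the intensity-weighted vector average.
    """
    total_w = sum(c['intensity'] for c in cues)
    vx = sum(c['intensity'] * c['speed'] * math.cos(c['direction'])
             for c in cues) / total_w
    vy = sum(c['intensity'] * c['speed'] * math.sin(c['direction'])
             for c in cues) / total_w
    return math.hypot(vx, vy), math.atan2(vy, vx)

# Two equally intense cues in the same direction average their speeds:
cues = [
    {'speed': 1.0, 'direction': 0.0, 'intensity': 1.0},
    {'speed': 2.0, 'direction': 0.0, 'intensity': 1.0},
]
speed, direction = full_vector_average(cues)  # speed 1.5, direction 0.0
```

Averaging in Cartesian components before converting back to polar form avoids the wrap-around problems of averaging angles directly, which is why vector-average models are typically stated this way.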
Published in: 2021 IEEE World Haptics Conference (WHC)
Date of Conference: 06-09 July 2021
Date Added to IEEE Xplore: 23 August 2021