Abstract:
This paper presents a new shared-control approach for assistive mobile robots, using a Brain-Computer Interface (BCI) as the Human-Machine Interface (HMI). A P300-based paradigm is proposed that allows the selection of brain-actuated commands to steer a Robotic Wheelchair (RW). Operating a conventional HMI requires at least one specific motor skill, such as control of the arms, legs, head or voice; for this reason, conventional HMIs are not suited for people suffering from severe motor disorders. A BCI may open a new communication channel to these users, since it does not require any muscular activity. The number of decoded symbols per minute (SPM) in a BCI is still very low, which means that users can only provide sparse, discrete commands. The RW must therefore rely on its navigation system to validate user commands effectively. A two-layer shared-control approach is proposed. The first, a virtual-constraint layer, is responsible for enabling or disabling user commands based on context restrictions. The second, a user-intent matching layer, is responsible for determining the steering command that best fits the user command, taking the user's competence in steering the wheelchair into account. Experimental results using Robchair, the RW platform developed at ISR-UC [1], [2], are presented, showing the effectiveness of the proposed methodologies.
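
The abstract describes a two-layer shared-control scheme (virtual constraints, then intent matching). The sketch below illustrates that idea in Python; all names, thresholds, the command set, and the competence weighting are hypothetical illustrations under assumed semantics, not the authors' actual Robchair implementation.

from dataclasses import dataclass

# Sparse, discrete commands a P300 BCI might decode (illustrative set).
COMMANDS = ("forward", "left", "right", "stop")

@dataclass
class Context:
    """Local context sensed by the wheelchair's navigation system (assumed)."""
    front_clearance_m: float
    left_clearance_m: float
    right_clearance_m: float

def virtual_constraint_layer(command: str, ctx: Context,
                             min_clearance_m: float = 0.5) -> bool:
    """Layer 1: enable/disable the user command under context restrictions."""
    if command == "forward":
        return ctx.front_clearance_m > min_clearance_m
    if command == "left":
        return ctx.left_clearance_m > min_clearance_m
    if command == "right":
        return ctx.right_clearance_m > min_clearance_m
    return True  # "stop" is always allowed

def intent_matching_layer(command: str, user_competence: float) -> dict:
    """Layer 2: map the enabled command to a steering set-point,
    scaled by an assumed user-competence factor in [0, 1]."""
    base = {"forward": (0.4, 0.0), "left": (0.2, 0.6),
            "right": (0.2, -0.6), "stop": (0.0, 0.0)}
    v, w = base[command]
    # Less competent users receive more conservative (slower) set-points.
    return {"linear_m_s": v * user_competence, "angular_rad_s": w}

def shared_control(command: str, ctx: Context, user_competence: float) -> dict:
    """Combine the two layers: reject unsafe commands, otherwise match intent."""
    if not virtual_constraint_layer(command, ctx):
        return {"linear_m_s": 0.0, "angular_rad_s": 0.0}  # command disabled
    return intent_matching_layer(command, user_competence)

if __name__ == "__main__":
    ctx = Context(front_clearance_m=0.3, left_clearance_m=1.2, right_clearance_m=0.8)
    print(shared_control("forward", ctx, user_competence=0.7))  # rejected: obstacle ahead
    print(shared_control("left", ctx, user_competence=0.7))     # accepted, scaled set-point

In this toy version, the constraint layer only gates commands on clearance, and the intent-matching layer reduces speed for less experienced users; the paper's actual layers operate on richer context and steering models.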
Date of Conference: 25-30 September 2011