
Wearable Visual Robots

  • Original Article

Personal and Ubiquitous Computing

Abstract

Research in wearable visual computing reported in the literature has used exclusively static (or non-active) cameras, making the imagery and image measurements dependent on the wearer’s posture and motions. It is assumed that the camera is pointing in a good direction to view relevant parts of the scene — at best by virtue of being mounted on the wearer’s head, or at worst wholly by chance. Even when pointing in roughly the correct direction, any visual processing relying on feature correspondence from a passive camera is made more difficult by the large, uncontrolled inter-image movements which occur when the wearer moves, or even breathes. This paper presents a wearable active visual sensor which is able to achieve a level of decoupling of camera movement from the wearer’s posture and motions by a combination of inertial and visual sensor feedback and active control. The issues of sensor placement, robot kinematics and their relation to wearability are discussed. The performance of the prototype robot is evaluated for some essential visual tasks. The paper also discusses potential applications for this kind of wearable robot.
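The decoupling the abstract describes can be illustrated with a minimal sketch: an inertial measurement of head rotation drives a feedforward counter-rotation of the camera, while a slow visual feedback term trims residual drift. This is not the authors' actual controller — all function names, the single-axis (yaw-only) simplification, and the gain value are assumptions for illustration only.

```python
# Illustrative sketch (NOT the paper's controller): holding a
# world-fixed gaze by counter-rotating an active camera against the
# wearer's inertially measured head motion, with an optional visual
# feedback term to bleed off gyro drift. Single yaw axis, degrees.

def stabilized_pan_angle(target_world_deg, head_yaw_deg,
                         visual_error_deg=0.0, k_visual=0.1):
    """Pan command (in the head frame) that keeps the camera pointed
    at a world-fixed direction despite head yaw.

    target_world_deg : desired gaze direction in the world frame
    head_yaw_deg     : head yaw measured by an inertial sensor
    visual_error_deg : residual image-space error from feature tracking
    k_visual         : gain on the slow visual correction (assumed)
    """
    # Feedforward: cancel the measured head rotation outright.
    feedforward = target_world_deg - head_yaw_deg
    # Feedback: correct slow drift using the visual measurement.
    return feedforward + k_visual * visual_error_deg


def simulate(head_yaws, target_world_deg=0.0):
    """World-frame gaze at each step for an actively stabilized camera
    versus a rigidly head-mounted (passive) one."""
    active, passive = [], []
    for yaw in head_yaws:
        pan = stabilized_pan_angle(target_world_deg, yaw)
        active.append(yaw + pan)   # compensated: stays on target
        passive.append(yaw)        # passive camera swings with the head
    return active, passive


if __name__ == "__main__":
    # Head sways +/-10 degrees, e.g. as the wearer walks or breathes.
    yaws = [0.0, 5.0, 10.0, 5.0, 0.0, -5.0, -10.0]
    active, passive = simulate(yaws)
    print(active)   # gaze held at 0 deg throughout
    print(passive)  # gaze swings with the head
```

The point of the sketch is the abstract's central claim: with inertial feedforward the camera's world-frame gaze is independent of the wearer's motion, so inter-image movement between frames collapses, which is what makes feature correspondence tractable.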



About this article

Cite this article

Mayol, W., Tordoff, B. & Murray, D. Wearable Visual Robots. Personal and Ubiquitous Computing 6, 37–48 (2002). https://doi.org/10.1007/s007790200004
