Abstract:
In this paper, a nonlinear signal-processing scheme is developed for robotic systems that exploits a joint state-parameter formulation for simultaneous recursive estimation of the states (e.g. joint angles and rates) and uncertain parameters (e.g. inertial and friction parameters) from noisy measurements (e.g. joint angles). Unscented Kalman filtering is employed to overcome restrictions such as linearity in the parameters and the need for joint velocities and accelerations (as in linear recursive least-squares methods), as well as the linearization problems associated with extended Kalman filtering. Because the unscented transform requires only input-output evaluations of the dynamic model, a more general and modular implementation is realizable. This allows computational modeling tools to be used without symbolically deriving or manipulating the equations of motion. The recursive nature of the scheme also allows both offline processing and online implementation. The practical performance of the proposed scheme was verified through an experiment involving a five-bar-linkage-based haptic device configured to render a virtual box. The torque-pair commands generated by the haptic controller to render the virtual box, together with the encoder angle measurements acquired during the experiment, were processed in two different input-output directions: once for state-parameter estimation of the robot, and once for identification of the supposedly unknown environmental parameters. Results demonstrate the effectiveness of the scheme for recursive state-parameter estimation of the robot and the environment, as well as its promising applicability in online settings.
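To illustrate the augmented-state idea described above, the following is a minimal sketch (not the authors' implementation) of joint state-parameter estimation with an unscented Kalman filter. The single-joint dynamics, the augmented state [theta, theta_dot, inertia, viscous friction], the noise covariances, and all function names are illustrative assumptions only; the paper's device is a five-bar linkage with a richer model. Note that only input-output evaluations of the process and measurement functions are needed, with no Jacobians.

import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    # Merwe-style sigma points and weights for mean (wm) and covariance (wc).
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)
    pts = np.vstack([x, x + S.T, x - S.T])          # (2n+1, n) sigma points as rows
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    return pts, wm, wc

def unscented_transform(pts, wm, wc, noise_cov):
    # Weighted mean and covariance of transformed sigma points plus additive noise.
    mean = wm @ pts
    d = pts - mean
    cov = d.T @ (wc[:, None] * d) + noise_cov
    return mean, cov

def f(xa, tau, dt=1e-3):
    # Assumed single-joint process model over the augmented state;
    # the unknown parameters (inertia I, friction b) evolve as constants.
    th, thd, I, b = xa
    thdd = (tau - b * thd) / I
    return np.array([th + dt * thd, thd + dt * thdd, I, b])

def h(xa):
    # Encoder measures the joint angle only (assumption).
    return xa[:1]

def ukf_step(x, P, tau, z, Q, R):
    # One recursive prediction/update step; suitable for offline or online use.
    pts, wm, wc = sigma_points(x, P)
    pred = np.array([f(p, tau) for p in pts])        # propagate each sigma point
    x_pred, P_pred = unscented_transform(pred, wm, wc, Q)
    zs = np.array([h(p) for p in pred])
    z_pred, S = unscented_transform(zs, wm, wc, R)
    Pxz = (pred - x_pred).T @ (wc[:, None] * (zs - z_pred))
    K = Pxz @ np.linalg.inv(S)                       # Kalman gain
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ S @ K.T
    return x_new, P_new

In use, ukf_step would be called once per sample with the commanded torque tau and the measured angle z, so the same recursion serves both batch post-processing of logged data and an online estimator.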
Date of Conference: 25-30 September 2011
Date Added to IEEE Xplore: 05 December 2011