Abstract:
In this paper we address the problem of multi-robot localization with a heterogeneous team of low-cost mobile robots. The team consists of a single centralized observer with an inertial measurement unit (IMU) and monocular camera, and multiple picket robots with only IMUs and red-green-blue (RGB) light-emitting diodes (LEDs). This team cooperatively navigates a visually featureless environment while localizing all robots. A combination of camera imagery captured by the observer and IMU measurements from the pickets and observer is fused to estimate the motion of the team. A team movement strategy, referred to as inchworm, is formulated as follows: pickets move ahead of the observer and then act as temporary landmarks for the observer to follow. This cooperative approach employs a single Extended Kalman Filter (EKF) to localize the entire heterogeneous multi-robot team, using a formulation of the measurement Jacobian that relates the pose of the observer to the poses of the pickets with respect to the global reference frame. An initial experiment with the inchworm strategy has shown localization within 0.14 m position error and 2.18° orientation error over a path length of 5 m in an environment with irregular ground, partial occlusions, and a ramp. This demonstrates an improvement over a camera-only localization technique adapted to our team dynamic, which produced 0.18 m position error and 3.12° orientation error over the same dataset. In addition, we demonstrate improvement in localization accuracy with an increasing number of picket robots.
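As a rough illustration of the joint estimation described in the abstract, the sketch below shows a single EKF correction step in which the observer's camera measures one picket's position relative to itself, and the measurement Jacobian couples the observer pose to that picket's pose in the shared team state. This is a minimal 2D planar simplification; the state layout, function names, and noise values are assumptions for illustration only, not the authors' full 3D IMU/camera formulation.

import numpy as np

def relative_measurement(x, picket_idx):
    """Predicted camera measurement: picket position in the observer frame.

    Assumed state layout: [x_o, y_o, th_o, x_p0, y_p0, x_p1, y_p1, ...].
    """
    xo, yo, th = x[0], x[1], x[2]
    px, py = x[3 + 2 * picket_idx], x[4 + 2 * picket_idx]
    c, s = np.cos(th), np.sin(th)
    # Rotate the global-frame offset into the observer frame.
    dx, dy = px - xo, py - yo
    return np.array([c * dx + s * dy, -s * dx + c * dy])

def measurement_jacobian(x, picket_idx):
    """Jacobian of the relative measurement w.r.t. the full team state."""
    xo, yo, th = x[0], x[1], x[2]
    px, py = x[3 + 2 * picket_idx], x[4 + 2 * picket_idx]
    c, s = np.cos(th), np.sin(th)
    dx, dy = px - xo, py - yo
    H = np.zeros((2, x.size))
    # Derivatives w.r.t. the observer pose.
    H[:, 0] = [-c, s]
    H[:, 1] = [-s, -c]
    H[:, 2] = [-s * dx + c * dy, -c * dx - s * dy]
    # Derivatives w.r.t. the observed picket's position.
    H[0, 3 + 2 * picket_idx] = c
    H[0, 4 + 2 * picket_idx] = s
    H[1, 3 + 2 * picket_idx] = -s
    H[1, 4 + 2 * picket_idx] = c
    return H

def ekf_update(x, P, z, picket_idx, R_meas):
    """Standard EKF correction using one camera observation of one picket."""
    H = measurement_jacobian(x, picket_idx)
    y = z - relative_measurement(x, picket_idx)   # innovation
    S = H @ P @ H.T + R_meas                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new

if __name__ == "__main__":
    # Observer at the origin plus two pickets; the camera observes picket 0.
    x = np.array([0.0, 0.0, 0.0, 1.0, 0.5, 2.0, -0.5])
    P = np.eye(x.size) * 0.1
    z = np.array([1.05, 0.48])                    # simulated camera measurement
    x, P = ekf_update(x, P, z, picket_idx=0, R_meas=np.eye(2) * 0.01)
    print(x)

Because all poses live in one state vector and one covariance matrix, this single update also tightens the estimates of robots that were not directly observed, which is the benefit the cooperative inchworm strategy exploits.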
Date of Conference: 29 May 2017 - 03 June 2017
Date Added to IEEE Xplore: 24 July 2017