Efficient Environmental Context Prediction for Lower Limb Prostheses


Abstract:

Environmental context prediction is important for wearable robotic applications such as terrain-adaptive control. System efficiency is critical for wearable robots, in which system resources (e.g., processors and memory) are highly constrained. This article addresses the system efficiency of real-time environmental context prediction for lower limb prostheses. First, we develop an uncertainty-aware frame selection strategy that dynamically selects frames for environment prediction according to lower limb motion and the uncertainty captured by Bayesian neural networks (BNNs). We further propose a dynamic Bayesian gated recurrent unit (D-BGRU) network to handle the inconsistent frame rate that is a side effect of dynamic frame selection. Second, we investigate how adding sensing modalities (e.g., GPS and an on-glasses camera) to the system affects the tradeoff between computational complexity and environment prediction accuracy. Finally, we implement and optimize our framework for embedded hardware and evaluate its real-time inference accuracy and efficiency in classifying six types of terrain. The experiments show that the proposed frame selection strategy reduces computation by more than 90% without sacrificing environment prediction accuracy, and that it extends easily to multimodal fusion. We achieve around 93% prediction accuracy while processing fewer than one frame per second. Our model contains 6.4 million 16-bit floating-point parameters and takes 44 ms to process each frame on a lightweight embedded platform (NVIDIA Jetson TX2).
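The abstract's core mechanism, processing a camera frame only when limb motion or model uncertainty warrants it, can be illustrated with a short sketch. The Python/PyTorch code below is not the authors' implementation: the network architecture, the MC-dropout approximation of a BNN, the gating thresholds (u_thresh, m_thresh), and the TimeAwareGRUCell (a GRU-D-style stand-in for the paper's D-BGRU) are all illustrative assumptions.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class MCDropoutClassifier(nn.Module):
    """Small CNN terrain classifier with dropout left active at inference,
    a common approximation of a Bayesian neural network (MC dropout)."""
    def __init__(self, num_terrains=6, p=0.3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.drop = nn.Dropout(p)
        self.head = nn.Linear(32, num_terrains)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.head(self.drop(z))

@torch.no_grad()
def predict_with_uncertainty(model, frame, n_samples=10):
    """Average several stochastic forward passes; the predictive entropy
    of the mean distribution serves as the uncertainty signal."""
    model.train()  # keep dropout stochastic at inference time
    probs = torch.stack(
        [F.softmax(model(frame), dim=-1) for _ in range(n_samples)]
    ).mean(0)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(-1)
    return probs, entropy

def should_process_next_frame(entropy, motion_energy,
                              u_thresh=1.0, m_thresh=0.5):
    """Hypothetical gating rule: grab a new frame only when the last
    prediction was uncertain or the lower limb moved enough for the
    terrain to plausibly have changed (thresholds are placeholders)."""
    return entropy.item() > u_thresh or motion_energy > m_thresh

class TimeAwareGRUCell(nn.Module):
    """Toy stand-in for the paper's D-BGRU: decays the hidden state with
    the time elapsed since the previous selected frame (GRU-D style), so
    the recurrence tolerates the irregular frame rate that dynamic frame
    selection induces."""
    def __init__(self, in_dim, hid_dim, tau=1.0):
        super().__init__()
        self.cell = nn.GRUCell(in_dim, hid_dim)
        self.tau = tau

    def forward(self, x, h, dt):
        h = h * math.exp(-dt / self.tau)  # older context counts for less
        return self.cell(x, h)
```

In a loop, one would call predict_with_uncertainty on the latest selected frame, feed the features and the elapsed time into the recurrent cell, and consult should_process_next_frame (with an IMU-derived motion_energy) to decide whether upcoming frames can be skipped; skipping most frames is what yields the reported >90% computation savings.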
Page(s): 3980 - 3994
Date of Publication: 08 June 2021


