TY - JOUR
T1 - Predictive uncertainty estimation using deep learning for soft robot multimodal sensing
AU - Ding, Ze Yang
AU - Loo, Junn Yong
AU - Baskaran, Vishnu Monn
AU - Nurzaman, Surya Girinatha
AU - Tan, Chee Pin
N1 - Funding Information:
Manuscript received September 2, 2020; accepted January 11, 2021. Date of publication February 1, 2021; date of current version February 15, 2021. This letter was recommended for publication by Associate Editor S. Coros and Editor C. Laschi upon evaluation of the reviewers’ comments. This work was supported by the Ministry of Higher Education Malaysia under Grant FRGS/1/2017/ICT02/MUSM/03/3. (Corresponding authors: Surya Girinatha Nurzaman; Chee Pin Tan.) Ze Yang Ding, Junn Yong Loo, Surya Girinatha Nurzaman, and Chee Pin Tan are with the School of Engineering and Advanced Engineering Platform, Monash University Malaysia, Selangor 47500, Malaysia (e-mail: [email protected]; [email protected]; [email protected]; [email protected]).
Publisher Copyright:
© 2021 IEEE.
Copyright:
Copyright 2021 Elsevier B.V., All rights reserved.
PY - 2021/4
Y1 - 2021/4
N2 - The mechanical compliance of soft robots comes at a cost of higher uncertainty in their sensing and perception, which deteriorates the accuracy of predictive models. Predictive uncertainty, which expresses the confidence behind model predictions, is necessary to compensate for the loss of accuracy in soft robot perceptive models. Nevertheless, developing a general framework to capture uncertainties is further challenged by the complex dynamics of soft robots and the difficulties in sensorizing them. In this work, we present a predictive uncertainty estimation framework based on deep learning for soft robot multimodal sensing. We show that the framework can learn to quantify uncertainty and thus is able to express the confidence associated with the predictions during inference. Being data-driven, it is scalable to different types of soft robots and sensor modalities. We demonstrate the framework on a complex multimodal sensing task where a single flex sensor is used to predict the full-body configuration of a soft actuator, as well as the magnitude and location of external contact force. We also discuss how predictive uncertainties are critical to achieve safe learning and model interpretability in soft robotics.
KW - control
KW - deep learning methods
KW - learning for soft robots
KW - modeling
UR - http://www.scopus.com/inward/record.url?scp=85100796876&partnerID=8YFLogxK
U2 - 10.1109/LRA.2021.3056066
DO - 10.1109/LRA.2021.3056066
M3 - Article
AN - SCOPUS:85100796876
SN - 2377-3766
VL - 6
SP - 951
EP - 957
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 2
ER -