TY - JOUR
T1 - DeepTIO
T2 - a deep thermal-inertial odometry with visual hallucination
AU - Saputra, Muhamad Risqi U.
AU - Trigoni, Niki
AU - De Gusmao, Pedro P.B.
AU - Lu, Chris Xiaoxuan
AU - Almalioglu, Yasin
AU - Rosa, Stefano
AU - Chen, Changhao
AU - Wahlstrom, Johan
AU - Wang, Wei
AU - Markham, Andrew
N1 - Funding Information:
Manuscript received September 10, 2019; accepted January 9, 2020. Date of publication January 24, 2020; date of current version February 7, 2020. This letter was recommended for publication by Associate Editor Prof. J. Civera and Editor Prof. S. Behnke upon evaluation of the reviewers' comments. This work was supported in part by the National Institute of Standards and Technology (NIST) Grant "Pervasive, Accurate, and Reliable Location-Based Services for Emergency Responders" under Federal Grant 70NANB17H185. M. R. U. Saputra was supported in part by the Indonesia Endowment Fund for Education (LPDP). (Corresponding author: Muhamad Risqi U. Saputra.) The authors are with the Department of Computer Science, University of Oxford, OX1 3QD Oxford, U.K. (e-mail: muhamad.saputra@cs.ox.ac.uk; pedro.gusmao@cs.ox.ac.uk; luxiaoxuan555@hotmail.com; yasin.almalioglu@cs.ox.ac.uk; stefano.rosa@polito.it; changhao.chen@cs.ox.ac.uk; johan.wahlstrom@cs.ox.ac.uk; wei.wang@cs.ox.ac.uk; andrew.markham@comlab.ox.ac.uk; Niki.Trigoni@comlab.ox.ac.uk).
Publisher Copyright:
© 2016 IEEE.
Copyright:
Copyright 2020 Elsevier B.V., All rights reserved.
PY - 2020/4
Y1 - 2020/4
N2 - Visual odometry shows excellent performance in a wide range of environments. However, in visually-denied scenarios (e.g., heavy smoke or darkness), pose estimates degrade or even fail. Thermal cameras are commonly used for perception and inspection when the environment has low visibility. However, their use in odometry estimation is hampered by the lack of robust visual features. In part, this is because the sensor measures the ambient temperature profile rather than scene appearance and geometry. To overcome this issue, we propose a Deep Neural Network model for thermal-inertial odometry (DeepTIO) by incorporating a visual hallucination network to provide the thermal network with complementary information. The hallucination network is taught to predict fake visual features from thermal images by using Huber loss. We also employ selective fusion to attentively fuse the features from three different modalities, i.e., thermal, hallucination, and inertial features. Extensive experiments are performed on hand-held and mobile robot data in benign and smoke-filled environments, showing the efficacy of the proposed model.
AB - Visual odometry shows excellent performance in a wide range of environments. However, in visually-denied scenarios (e.g., heavy smoke or darkness), pose estimates degrade or even fail. Thermal cameras are commonly used for perception and inspection when the environment has low visibility. However, their use in odometry estimation is hampered by the lack of robust visual features. In part, this is because the sensor measures the ambient temperature profile rather than scene appearance and geometry. To overcome this issue, we propose a Deep Neural Network model for thermal-inertial odometry (DeepTIO) by incorporating a visual hallucination network to provide the thermal network with complementary information. The hallucination network is taught to predict fake visual features from thermal images by using Huber loss. We also employ selective fusion to attentively fuse the features from three different modalities, i.e., thermal, hallucination, and inertial features. Extensive experiments are performed on hand-held and mobile robot data in benign and smoke-filled environments, showing the efficacy of the proposed model.
KW - deep learning in robotics and automation
KW - localization
KW - sensor fusion
KW - thermal-inertial odometry
UR - http://www.scopus.com/inward/record.url?scp=85079768302&partnerID=8YFLogxK
U2 - 10.1109/LRA.2020.2969170
DO - 10.1109/LRA.2020.2969170
M3 - Article
AN - SCOPUS:85079768302
VL - 5
SP - 1672
EP - 1679
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
SN - 2377-3766
IS - 2
ER -