TY - JOUR
T1 - Dual-branch cross-dimensional self-attention-based imputation model for multivariate time series
AU - Fang, Le
AU - Xiang, Wei
AU - Zhou, Yuan
AU - Fang, Juan
AU - Chi, Lianhua
AU - Ge, Zongyuan
N1 - Publisher Copyright:
© 2023 The Authors
PY - 2023/11/4
Y1 - 2023/11/4
N2 - In real-world scenarios, partial loss of information in multivariate time series degrades time series analysis. Hence, time series imputation techniques have been adopted to compensate for missing values. Existing methods focus on investigating temporal correlations, cross-variable correlations, and bidirectional dynamics of time series, and most rely on recurrent neural networks (RNNs) to capture temporal dependency. However, RNN-based models suffer from slow speed and high complexity when dealing with long-term dependencies. While some self-attention-based models without recurrent structures can handle long-term dependencies with parallel computing, they do not fully learn and utilize correlations across the temporal and cross-variable dimensions. To address these limitations, we propose a novel dual-branch cross-dimensional self-attention-based imputation (DCSAI) model for multivariate time series, which performs global and auxiliary cross-dimensional analyses when imputing missing values. In particular, the model contains masked multi-head self-attention-based encoders aligned with auxiliary generators to obtain global and auxiliary correlations in two dimensions; these correlations are then combined into one final representation through three weighted combinations. Extensive experiments show that our model outperforms state-of-the-art benchmark methods on three real-world public datasets under various missing rates. Furthermore, ablation studies demonstrate the efficacy of each component of the model.
AB - In real-world scenarios, partial loss of information in multivariate time series degrades time series analysis. Hence, time series imputation techniques have been adopted to compensate for missing values. Existing methods focus on investigating temporal correlations, cross-variable correlations, and bidirectional dynamics of time series, and most rely on recurrent neural networks (RNNs) to capture temporal dependency. However, RNN-based models suffer from slow speed and high complexity when dealing with long-term dependencies. While some self-attention-based models without recurrent structures can handle long-term dependencies with parallel computing, they do not fully learn and utilize correlations across the temporal and cross-variable dimensions. To address these limitations, we propose a novel dual-branch cross-dimensional self-attention-based imputation (DCSAI) model for multivariate time series, which performs global and auxiliary cross-dimensional analyses when imputing missing values. In particular, the model contains masked multi-head self-attention-based encoders aligned with auxiliary generators to obtain global and auxiliary correlations in two dimensions; these correlations are then combined into one final representation through three weighted combinations. Extensive experiments show that our model outperforms state-of-the-art benchmark methods on three real-world public datasets under various missing rates. Furthermore, ablation studies demonstrate the efficacy of each component of the model.
KW - Deep learning
KW - Missing value imputation
KW - Multivariate time series
KW - Self-attention
UR - http://www.scopus.com/inward/record.url?scp=85169978373&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2023.110896
DO - 10.1016/j.knosys.2023.110896
M3 - Article
AN - SCOPUS:85169978373
SN - 0950-7051
VL - 279
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 110896
ER -