TY - JOUR
T1 - Explainable Tensorized Neural Ordinary Differential Equations for Arbitrary-Step Time Series Prediction
AU - Gao, Penglei
AU - Yang, Xi
AU - Zhang, Rui
AU - Huang, Kaizhu
AU - Goulermas, John Y.
N1 - Publisher Copyright: IEEE
PY - 2023/6/1
Y1 - 2023/6/1
N2 - In this work, we propose a continuous neural network architecture, referred to as the Explainable Tensorized Neural Ordinary Differential Equations (ETN-ODE) network, for multi-step time series prediction at arbitrary time points. Unlike existing approaches, which mainly handle either univariate time series for multi-step prediction or multivariate time series for single-step prediction, ETN-ODE is capable of handling multivariate time series with arbitrary-step predictions. An additional benefit is its tandem attention mechanism, with respect to temporal and variable attention, which enables it to greatly facilitate data interpretability. Specifically, the proposed model combines an explainable tensorized gated recurrent unit with ordinary differential equations, with the derivatives of the latent states parameterized through a neural network. We quantitatively and qualitatively demonstrate the effectiveness and interpretability of ETN-ODE on one arbitrary-step prediction task and five standard multi-step prediction tasks. Extensive experiments show that the proposed method achieves highly accurate predictions at arbitrary time points while attaining very competitive performance against the baseline methods in standard multi-step time series prediction.
AB - In this work, we propose a continuous neural network architecture, referred to as the Explainable Tensorized Neural Ordinary Differential Equations (ETN-ODE) network, for multi-step time series prediction at arbitrary time points. Unlike existing approaches, which mainly handle either univariate time series for multi-step prediction or multivariate time series for single-step prediction, ETN-ODE is capable of handling multivariate time series with arbitrary-step predictions. An additional benefit is its tandem attention mechanism, with respect to temporal and variable attention, which enables it to greatly facilitate data interpretability. Specifically, the proposed model combines an explainable tensorized gated recurrent unit with ordinary differential equations, with the derivatives of the latent states parameterized through a neural network. We quantitatively and qualitatively demonstrate the effectiveness and interpretability of ETN-ODE on one arbitrary-step prediction task and five standard multi-step prediction tasks. Extensive experiments show that the proposed method achieves highly accurate predictions at arbitrary time points while attaining very competitive performance against the baseline methods in standard multi-step time series prediction.
KW - ODEs
KW - Time series prediction
KW - neural networks
KW - tensorized GRU
UR - http://www.scopus.com/inward/record.url?scp=85128664573&partnerID=8YFLogxK
U2 - 10.1109/TKDE.2022.3167536
DO - 10.1109/TKDE.2022.3167536
M3 - Article
AN - SCOPUS:85128664573
SN - 1041-4347
VL - 35
SP - 5837
EP - 5850
JO - IEEE Transactions on Knowledge and Data Engineering
JF - IEEE Transactions on Knowledge and Data Engineering
IS - 6
ER -