Long Memory Gated Recurrent Unit for Time Series Classification

  • Binjie Hong
  • Zhijie Yan
  • Yingxi Chen
  • Xiaobo Jin*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Time series analysis is an important and challenging problem in data mining, where a time series is a class of temporal data objects. In the classification task, the label depends on features from past moments. Because of this time dependency, recurrent neural networks, among the prevalent learning-based architectures, exploit the relations within historical data. The Long Short-Term Memory network (LSTM) and the Gated Recurrent Unit (GRU) are two popular recurrent architectures in deep learning: LSTM introduced a gating mechanism to control short- and long-term historical information, and GRU simplified those gates to obtain more efficient training. In this work, we propose a new model, the Long Memory Gated Recurrent Unit (LMGRU), built on these two remarkable models: a reset gate is introduced to reset the stored cell value of the LSTM cell, while the forget gate and the input gate are omitted. Experimental results on several time series benchmarks show that LMGRU achieves better effectiveness and efficiency than LSTM and GRU.
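The abstract describes the LMGRU cell only at a high level: it keeps an LSTM-style cell state, replaces the forget and input gates with a single reset gate, and that gate resets the stored cell value. The sketch below is one plausible NumPy rendering of such a cell; the exact update equations, parameter names, and the final output nonlinearity are assumptions, not the authors' published formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LMGRUCell:
    """Hypothetical sketch of a Long Memory GRU (LMGRU) cell.

    The paper's abstract states only that a reset gate resets the
    stored cell value and that LSTM's forget/input gates are omitted;
    the update rule below is an illustrative guess at that design.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # reset-gate parameters (assumed names)
        self.W_r = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_r = rng.uniform(-s, s, (hidden_size, hidden_size))
        # candidate-cell parameters (assumed names)
        self.W_c = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_c = rng.uniform(-s, s, (hidden_size, hidden_size))

    def step(self, x, h, c):
        # reset gate: how much of the stored cell value survives
        r = sigmoid(self.W_r @ x + self.U_r @ h)
        # candidate value written into the cell (no separate input gate)
        c_tilde = np.tanh(self.W_c @ x + self.U_c @ h)
        # one gate both resets old memory and admits the candidate
        c_new = r * c + (1.0 - r) * c_tilde
        h_new = np.tanh(c_new)
        return h_new, c_new

# Toy usage: run a short sequence through the cell.
cell = LMGRUCell(input_size=3, hidden_size=4)
h, c = np.zeros(4), np.zeros(4)
for t in range(5):
    h, c = cell.step(np.full(3, 0.1 * t), h, c)
print(h.shape, c.shape)  # (4,) (4,)
```

Tying the reset and write paths to a single gate is what makes the cell cheaper than LSTM (two weight pairs per step instead of four), which is consistent with the efficiency claim in the abstract.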

Original language: English
Article number: 012017
Journal: Journal of Physics: Conference Series
Volume: 2278
Issue number: 1
Publication status: Published - 1 Jun 2022
