TY - GEN
T1 - Long short-term attention
AU - Zhong, Guoqiang
AU - Lin, Xin
AU - Chen, Kang
AU - Li, Qingyang
AU - Huang, Kaizhu
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2020.
PY - 2020
Y1 - 2020
N2 - Attention is an important cognitive process in humans, helping them concentrate on critical information during perception and learning. However, although many machine learning models can remember information from data, they lack an attention mechanism. For example, the long short-term memory (LSTM) network is able to remember sequential information, but it cannot pay special attention to parts of the sequences. In this paper, we present a novel model called long short-term attention (LSTA), which seamlessly integrates the attention mechanism into the inner cell of LSTM. Beyond capturing long short-term dependencies, LSTA can focus on important information in the sequences through the attention mechanism. Extensive experiments demonstrate that LSTA outperforms LSTM and related models on sequence learning tasks.
AB - Attention is an important cognitive process in humans, helping them concentrate on critical information during perception and learning. However, although many machine learning models can remember information from data, they lack an attention mechanism. For example, the long short-term memory (LSTM) network is able to remember sequential information, but it cannot pay special attention to parts of the sequences. In this paper, we present a novel model called long short-term attention (LSTA), which seamlessly integrates the attention mechanism into the inner cell of LSTM. Beyond capturing long short-term dependencies, LSTA can focus on important information in the sequences through the attention mechanism. Extensive experiments demonstrate that LSTA outperforms LSTM and related models on sequence learning tasks.
KW - Attention mechanism
KW - Long short-term attention
KW - Long short-term memory
KW - Machine learning
KW - Sequence learning
UR - http://www.scopus.com/inward/record.url?scp=85080931436&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-39431-8_5
DO - 10.1007/978-3-030-39431-8_5
M3 - Conference Proceeding
AN - SCOPUS:85080931436
SN - 9783030394301
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 45
EP - 54
BT - Advances in Brain Inspired Cognitive Systems - 10th International Conference, BICS 2019, Proceedings
A2 - Ren, Jinchang
A2 - Hussain, Amir
A2 - Zhao, Huimin
A2 - Cai, Jun
A2 - Chen, Rongjun
A2 - Xiao, Yinyin
A2 - Huang, Kaizhu
A2 - Zheng, Jiangbin
PB - Springer
T2 - 10th International Conference on Brain Inspired Cognitive Systems, BICS 2019
Y2 - 13 July 2019 through 14 July 2019
ER -