TY - GEN
T1 - Integrating bi-dynamic routing capsule network with label-constraint for text classification
AU - Guo, Xiang
AU - Wang, Youquan
AU - Gao, Kaiyuan
AU - Cao, Jie
AU - Tao, Haicheng
AU - Chen, Chaoyue
N1 - Funding Information:
This work was supported in part by the National Natural Science Foundation of China (NSFC) under Grant 71701089, Grant 91646204, and the National Center for International Joint Research on E-Business Information Processing under Grant 2013B01035.
Publisher Copyright:
© 2020 IEEE.
PY - 2020/8
Y1 - 2020/8
N2 - Neural-based text classification methods have attracted increasing attention in recent years. Unlike standard text classification methods, neural-based methods perform representation learning and end-to-end training on the text data. Many useful insights can be derived from neural-based text classifiers, as demonstrated by an ever-growing body of work on text mining. However, in the real world, text can be both complex and noisy, which can pose a problem for effective text classification. An effective way to deal with this issue is to incorporate self-attention and capsule networks into text mining solutions. In this paper, we propose a Bi-dynamic routing Capsule Network with Label-constraint (BCNL) model for text classification, which moves beyond the limitations of previous methods by automatically learning the task-relevant and label-relevant words of text. Specifically, we use a Bi-LSTM and self-attention with a position encoder network to learn text embeddings. Meanwhile, we propose a bi-dynamic routing capsule network with label-constraint to adjust the category distribution of text capsules. Through extensive experiments on four datasets, we observe that our method outperforms state-of-the-art baseline methods.
AB - Neural-based text classification methods have attracted increasing attention in recent years. Unlike standard text classification methods, neural-based methods perform representation learning and end-to-end training on the text data. Many useful insights can be derived from neural-based text classifiers, as demonstrated by an ever-growing body of work on text mining. However, in the real world, text can be both complex and noisy, which can pose a problem for effective text classification. An effective way to deal with this issue is to incorporate self-attention and capsule networks into text mining solutions. In this paper, we propose a Bi-dynamic routing Capsule Network with Label-constraint (BCNL) model for text classification, which moves beyond the limitations of previous methods by automatically learning the task-relevant and label-relevant words of text. Specifically, we use a Bi-LSTM and self-attention with a position encoder network to learn text embeddings. Meanwhile, we propose a bi-dynamic routing capsule network with label-constraint to adjust the category distribution of text capsules. Through extensive experiments on four datasets, we observe that our method outperforms state-of-the-art baseline methods.
KW - Bi-Dynamic Routing
KW - Capsule Network
KW - Self-Attention
KW - Text Classification
UR - http://www.scopus.com/inward/record.url?scp=85092531149&partnerID=8YFLogxK
U2 - 10.1109/ICBK50248.2020.00011
DO - 10.1109/ICBK50248.2020.00011
M3 - Conference Proceeding
AN - SCOPUS:85092531149
T3 - Proceedings - 11th IEEE International Conference on Knowledge Graph, ICKG 2020
SP - 4
EP - 11
BT - Proceedings - 11th IEEE International Conference on Knowledge Graph, ICKG 2020
A2 - Chen, Enhong
A2 - Antoniou, Grigoris
A2 - Wu, Xindong
A2 - Kumar, Vipin
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 11th IEEE International Conference on Knowledge Graph, ICKG 2020
Y2 - 9 August 2020 through 11 August 2020
ER -