TY - GEN
T1 - Neural Abstractive Summarization
T2 - 3rd IEEE International Conference on Computer Communication and Artificial Intelligence, CCAI 2023
AU - Qian, Lu
AU - Zhang, Haiyang
AU - Wang, Wei
AU - Liu, Dawei
AU - Huang, Xin
N1 - Funding Information:
This research is partially supported by the 2022 Jiangsu Science and Technology Programme (General Programme), contract number BK20221260.
Publisher Copyright:
© 2023 IEEE.
PY - 2023/8/3
Y1 - 2023/8/3
N2 - Due to the development of neural networks, abstractive summarization has received more attention than extractive summarization and has made significant progress in generating fluent, human-like summaries with novel expressions. Seq2seq has become the primary framework for abstractive summarization, employing encoder-decoder architectures based on RNNs, CNNs, or Transformers. In this paper, we focus on reviewing neural models based on the seq2seq framework for abstractive summarization. Moreover, we discuss some of the most effective techniques for improving seq2seq models and point out two challenging directions, i.e., generating query-based abstractive summaries and incorporating commonsense knowledge, for in-depth investigation.
AB - Due to the development of neural networks, abstractive summarization has received more attention than extractive summarization and has made significant progress in generating fluent, human-like summaries with novel expressions. Seq2seq has become the primary framework for abstractive summarization, employing encoder-decoder architectures based on RNNs, CNNs, or Transformers. In this paper, we focus on reviewing neural models based on the seq2seq framework for abstractive summarization. Moreover, we discuss some of the most effective techniques for improving seq2seq models and point out two challenging directions, i.e., generating query-based abstractive summaries and incorporating commonsense knowledge, for in-depth investigation.
KW - abstractive summarization
KW - neural network
KW - pre-trained models
KW - seq2seq
KW - Transformer
UR - http://www.scopus.com/inward/record.url?scp=85169296943&partnerID=8YFLogxK
U2 - 10.1109/CCAI57533.2023.10201274
DO - 10.1109/CCAI57533.2023.10201274
M3 - Conference Proceeding
AN - SCOPUS:85169296943
T3 - 2023 IEEE 3rd International Conference on Computer Communication and Artificial Intelligence, CCAI 2023
SP - 50
EP - 58
BT - 2023 IEEE 3rd International Conference on Computer Communication and Artificial Intelligence, CCAI 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 26 May 2023 through 28 May 2023
ER -