TY - JOUR
T1 - Leveraging sensory knowledge into Text-to-Text Transfer Transformer for enhanced emotion analysis
AU - Zhao, Qingqing
AU - Xia, Yuhan
AU - Long, Yunfei
AU - Xu, Ge
AU - Wang, Jia
N1 - Publisher Copyright:
© 2024 The Author(s)
PY - 2025/1
Y1 - 2025/1
N2 - This study proposes an innovative model (i.e., SensoryT5), which integrates sensory knowledge into the T5 (Text-to-Text Transfer Transformer) framework for emotion classification tasks. By embedding sensory knowledge within the T5 model's attention mechanism, SensoryT5 not only enhances the model's contextual understanding but also elevates its sensitivity to the nuanced interplay between sensory information and emotional states. Experiments on four emotion classification datasets, three sarcasm classification datasets, one subjectivity analysis dataset, and one opinion classification dataset (ranging from binary to 32-class tasks) demonstrate that our model significantly outperforms state-of-the-art baseline models (including the baseline T5 model). Specifically, SensoryT5 achieves a maximal improvement of 3.0% in both accuracy and the F1 score for emotion classification. In sarcasm classification tasks, the model surpasses the baseline models by a maximal increase of 1.2% in accuracy and 1.1% in the F1 score. Furthermore, SensoryT5 demonstrates superior performance in both subjectivity analysis and opinion classification, with increases of 0.6% in both accuracy and the F1 score for the subjectivity analysis task, and increases of 0.4% in accuracy and 0.6% in the F1 score for the opinion classification task, compared to the second-best models. These improvements underscore the significant potential of leveraging cognitive resources to deepen NLP models’ comprehension of emotional nuances and suggest an interdisciplinary research direction between NLP and neuro-cognitive science.
AB - This study proposes an innovative model (i.e., SensoryT5), which integrates sensory knowledge into the T5 (Text-to-Text Transfer Transformer) framework for emotion classification tasks. By embedding sensory knowledge within the T5 model's attention mechanism, SensoryT5 not only enhances the model's contextual understanding but also elevates its sensitivity to the nuanced interplay between sensory information and emotional states. Experiments on four emotion classification datasets, three sarcasm classification datasets, one subjectivity analysis dataset, and one opinion classification dataset (ranging from binary to 32-class tasks) demonstrate that our model significantly outperforms state-of-the-art baseline models (including the baseline T5 model). Specifically, SensoryT5 achieves a maximal improvement of 3.0% in both accuracy and the F1 score for emotion classification. In sarcasm classification tasks, the model surpasses the baseline models by a maximal increase of 1.2% in accuracy and 1.1% in the F1 score. Furthermore, SensoryT5 demonstrates superior performance in both subjectivity analysis and opinion classification, with increases of 0.6% in both accuracy and the F1 score for the subjectivity analysis task, and increases of 0.4% in accuracy and 0.6% in the F1 score for the opinion classification task, compared to the second-best models. These improvements underscore the significant potential of leveraging cognitive resources to deepen NLP models’ comprehension of emotional nuances and suggest an interdisciplinary research direction between NLP and neuro-cognitive science.
KW - Attention mechanism
KW - Emotion analysis
KW - Pre-trained language model
KW - Sensory knowledge
UR - http://www.scopus.com/inward/record.url?scp=85203016181&partnerID=8YFLogxK
U2 - 10.1016/j.ipm.2024.103876
DO - 10.1016/j.ipm.2024.103876
M3 - Article
AN - SCOPUS:85203016181
SN - 0306-4573
VL - 62
JO - Information Processing and Management
JF - Information Processing and Management
IS - 1
M1 - 103876
ER -