RoBERTa-GRU: A Hybrid Deep Learning Model for Enhanced Sentiment Analysis

Kian Long Tan, Chin Poo Lee*, Kian Ming Lim

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

19 Citations (Scopus)

Abstract

This paper proposes a novel hybrid model for sentiment analysis. The model leverages the strengths of both the Transformer architecture, represented by the Robustly Optimized BERT Pretraining Approach (RoBERTa), and the Recurrent Neural Network, represented by Gated Recurrent Units (GRU). The RoBERTa model projects the texts into a discriminative embedding space through its attention mechanism, while the GRU model captures the long-range dependencies of the embeddings and mitigates the vanishing gradient problem. To overcome the challenge of imbalanced datasets in sentiment analysis, this paper also proposes data augmentation with word embeddings, over-sampling the minority classes. This enhances the representation capacity of the model, making it more robust and accurate on the sentiment classification task. The proposed RoBERTa-GRU model was evaluated on three widely used sentiment analysis datasets: IMDb, Sentiment140, and Twitter US Airline Sentiment. The model achieved an accuracy of 94.63% on IMDb, 89.59% on Sentiment140, and 91.52% on Twitter US Airline Sentiment. These results demonstrate the effectiveness of the proposed RoBERTa-GRU hybrid model in sentiment analysis.
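The class-balancing step described in the abstract can be illustrated by over-sampling the minority classes until every class reaches the majority-class size. The sketch below is a minimal stand-in, not the paper's exact procedure: the paper pairs over-sampling with word-embedding-based augmentation, whereas this example simply duplicates randomly chosen minority examples, and all function and variable names are assumptions.

```python
import random
from collections import Counter

def oversample_minority(texts, labels, seed=0):
    """Duplicate minority-class examples until every class matches the
    majority class size. (Hypothetical helper; the paper additionally
    augments the duplicated texts via word embeddings.)"""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())  # majority-class size

    # Group texts by their sentiment label.
    by_class = {}
    for text, label in zip(texts, labels):
        by_class.setdefault(label, []).append(text)

    out_texts, out_labels = [], []
    for label, items in by_class.items():
        out_texts.extend(items)
        out_labels.extend([label] * len(items))
        # Top up the minority classes by sampling with replacement.
        extra = target - len(items)
        if extra > 0:
            out_texts.extend(rng.choices(items, k=extra))
            out_labels.extend([label] * extra)
    return out_texts, out_labels
```

For example, a batch with three positive and one negative review comes back with three of each, after which the balanced set would be fed to the RoBERTa-GRU pipeline.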

Original language: English
Article number: 3915
Journal: Applied Sciences (Switzerland)
Volume: 13
Issue number: 6
Publication status: Published - Mar 2023
Externally published: Yes

Keywords

  • deep learning
  • GRU
  • RoBERTa
  • sentiment analysis
  • Transformer

