MPNet-GRUs: Sentiment Analysis With Masked and Permuted Pre-Training for Language Understanding and Gated Recurrent Units

Nicole Kai Ning Loh, Chin Poo Lee*, Thian Song Ong, Kian Ming Lim

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Sentiment analysis, a pivotal task in natural language processing, aims to discern the opinions and emotions expressed in text. Existing methods, however, face challenges such as data scarcity, complex language patterns, and long-range dependencies. In this paper, we propose MPNet-GRUs, a hybrid deep learning model that integrates three key components: MPNet, BiGRU, and GRU. MPNet, a transformer-based pre-trained language model, enhances language understanding through masked and permuted pre-training. BiGRU and GRU, both gated recurrent neural networks, capture long-range dependencies bidirectionally and unidirectionally, respectively. By combining the strengths of these models, MPNet-GRUs provides a more effective and efficient solution for sentiment analysis. Evaluation on three benchmark datasets shows the superior performance of MPNet-GRUs: 94.71% on IMDb, 86.27% on Twitter US Airline Sentiment, and 88.17% on Sentiment140, demonstrating its potential to advance sentiment analysis.
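
The abstract describes the hybrid pipeline but not its exact configuration. The following is a minimal PyTorch sketch of one plausible reading: the Hugging Face "microsoft/mpnet-base" checkpoint as the encoder, a single BiGRU layer feeding a single unidirectional GRU, and a linear classification head. The hidden size, layer counts, and pooling choice are illustrative assumptions, not the paper's reported settings.

    # Minimal sketch of the MPNet-GRUs architecture (assumptions noted above).
    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    class MPNetGRUs(nn.Module):
        def __init__(self, num_classes: int = 2, hidden_size: int = 128):
            super().__init__()
            # Pre-trained MPNet encoder (masked and permuted pre-training).
            self.encoder = AutoModel.from_pretrained("microsoft/mpnet-base")
            # BiGRU captures long-range dependencies in both directions.
            self.bigru = nn.GRU(
                input_size=self.encoder.config.hidden_size,
                hidden_size=hidden_size,
                batch_first=True,
                bidirectional=True,
            )
            # Unidirectional GRU refines the BiGRU output sequence.
            self.gru = nn.GRU(
                input_size=2 * hidden_size,  # BiGRU concatenates both directions
                hidden_size=hidden_size,
                batch_first=True,
            )
            self.classifier = nn.Linear(hidden_size, num_classes)

        def forward(self, input_ids, attention_mask):
            # Contextual token embeddings from MPNet: (batch, seq_len, 768).
            hidden = self.encoder(
                input_ids=input_ids, attention_mask=attention_mask
            ).last_hidden_state
            out, _ = self.bigru(hidden)
            _, h_n = self.gru(out)           # h_n: (1, batch, hidden_size)
            return self.classifier(h_n[-1])  # sentiment logits

    # Example usage on a single review (binary sentiment assumed):
    tokenizer = AutoTokenizer.from_pretrained("microsoft/mpnet-base")
    model = MPNetGRUs(num_classes=2)
    batch = tokenizer(["The film was wonderful."], return_tensors="pt")
    logits = model(batch["input_ids"], batch["attention_mask"])

Taking the final GRU hidden state as the sentence representation is one common design choice here; mean-pooling the GRU outputs would be an equally reasonable alternative.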

Original language: English
Article number: 10510290
Pages (from-to): 74069-74080
Number of pages: 12
Journal: IEEE Access
Volume: 12
Publication status: Published - 2024
Externally published: Yes

Keywords

  • BiGRU
  • GRU
  • MPNet
  • sentiment
  • sentiment analysis
  • transformer
