TY - GEN
T1 - Representation of Words over Vectors in Recurrent Convolutional Attention Architecture for Sentiment Analysis
AU - Abid, Fazeel
AU - Chen, Li
AU - Alam, Muhammad
AU - Abid, Adnan
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/11
Y1 - 2019/11
N2 - Subjectivity in text is the concern of sentiment analysis, which requires natural language processing (NLP) techniques to capture the informal way people communicate. Many NLP tasks capture sentimental and contextual information through distributed representations. Neural networks (NNs) such as recurrent and convolutional neural networks have achieved remarkable results in text classification. However, the few architectures that learn over vector representations of words, using a recurrent network to capture long-term dependencies and a convolutional network with a max-pooling layer for feature extraction and dimensionality reduction, still fail to capture enough syntactic and semantic regularities for sentiment analysis. This paper strengthens sentiment analysis by exploring unsupervised learning of word vector representations such as Word2Vec and GloVe; the acquired word vectors are then fed into the proposed architecture, which combines an RNN and a CNN with an attention mechanism and is referred to as the Recurrent Convolutional Attention Architecture (RCAA). Experiments show that unsupervised learning of word representations into well-trained vectors captures syntactic and semantic similarities and sentiment at adequate computational cost, and that the combined neural architecture achieves 83.62% accuracy with Word2Vec and 85.72% with GloVe, compared with 79.97% with random initialization, on the Rotten Tomatoes test dataset.
AB - Subjectivity in text is the concern of sentiment analysis, which requires natural language processing (NLP) techniques to capture the informal way people communicate. Many NLP tasks capture sentimental and contextual information through distributed representations. Neural networks (NNs) such as recurrent and convolutional neural networks have achieved remarkable results in text classification. However, the few architectures that learn over vector representations of words, using a recurrent network to capture long-term dependencies and a convolutional network with a max-pooling layer for feature extraction and dimensionality reduction, still fail to capture enough syntactic and semantic regularities for sentiment analysis. This paper strengthens sentiment analysis by exploring unsupervised learning of word vector representations such as Word2Vec and GloVe; the acquired word vectors are then fed into the proposed architecture, which combines an RNN and a CNN with an attention mechanism and is referred to as the Recurrent Convolutional Attention Architecture (RCAA). Experiments show that unsupervised learning of word representations into well-trained vectors captures syntactic and semantic similarities and sentiment at adequate computational cost, and that the combined neural architecture achieves 83.62% accuracy with Word2Vec and 85.72% with GloVe, compared with 79.97% with random initialization, on the Rotten Tomatoes test dataset.
KW - Distributed Representation
KW - GloVe
KW - RCAA
KW - Recurrent and Convolutional Neural Network
KW - Sentiment Analysis
KW - Word2Vec
UR - http://www.scopus.com/inward/record.url?scp=85079239618&partnerID=8YFLogxK
U2 - 10.1109/ICIC48496.2019.8966730
DO - 10.1109/ICIC48496.2019.8966730
M3 - Conference Proceeding
AN - SCOPUS:85079239618
T3 - 3rd International Conference on Innovative Computing, ICIC 2019
BT - 3rd International Conference on Innovative Computing, ICIC 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 3rd International Conference on Innovative Computing, ICIC 2019
Y2 - 1 November 2019 through 2 November 2019
ER -