Attention-based word embeddings using Artificial Bee Colony algorithm for aspect-level sentiment classification

Ming Zhang, Vasile Palade, Yan Wang, Zhicheng Ji

Research output: Contribution to journal › Article › peer-review

Abstract

Considering that most popular models for aspect-level sentiment classification mainly focus on designing complicated neural networks to scale the importance of each word in the sentence, this paper addresses the problem from the perspective of the semantic space. Motivated by the fact that the senses of a word can be embedded into the semantic space using a distributed representation, this paper hypothesizes that each sense of a word can be represented by one or more specific dimensions, so that aspect-level sentiment classification can be simplified into searching for the dimensions related to the aspects and sentiments concerned. Specifically, an Attention Vector (ATV) based on the attention mechanism is designed for each aspect in a given task; it consists of two sub-vectors, a Dimension Attention Vector (DATV) and a Sentiment Attention Vector (SATV). The DATV determines the significance of different dimensions based on their correlations with an aspect, while the SATV allocates weights for the attributes of words, which are determined by sentiment polarities and part-of-speech (PoS) tags. Given a sub-dataset related to an aspect, the ATV is optimized by an Artificial Bee Colony (ABC) algorithm with a Support Vector Machine (SVM) classifier, with the objective of maximizing classification accuracy. Intrinsically, the DATV reduces the ambiguity caused by polysemy, while the SATV serves as an auxiliary means for optimizing the DATV and helps eliminate the misunderstandings caused by antonyms. The optimized DATV is then applied in a Convolutional Neural Network (CNN) model by simply scaling the pretrained word embeddings used as inputs (the resulting model is named ATV-CNN). Experimental results show that the ATV-CNN model offers substantial advantages over state-of-the-art models.
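The abstract's core operation, scaling pretrained word embeddings dimension-wise by an optimized DATV before feeding them to the CNN, can be sketched as follows. This is a minimal illustration under assumed shapes and placeholder values; the variable names (`embeddings`, `datv`, `scaled`) are hypothetical and not taken from the paper's code.

```python
import random

random.seed(0)
vocab_size, emb_dim = 5, 8

# Pretrained word embeddings: one row per word (placeholder values here).
embeddings = [[random.gauss(0, 1) for _ in range(emb_dim)]
              for _ in range(vocab_size)]

# Hypothetical Dimension Attention Vector (DATV): one weight per embedding
# dimension, as would be produced by the ABC/SVM optimization for an aspect.
datv = [random.uniform(0.0, 1.0) for _ in range(emb_dim)]

# Element-wise scaling: dimensions correlated with the aspect keep larger
# weights, the rest are suppressed. The scaled matrix is the CNN input.
scaled = [[w * e for w, e in zip(datv, row)] for row in embeddings]
```

The scaling itself is a single broadcasted multiplication, which is why the abstract describes the integration with the CNN as "simply scaling the pretrained word embeddings as inputs".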

Original language: English
Pages (from-to): 713-738
Number of pages: 26
Journal: Information Sciences
Volume: 545
Early online date: 28 Sep 2020
DOIs
Publication status: E-pub ahead of print - 28 Sep 2020

Keywords

  • Artificial Bee Colony algorithm
  • Aspect-level sentiment classification
  • Attention mechanism
  • Support Vector Machine
  • Word embeddings

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Theoretical Computer Science
  • Computer Science Applications
  • Information Systems and Management
  • Artificial Intelligence
