Recurrent Entity Networks with Delayed Memory Update for Targeted Aspect-based Sentiment Analysis

Published:

Fei Liu, Trevor Cohn and Timothy Baldwin (2018) Recurrent Entity Networks with Delayed Memory Update for Targeted Aspect-based Sentiment Analysis. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, New Orleans, USA, pp. 278–283.

@InProceedings{Liu+:2018,
  author    = {Liu, Fei and Cohn, Trevor and Baldwin, Timothy},
  title     = {Recurrent Entity Networks with Delayed Memory Update for Targeted Aspect-based Sentiment Analysis},
  booktitle = {Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies},
  year      = {2018},
  address   = {New Orleans, USA},
  pages     = {278--283}
}

Abstract

While neural networks have been shown to achieve impressive results for sentence-level sentiment analysis, targeted aspect-based sentiment analysis (TABSA), the extraction of fine-grained opinion polarity with respect to a pre-defined set of aspects, remains a difficult task. Motivated by recent advances in memory-augmented models for machine reading, we propose a novel architecture, utilising external “memory chains” with a delayed memory update mechanism to track entities independently. On the task of TABSA, the proposed model demonstrates substantial improvements over state-of-the-art approaches, including those using external knowledge bases.
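
To make the mechanism concrete, below is a minimal, self-contained sketch (in PyTorch) of one EntNet-style memory chain with a delayed write, roughly in the spirit of the abstract. It is not the authors' released implementation: the class and parameter names (DelayedMemoryChain, hidden_dim, key, pending) are invented for illustration, the gate and candidate follow the standard Recurrent Entity Network update (Henaff et al., 2017), and the "delay" is realised here, as one possible reading, by holding each candidate update back for one token before applying it.

# Illustrative sketch only: not the authors' released code.
# One EntNet-style memory chain keyed to a single target entity; the
# candidate update computed at token t is applied at token t+1 (a simple
# one-step "delayed" write, assumed here for illustration).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DelayedMemoryChain(nn.Module):
    def __init__(self, hidden_dim):
        super().__init__()
        self.U = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.V = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.W = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, tokens, key):
        # tokens: (seq_len, hidden_dim) encoded sentence; key: (hidden_dim,) target embedding
        memory = key.clone()                  # memory initialised from the target key
        pending = torch.zeros_like(memory)    # update computed at t, written at t+1
        pending_gate = torch.zeros(())
        for s_t in tokens:
            # 1. apply the update held back from the previous token
            memory = F.normalize(memory + pending_gate * pending, dim=-1)
            # 2. content- and key-based write gate for the current token
            gate = torch.sigmoid(torch.dot(s_t, memory) + torch.dot(s_t, key))
            # 3. candidate memory, as in a standard Recurrent Entity Network cell
            candidate = torch.tanh(self.U(memory) + self.V(key) + self.W(s_t))
            # 4. hold the write back instead of applying it immediately
            pending, pending_gate = candidate, gate
        # flush the final pending write before reading the memory out
        return F.normalize(memory + pending_gate * pending, dim=-1)

# Usage (shapes only; a real model would encode tokens and targets properly):
chain = DelayedMemoryChain(hidden_dim=64)
sentence = torch.randn(12, 64)   # 12 encoded tokens
target = torch.randn(64)         # embedding of the target entity, e.g. "LOCATION1"
final_memory = chain(sentence, target)  # would feed a per-aspect sentiment classifier

In the full model, one such chain would be maintained per tracked entity, with the final memories read out by a per-aspect sentiment classifier, consistent with the abstract's description of tracking entities independently.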