Stacked residual recurrent neural network with word weight for text classification

Wei Cao, Anping Song, Takayuki Furuzuki

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to give state-of-the-art performance on some text classification tasks. However, most existing methods assume that each word in a sentence contributes equal importance, which differs from the real world. For example, in sentiment analysis the word "awesome" is much more important than any other word in the sentence "This movie is awesome". Motivated by this deficiency and in order to achieve further performance gains, in this paper a Stacked Residual RNN with Word Weight method is proposed: we extend the stacked RNN to a deep one with a residual network architecture and introduce a word-weight-based network to account for the weight of each word. Our proposed method is able to learn the hierarchical meaning of each word in a sentence and to consider the weight of each word for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.

Original language: English
Pages (from-to): 277-284
Number of pages: 8
Journal: IAENG International Journal of Computer Science
Volume: 44
Issue number: 3
Publication status: Published - 2017 Aug 1

Keywords

  • Long Short-Term Memory
  • Recurrent Neural Networks
  • Residual Networks
  • Text classification
  • Word weight

ASJC Scopus subject areas

  • Computer Science (all)