Abstract
Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to give state-of-the-art performance on some text classification tasks. However, most existing methods assume that every word in a sentence contributes equally to its meaning, which does not hold in practice. For example, in sentiment analysis the word "awesome" is far more important than any other word in the sentence "This movie is awesome". Motivated by this deficiency and in order to achieve further performance gains, this paper proposes a Stacked Residual RNN with Word Weight method: we extend the stacked RNN to a deep network with a residual architecture and introduce a word-weight-based network that accounts for the importance of each word. The proposed method is able to learn the hierarchical meaning of each word in a sentence and to weight each word for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.
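For concreteness, the sketch below illustrates the kind of architecture the abstract describes: a stack of recurrent layers with an identity (residual) connection around each layer, followed by a word-weight network that scores every token and pools a weighted sentence representation for classification. This is a minimal sketch assuming a PyTorch implementation; the choice of LSTM cells, the layer sizes, and the softmax-normalized scoring are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch (assumed PyTorch realization, not the paper's exact model):
# stacked recurrent layers with residual connections + a word-weight network.
import torch
import torch.nn as nn

class StackedResidualRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_layers, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.input_proj = nn.Linear(embed_dim, hidden_dim)
        # One LSTM per layer so a residual (identity) connection can be
        # added around each layer's transformation.
        self.layers = nn.ModuleList(
            nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
            for _ in range(num_layers)
        )
        # Word-weight network: scores each time step; scores are
        # softmax-normalized over the sentence (an assumption here).
        self.word_weight = nn.Linear(hidden_dim, 1)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):                    # tokens: (batch, seq_len)
        x = self.input_proj(self.embed(tokens))   # (batch, seq_len, hidden_dim)
        for rnn in self.layers:
            out, _ = rnn(x)
            x = x + out                           # residual connection
        weights = torch.softmax(self.word_weight(x).squeeze(-1), dim=1)
        sentence = (weights.unsqueeze(-1) * x).sum(dim=1)  # weighted pooling
        return self.classifier(sentence)
```

The residual addition `x + out` lets gradients flow directly through a deep stack of recurrent layers, while the learned per-token weights realize the per-word importance the abstract motivates with the "awesome" example.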
| Original language | English |
|---|---|
| Pages (from-to) | 277-284 |
| Number of pages | 8 |
| Journal | IAENG International Journal of Computer Science |
| Volume | 44 |
| Issue number | 3 |
| Publication status | Published - 2017 Aug 1 |
Keywords
- Long Short-Term Memory
- Recurrent Neural Networks
- Residual Networks
- Text classification
- Word weight
ASJC Scopus subject areas
- Computer Science (all)