Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to achieve state-of-the-art performance on some text classification tasks. However, most existing methods assume that each word in a sentence contributes equally, which differs from the real world. For example, in sentiment analysis, the word "awesome" is far more important than any other word in the sentence "This movie is awesome". Motivated by this deficiency, and to further improve performance, this paper proposes a Stacked Residual RNN with Word Weight method: we extend the stacked RNN into a deep network with a residual architecture and introduce a word-weight-based network that accounts for the importance of each word. The proposed method learns the hierarchical meaning of each word in a sentence and considers each word's weight for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.
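The two ideas in the abstract, residual connections between stacked RNN layers and a learned per-word weight used to pool the sequence, can be illustrated with a minimal numpy sketch. This is not the authors' implementation; all names (`rnn_layer`, `W1`, `v`, etc.) and the simple tanh-RNN cell and softmax scoring are illustrative assumptions, chosen only to show the shape of the computation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
d = 8   # embedding / hidden size (illustrative)
T = 5   # sentence length in words

# toy word embeddings for a 5-word sentence
x = rng.normal(size=(T, d))

# a plain RNN layer: h_t = tanh(W x_t + U h_{t-1})
def rnn_layer(inputs, W, U):
    h = np.zeros(d)
    outs = []
    for t in range(inputs.shape[0]):
        h = np.tanh(W @ inputs[t] + U @ h)
        outs.append(h)
    return np.stack(outs)

# stacked residual RNN: each layer's output is added to its input,
# so gradients can flow through the identity path in deep stacks
W1, U1 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
W2, U2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
h1 = rnn_layer(x, W1, U1) + x    # residual connection, layer 1
h2 = rnn_layer(h1, W2, U2) + h1  # residual connection, layer 2

# word-weight network: score each position, softmax into weights,
# and pool the sequence as a weighted sum (v is a hypothetical
# learned scoring vector)
v = rng.normal(size=d)
weights = softmax(h2 @ v)   # one weight per word, sums to 1
sentence_vec = weights @ h2  # weighted pooling into a sentence vector
```

In a real model the weights `W1`, `U1`, `W2`, `U2`, and `v` would be trained end-to-end with the classifier; here they are random only to show how a word such as "awesome" could receive a larger pooling weight than its neighbors.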
|Journal||IAENG International Journal of Computer Science|
|Publication status||Published - Aug 1, 2017|
ASJC Scopus subject areas
- Computer Science (all)