Stacked residual recurrent neural network with word weight for text classification

Wei Cao, Anping Song, Takayuki Furuzuki

Research output: Contribution to journal › Article

5 Citations (Scopus)

Abstract

Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to give state-of-the-art performance on some text classification tasks. However, most existing methods assume that every word in a sentence contributes equally to its meaning, which is not the case in practice. For example, in sentiment analysis the word "awesome" is much more important than any other word in the sentence "This movie is awesome". Motivated by this deficiency, and in order to improve performance further, this paper proposes a Stacked Residual RNN with Word Weight method: we extend the stacked RNN to a deep network with a residual architecture and introduce a word-weight-based network that accounts for the importance of each word. The proposed method is able to learn the hierarchical meaning of each word in a sentence and to weight each word for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.
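The abstract describes two components: a stack of recurrent layers with residual (identity shortcut) connections, and a word-weighting mechanism that pools per-word hidden states by their importance. As an illustration only — the paper's actual model uses trained LSTM units and a learned weight network, none of which appear in this record — a toy forward pass over those two ideas might be sketched as follows; every name, dimension, and score below is hypothetical.

```python
import math
import random

random.seed(0)


def tanh_vec(v):
    return [math.tanh(x) for x in v]


class SimpleRNNLayer:
    """One recurrent layer: h_t = tanh(W_x x_t + W_h h_{t-1})."""

    def __init__(self, dim):
        self.dim = dim
        self.w_x = [[random.uniform(-0.1, 0.1) for _ in range(dim)] for _ in range(dim)]
        self.w_h = [[random.uniform(-0.1, 0.1) for _ in range(dim)] for _ in range(dim)]

    def step(self, x, h):
        return tanh_vec([
            sum(self.w_x[i][j] * x[j] for j in range(self.dim))
            + sum(self.w_h[i][j] * h[j] for j in range(self.dim))
            for i in range(self.dim)
        ])

    def run(self, xs):
        h = [0.0] * self.dim
        outs = []
        for x in xs:
            h = self.step(x, h)
            outs.append(h)
        return outs


def stacked_residual_rnn(xs, layers):
    """Stack layers; each layer's output is added to its input (identity shortcut)."""
    for layer in layers:
        ys = layer.run(xs)
        xs = [[a + b for a, b in zip(x, y)] for x, y in zip(xs, ys)]
    return xs


def word_weighted_sum(states, scores):
    """Softmax the per-word importance scores, then weight-sum the hidden states."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    dim = len(states[0])
    pooled = [sum(w, 0.0) for w in ([wt * h[i] for wt, h in zip(weights, states)] for i in range(dim))]
    return pooled, weights


# Toy sentence of 4 words, 3-dim embeddings; scores stand in for the
# paper's word-weight network (the last word is scored most important).
dim = 3
sentence = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(4)]
layers = [SimpleRNNLayer(dim) for _ in range(2)]
states = stacked_residual_rnn(sentence, layers)
scores = [0.1, 0.2, 0.1, 2.0]
sent_vec, weights = word_weighted_sum(states, scores)
```

In the paper the per-word scores would themselves be produced by a trained network rather than supplied by hand; the residual addition is what lets the stack grow deep without the usual degradation.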

Original language: English
Pages (from-to): 277-284
Number of pages: 8
Journal: IAENG International Journal of Computer Science
Volume: 44
Issue number: 3
Publication status: Published - 2017 Aug 1

Keywords

  • Long Short-Term Memory
  • Recurrent Neural Networks
  • Residual Networks
  • Text classification
  • Word weight

ASJC Scopus subject areas

  • Computer Science(all)

Cite this

Stacked residual recurrent neural network with word weight for text classification. / Cao, Wei; Song, Anping; Furuzuki, Takayuki.

In: IAENG International Journal of Computer Science, Vol. 44, No. 3, 01.08.2017, p. 277-284.

Research output: Contribution to journal › Article

@article{21f83bcc10444bd49f4b40459b4e8b58,
title = "Stacked residual recurrent neural network with word weight for text classification",
abstract = "Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to give state-of-the-art performance on some text classification tasks. However, most existing methods assume that every word in a sentence contributes equally to its meaning, which is not the case in practice. For example, in sentiment analysis the word {"}awesome{"} is much more important than any other word in the sentence {"}This movie is awesome{"}. Motivated by this deficiency, and in order to improve performance further, this paper proposes a Stacked Residual RNN with Word Weight method: we extend the stacked RNN to a deep network with a residual architecture and introduce a word-weight-based network that accounts for the importance of each word. The proposed method is able to learn the hierarchical meaning of each word in a sentence and to weight each word for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.",
keywords = "Long Short-Term Memory, Recurrent Neural Networks, Residual Networks, Text classification, Word weight",
author = "Wei Cao and Anping Song and Takayuki Furuzuki",
year = "2017",
month = "8",
day = "1",
language = "English",
volume = "44",
pages = "277--284",
journal = "IAENG International Journal of Computer Science",
issn = "1819-656X",
publisher = "International Association of Engineers",
number = "3",

}

TY - JOUR

T1 - Stacked residual recurrent neural network with word weight for text classification

AU - Cao, Wei

AU - Song, Anping

AU - Furuzuki, Takayuki

PY - 2017/8/1

Y1 - 2017/8/1

N2 - Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to give state-of-the-art performance on some text classification tasks. However, most existing methods assume that every word in a sentence contributes equally to its meaning, which is not the case in practice. For example, in sentiment analysis the word "awesome" is much more important than any other word in the sentence "This movie is awesome". Motivated by this deficiency, and in order to improve performance further, this paper proposes a Stacked Residual RNN with Word Weight method: we extend the stacked RNN to a deep network with a residual architecture and introduce a word-weight-based network that accounts for the importance of each word. The proposed method is able to learn the hierarchical meaning of each word in a sentence and to weight each word for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.

AB - Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to give state-of-the-art performance on some text classification tasks. However, most existing methods assume that every word in a sentence contributes equally to its meaning, which is not the case in practice. For example, in sentiment analysis the word "awesome" is much more important than any other word in the sentence "This movie is awesome". Motivated by this deficiency, and in order to improve performance further, this paper proposes a Stacked Residual RNN with Word Weight method: we extend the stacked RNN to a deep network with a residual architecture and introduce a word-weight-based network that accounts for the importance of each word. The proposed method is able to learn the hierarchical meaning of each word in a sentence and to weight each word for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.

KW - Long Short-Term Memory

KW - Recurrent Neural Networks

KW - Residual Networks

KW - Text classification

KW - Word weight

UR - http://www.scopus.com/inward/record.url?scp=85028065923&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85028065923&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:85028065923

VL - 44

SP - 277

EP - 284

JO - IAENG International Journal of Computer Science

JF - IAENG International Journal of Computer Science

SN - 1819-656X

IS - 3

ER -