Use of a sparse structure to improve learning performance of recurrent neural networks

Hiromitsu Awano, Shun Nishide, Hiroaki Arie, Jun Tani, Toru Takahashi, Hiroshi G. Okuno, Tetsuya Ogata

Research output: Conference contribution

3 Citations (Scopus)

Abstract

The objective of our study is to find out how a sparse structure affects the performance of a recurrent neural network (RNN). Only a few existing studies have dealt with sparse structure in RNNs trained with learning algorithms such as Back Propagation Through Time (BPTT). In this paper, we propose an RNN with sparse connections trained by BPTT, called the Multiple Timescale RNN (MTRNN). We then investigated how sparse connections affect generalization performance and noise robustness. In experiments using data composed of alphabetic sequences, the MTRNN showed the best generalization performance when the connection rate was 40%. We also measured the sparseness of neural activity and found that it corresponds to generalization performance. These results mean that sparse connections improve learning performance and that the sparseness of neural activity can be used as a metric of generalization performance.
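The paper itself is not reproduced here, so the following is only a minimal sketch (NumPy, with hypothetical variable names) of the kind of fixed-mask sparse connectivity the abstract describes: a random binary mask at a 40% connection rate is applied to the recurrent weight matrix and to its gradients, so pruned connections stay at zero throughout BPTT training. The sparseness measure shown is one common definition (Treves-Rolls); the measure used in the paper may differ.

```python
import numpy as np

# Minimal sketch (not the authors' code): a plain recurrent layer whose
# recurrent weights are constrained by a fixed random binary mask so that
# only a fraction of connections (the "connection rate", here 40%) exist.
rng = np.random.default_rng(0)

n_units = 50
connection_rate = 0.4  # 40% of recurrent connections are kept

# Fixed sparse connectivity mask over the recurrent weight matrix.
mask = (rng.random((n_units, n_units)) < connection_rate).astype(float)

# Recurrent weights; pruned entries are forced to zero from the start.
W_rec = rng.normal(0.0, 0.1, (n_units, n_units)) * mask

def step(h, x, W_in, W_rec):
    """One forward step of a simple tanh RNN with masked recurrent weights."""
    return np.tanh(W_in @ x + W_rec @ h)

def apply_grad(W_rec, grad, lr=0.01):
    """Gradient update that respects the sparse structure:
    masked entries receive no update and remain exactly zero."""
    return W_rec - lr * (grad * mask)

def sparseness(h, eps=1e-12):
    """Population sparseness of neural activity (Treves-Rolls definition),
    used here only as an illustrative metric."""
    a = np.abs(h)
    n = a.size
    return (1 - (a.sum() / n) ** 2 / ((a ** 2).sum() / n + eps)) / (1 - 1 / n)
```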

Original language: English
Title of host publication: Neural Information Processing - 18th International Conference, ICONIP 2011, Proceedings
Pages: 323-331
Number of pages: 9
PART 3
DOI
Publication status: Published - 2011
Externally published: Yes
Event: 18th International Conference on Neural Information Processing, ICONIP 2011 - Shanghai, China
Duration: 13 Nov 2011 - 17 Nov 2011

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 3
Volume: 7064 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 18th International Conference on Neural Information Processing, ICONIP 2011
Country: China
City: Shanghai
Period: 11/11/13 - 11/11/17

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
