Use of a sparse structure to improve learning performance of recurrent neural networks

Hiromitsu Awano*, Shun Nishide, Hiroaki Arie, Jun Tani, Toru Takahashi, Hiroshi G. Okuno, Tetsuya Ogata

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

The objective of our study is to find out how a sparse structure affects the performance of a recurrent neural network (RNN). Only a few existing studies have dealt with sparse structures in RNNs trained with algorithms such as Back Propagation Through Time (BPTT). In this paper, we propose an RNN with sparse connections trained with BPTT, called the Multiple Timescale RNN (MTRNN). We then investigated how sparse connectivity affects generalization performance and noise robustness. In experiments using data composed of alphabetic sequences, the MTRNN showed the best generalization performance at a connection rate of 40%. We also measured the sparseness of neural activity and found that it corresponds to generalization performance. These results mean that sparse connections improve learning performance and that the sparseness of neural activity can be used as a metric of generalization performance.
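As a rough illustration of the idea described in the abstract, the sketch below realizes a sparse recurrent structure as a fixed binary connectivity mask applied to the recurrent weight matrix, at the 40% connection rate the abstract reports as best. This is a minimal NumPy sketch under stated assumptions, not the paper's actual MTRNN: all names and the masking scheme are illustrative, and the multiple-timescale dynamics of the MTRNN are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 50
connection_rate = 0.4  # best-performing rate reported in the abstract

# Fixed binary mask: each recurrent connection is kept with probability 0.4
# (an assumed way to realize the "sparse connection"; the paper may differ).
mask = (rng.random((n_neurons, n_neurons)) < connection_rate).astype(float)

# Scale initial weights by the expected fan-in of the sparse matrix.
W = rng.normal(0.0, 1.0 / np.sqrt(n_neurons * connection_rate),
               (n_neurons, n_neurons))

def step(h, x):
    """One recurrent update; the mask zeroes out pruned connections."""
    return np.tanh((W * mask) @ h + x)

# During BPTT, the gradient of W would be multiplied by the same mask,
# so pruned connections stay at exactly zero throughout training.

h = np.zeros(n_neurons)
for t in range(10):
    x = rng.normal(0.0, 0.1, n_neurons)
    h = step(h, x)

density = mask.mean()  # close to the nominal 0.4 connection rate
```

Because the mask is fixed before training and reapplied to both the forward pass and the gradient, the network's topology stays sparse while ordinary BPTT updates the surviving weights.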

Original language: English
Title of host publication: Neural Information Processing - 18th International Conference, ICONIP 2011, Proceedings
Pages: 323-331
Number of pages: 9
Edition: PART 3
DOIs
Publication status: Published - 2011
Externally published: Yes
Event: 18th International Conference on Neural Information Processing, ICONIP 2011 - Shanghai, China
Duration: 2011 Nov 13 - 2011 Nov 17

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 3
Volume: 7064 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 18th International Conference on Neural Information Processing, ICONIP 2011
Country/Territory: China
City: Shanghai
Period: 11/11/13 - 11/11/17

Keywords

  • Recurrent Neural Networks
  • Sparse Coding
  • Sparse Structure

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
