Improving generalization ability of universal learning networks with superfluous parameters

Min Han*, Kotaro Hirasawa, Jinglu Hu, Junichi Murata, Chunzhi Jin

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

1 Citation (Scopus)

Abstract

The parameters in a large-scale neural network can be divided into two classes: one class is necessary for a given purpose, while the other is not directly needed. The parameters in the latter class are defined as superfluous parameters. How to use these superfluous parameters effectively is an interesting subject. In this paper, it is studied how the generalization ability of networks modeling dynamic systems can be improved by making use of the network's superfluous parameters, and a calculation technique is proposed that uses second-order derivatives of the criterion function with respect to the superfluous parameters. To investigate the effectiveness of the proposed method, simulations of modeling a nonlinear robot dynamics system are studied. The simulation results show that the proposed method is useful for improving the generalization ability of neural networks that model nonlinear dynamic systems.
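The abstract gives only a high-level description of the technique, so the paper's exact formulation is not reproduced here. As a minimal, hypothetical sketch of the idea, assuming the second-order derivatives enter as a flatness penalty on the training criterion, the Python fragment below splits a toy feedforward network's parameters (a plain stand-in for a Universal Learning Network) into "necessary" and "superfluous" sets and adds the squared diagonal second derivatives with respect to the superfluous parameters to the criterion. All names (`criterion`, `second_derivs`, the parameter split, the penalty weight `lam`) are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch only: the paper's exact criterion is not given in the
# abstract. Here a curvature (second-derivative) penalty on the superfluous
# parameters is added to the training criterion, driving the criterion
# surface to be flat along those directions. Finite differences are used
# purely for clarity, not efficiency.
import numpy as np

rng = np.random.default_rng(0)

def predict(theta, X):
    # Toy feedforward stand-in for a Universal Learning Network:
    # y = w2 . tanh(W1 x), with 4 hidden units and 5 inputs.
    W1 = theta[:20].reshape(4, 5)
    w2 = theta[20:24]
    return np.tanh(X @ W1.T) @ w2

def criterion(theta, X, y):
    # Mean-squared-error criterion function J(theta).
    e = predict(theta, X) - y
    return 0.5 * np.mean(e ** 2)

def second_derivs(theta, X, y, idx, h=1e-3):
    # Diagonal second derivatives d^2 J / d theta_i^2 for i in idx,
    # by central finite differences.
    J0 = criterion(theta, X, y)
    out = np.empty(len(idx))
    for k, i in enumerate(idx):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += h
        tm[i] -= h
        out[k] = (criterion(tp, X, y) - 2.0 * J0 + criterion(tm, X, y)) / h**2
    return out

def num_grad(f, theta, h=1e-5):
    # Central-difference gradient of a scalar function f(theta).
    g = np.empty_like(theta)
    for i in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += h
        tm[i] -= h
        g[i] = (f(tp) - f(tm)) / (2.0 * h)
    return g

X = rng.normal(size=(64, 5))
y = np.sin(X[:, 0])                # toy target depending on one input only
theta = rng.normal(scale=0.3, size=24)
superfluous = np.r_[10:20, 22:24]  # illustrative split: last two hidden units
lam = 0.1                          # assumed penalty weight

def regularized(theta):
    # J(theta) + lam * sum_i (d^2 J / d theta_i^2)^2 over superfluous i.
    pen = np.sum(second_derivs(theta, X, y, superfluous) ** 2)
    return criterion(theta, X, y) + lam * pen

for step in range(200):
    theta -= 0.05 * num_grad(regularized, theta)

print("final criterion:", criterion(theta, X, y))
```

Penalizing curvature along the superfluous directions pushes the network toward a flat region of the criterion surface, one standard mechanism by which generalization improves; whether this matches the authors' exact use of the second-order derivatives cannot be confirmed from the abstract alone.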

Original language: English
Pages (from-to): V-407 - V-412
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Volume: 5
Publication status: Published - 1999
Externally published: Yes
Event: 1999 IEEE International Conference on Systems, Man, and Cybernetics 'Human Communication and Cybernetics' - Tokyo, Japan
Duration: 1999 Oct 12 - 1999 Oct 15

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Hardware and Architecture
