Enhancing the generalization ability of neural networks by using Gram-Schmidt orthogonalization algorithm

W. Wan, K. Hirasawa*, J. Hu, J. Murata

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

2 Citations (Scopus)

Abstract

The generalization ability of neural networks is the most important criterion for determining whether an algorithm is powerful. Many new algorithms have been devised to enhance the generalization ability of neural networks [1][2]. In this paper, a new algorithm is proposed that applies the Gram-Schmidt orthogonalization algorithm [3] to the outputs of the nodes in the hidden layers, with the aim of reducing the interference among those nodes; it is considerably more efficient than regularizer-based methods. Simulation results confirm this assertion.
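
The abstract does not reproduce the authors' training procedure, but the core operation it names is classical Gram-Schmidt orthogonalization applied to hidden-layer output vectors. The sketch below is a minimal illustration of that idea, assuming the hidden-unit outputs are collected over a batch of samples and orthogonalized column by column; all names and shapes are illustrative, not taken from the paper.

```python
# Minimal sketch (not the authors' code): apply classical Gram-Schmidt to the
# columns of a hidden-layer activation matrix so the hidden units' output
# vectors over a batch become mutually orthogonal (reduced interference).
import numpy as np

def gram_schmidt(H):
    """Orthogonalize the columns of H (shape: samples x hidden units)."""
    Q = np.zeros_like(H, dtype=float)
    for j in range(H.shape[1]):
        v = H[:, j].astype(float)
        # Subtract the projections onto the already-orthonormalized columns.
        for i in range(j):
            v -= (Q[:, i] @ H[:, j]) * Q[:, i]
        norm = np.linalg.norm(v)
        Q[:, j] = v / norm if norm > 1e-12 else v
    return Q

# Toy example: 5 samples passed through a 4-unit hidden layer.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))      # inputs
W = rng.normal(size=(3, 4))      # input-to-hidden weights (hypothetical)
H = np.tanh(X @ W)               # hidden-layer outputs
Q = gram_schmidt(H)
print(np.round(Q.T @ Q, 3))      # approximately the identity matrix
```

In this reading, the orthogonalized activations carry non-overlapping information across hidden units, which is the interference reduction the abstract refers to; how the orthogonalization interacts with weight updates during training is described in the full paper, not here.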

Original language: English
Pages: 1721-1726
Number of pages: 6
Publication status: Published - 2001 Jan 1
Externally published: Yes
Event: International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States
Duration: 2001 Jul 15 - 2001 Jul 19

Conference

Conference: International Joint Conference on Neural Networks (IJCNN'01)
Country/Territory: United States
City: Washington, DC
Period: 01/7/15 - 01/7/19

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
