Enhancing the generalization ability of neural networks through controlling the hidden layers

Weishui Wan, Shingo Mabu, Kaoru Shimada, Kotaro Hirasawa, Takayuki Furuzuki

Research output: Article

28 Citations (Scopus)

Abstract

In this paper we propose two new variants of the backpropagation algorithm. Their common point is that the outputs of the nodes in the hidden layers are controlled, with the aim of solving the moving-target problem and the distributed-weights problem. The first algorithm (AlgoRobust) is not very sensitive to noise in the data; the second (AlgoGS) uses the Gauss-Schmidt algorithm to determine, in each epoch, which weight should be updated, while the other weights are kept unchanged in that epoch. In this way better generalization can be obtained. Some theoretical explanations are also provided. In addition, simulation comparisons are made between a Gaussian regularizer, optimal brain damage (OBD), and the proposed algorithms. The simulation results confirm that the proposed algorithms perform better than the Gaussian regularizer, and that AlgoRobust performs better than AlgoGS on noisy data. On the other hand, AlgoGS performs better than AlgoRobust on noise-free data, and the final structure obtained by the two new algorithms is comparable to that obtained by OBD.
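The abstract describes two mechanisms without giving formulas: controlling the outputs of the hidden-layer nodes, and updating only one selected weight per epoch while freezing the rest. The sketch below is a rough illustration only, not the paper's method: the hidden-output control is modeled as an assumed L2 penalty on the hidden activations, and the paper's Gauss-Schmidt selection is replaced by a simpler largest-gradient rule. The network, toy data, and every name in the code (backprop, lam, etc.) are invented for this example.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data with additive noise (illustrative only).
X = rng.uniform(-1.0, 1.0, size=(64, 2))
y = np.sin(X[:, :1] + X[:, 1:]) + 0.05 * rng.normal(size=(64, 1))

W1 = rng.normal(scale=0.5, size=(2, 6))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(6, 1))   # hidden -> output weights
lr, lam = 0.1, 1e-3                       # step size; hidden-penalty weight (assumed)

def backprop(W1, W2):
    """Gradients of MSE + lam * mean_n ||H_n||^2 (assumed penalty form)."""
    H = np.tanh(X @ W1)                   # hidden-layer outputs
    out = H @ W2
    err = out - y
    d_out = 2.0 * err / len(X)            # d(MSE)/d(out)
    dW2 = H.T @ d_out
    d_H = d_out @ W2.T + 2.0 * lam * H / len(X)   # penalty term "controls" H
    dW1 = X.T @ (d_H * (1.0 - H ** 2))    # tanh'(a) = 1 - tanh(a)^2
    return dW1, dW2, float(np.mean(err ** 2))

for epoch in range(2000):
    dW1, dW2, mse = backprop(W1, W2)
    # Update only the single weight with the largest gradient magnitude and
    # freeze the rest -- a crude stand-in for the paper's Gauss-Schmidt-based
    # choice of which weight to update in each epoch.
    g = np.concatenate([dW1.ravel(), dW2.ravel()])
    k = int(np.argmax(np.abs(g)))
    if k < dW1.size:
        W1.flat[k] -= lr * g[k]
    else:
        W2.flat[k - dW1.size] -= lr * g[k]
    if epoch % 500 == 0:
        print(f"epoch {epoch:4d}  mse {mse:.4f}")

For context, OBD (one of the compared baselines) prunes the weight with the smallest saliency, conventionally s_k = (1/2) H_kk w_k^2 computed from the diagonal of the Hessian; the sketch above does not implement pruning.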

Original language: English
Pages (from-to): 404-414
Number of pages: 11
Journal: Applied Soft Computing Journal
Volume: 9
Issue number: 1
DOI: 10.1016/j.asoc.2008.01.013
Publication status: Published - Jan 2009

Fingerprint

  • Neural networks
  • Brain
  • Backpropagation algorithms

ASJC Scopus subject areas

  • Software

Cite this

Enhancing the generalization ability of neural networks through controlling the hidden layers. / Wan, Weishui; Mabu, Shingo; Shimada, Kaoru; Hirasawa, Kotaro; Furuzuki, Takayuki.

In: Applied Soft Computing Journal, Vol. 9, No. 1, 01.2009, p. 404-414.

Research output: Article

Wan, Weishui ; Mabu, Shingo ; Shimada, Kaoru ; Hirasawa, Kotaro ; Furuzuki, Takayuki. / Enhancing the generalization ability of neural networks through controlling the hidden layers. In: Applied Soft Computing Journal. 2009 ; Vol. 9, No. 1. pp. 404-414.
@article{e9140a88fb174d238e5c0320ef93f977,
title = "Enhancing the generalization ability of neural networks through controlling the hidden layers",
abstract = "In this paper we propose two new variants of the backpropagation algorithm. Their common point is that the outputs of the nodes in the hidden layers are controlled, with the aim of solving the moving-target problem and the distributed-weights problem. The first algorithm (AlgoRobust) is not very sensitive to noise in the data; the second (AlgoGS) uses the Gauss-Schmidt algorithm to determine, in each epoch, which weight should be updated, while the other weights are kept unchanged in that epoch. In this way better generalization can be obtained. Some theoretical explanations are also provided. In addition, simulation comparisons are made between a Gaussian regularizer, optimal brain damage (OBD), and the proposed algorithms. The simulation results confirm that the proposed algorithms perform better than the Gaussian regularizer, and that AlgoRobust performs better than AlgoGS on noisy data. On the other hand, AlgoGS performs better than AlgoRobust on noise-free data, and the final structure obtained by the two new algorithms is comparable to that obtained by OBD.",
keywords = "Gaussian-Schmidt algorithm, Generalization, Optimal brain damage (OBD), Regularizer, Universal learning networks",
author = "Weishui Wan and Shingo Mabu and Kaoru Shimada and Kotaro Hirasawa and Takayuki Furuzuki",
year = "2009",
month = "1",
doi = "10.1016/j.asoc.2008.01.013",
language = "English",
volume = "9",
pages = "404--414",
journal = "Applied Soft Computing",
issn = "1568-4946",
publisher = "Elsevier BV",
number = "1",

}

TY - JOUR

T1 - Enhancing the generalization ability of neural networks through controlling the hidden layers

AU - Wan, Weishui

AU - Mabu, Shingo

AU - Shimada, Kaoru

AU - Hirasawa, Kotaro

AU - Furuzuki, Takayuki

PY - 2009/1

Y1 - 2009/1

N2 - In this paper we propose two new variants of the backpropagation algorithm. Their common point is that the outputs of the nodes in the hidden layers are controlled, with the aim of solving the moving-target problem and the distributed-weights problem. The first algorithm (AlgoRobust) is not very sensitive to noise in the data; the second (AlgoGS) uses the Gauss-Schmidt algorithm to determine, in each epoch, which weight should be updated, while the other weights are kept unchanged in that epoch. In this way better generalization can be obtained. Some theoretical explanations are also provided. In addition, simulation comparisons are made between a Gaussian regularizer, optimal brain damage (OBD), and the proposed algorithms. The simulation results confirm that the proposed algorithms perform better than the Gaussian regularizer, and that AlgoRobust performs better than AlgoGS on noisy data. On the other hand, AlgoGS performs better than AlgoRobust on noise-free data, and the final structure obtained by the two new algorithms is comparable to that obtained by OBD.


KW - Gaussian-Schmidt algorithm

KW - Generalization

KW - Optimal brain damage (OBD)

KW - Regularizer

KW - Universal learning networks

UR - http://www.scopus.com/inward/record.url?scp=53749105298&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=53749105298&partnerID=8YFLogxK

U2 - 10.1016/j.asoc.2008.01.013

DO - 10.1016/j.asoc.2008.01.013

M3 - Article

AN - SCOPUS:53749105298

VL - 9

SP - 404

EP - 414

JO - Applied Soft Computing

JF - Applied Soft Computing

SN - 1568-4946

IS - 1

ER -