Relation between weight initialization of neural networks and pruning algorithms: case study on Mackey-Glass time series

W. Wan, K. Hirasawa, Takayuki Furuzuki, J. Murata

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

The implementation of weight initialization is directly related to the convergence of learning algorithms. In this paper we present a case study on the well-known Mackey-Glass time series problem in order to explore the relation between the weight initialization of neural networks and pruning algorithms. The pruning algorithm used in the simulations is the Laplace regularizer method, that is, the backpropagation algorithm with a Laplace regularizer added to the criterion function. Simulation results show that different kinds of initial weight matrices yield almost the same generalization ability when the pruning algorithm is used, at least for the Mackey-Glass time series.
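The abstract's method — backpropagation with a Laplace (L1) regularizer added to the criterion function, which drives redundant weights toward zero so they can be pruned — can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' code: the network size, window length, regularization strength, learning rate, and pruning threshold below are all assumed for illustration.

```python
import numpy as np

def mackey_glass(n, tau=17, beta=0.2, gamma=0.1, p=10, dt=1.0, x0=1.2):
    """Euler integration of the Mackey-Glass delay differential equation:
    dx/dt = beta * x(t - tau) / (1 + x(t - tau)^p) - gamma * x(t)."""
    x = np.full(n + tau, x0)
    for t in range(tau, n + tau - 1):
        x[t + 1] = x[t] + dt * (beta * x[t - tau] / (1 + x[t - tau] ** p)
                                - gamma * x[t])
    return x[tau:]

rng = np.random.default_rng(0)
series = mackey_glass(600)

# One-step-ahead prediction from a window of the 4 most recent samples
# (window length is an assumption, not taken from the paper).
win = 4
X = np.stack([series[i:i + win] for i in range(len(series) - win)])
y = series[win:]

# Small one-hidden-layer MLP; the paper studies different weight
# initializations -- here we just use one Gaussian initialization.
hidden = 8
W1 = rng.normal(0.0, 0.5, (win, hidden))
W2 = rng.normal(0.0, 0.5, (hidden, 1))
lam, lr = 1e-4, 0.01  # L1 strength and learning rate (assumed values)

for epoch in range(200):
    h = np.tanh(X @ W1)            # hidden activations
    pred = (h @ W2).ravel()        # network output
    err = pred - y
    # Gradient of MSE + lam * sum|w|: the Laplace regularizer
    # contributes lam * sign(w) to each weight's gradient.
    gW2 = h.T @ err[:, None] / len(y) + lam * np.sign(W2)
    gh = (err[:, None] @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ gh / len(y) + lam * np.sign(W1)
    W1 -= lr * gW1
    W2 -= lr * gW2

# Pruning step: the L1 penalty shrinks unneeded weights toward zero,
# so weights below a small threshold are removed (set to zero).
mask = np.abs(W1) > 1e-3
pruned_W1 = W1 * mask
```

The paper's finding is that, with this kind of regularized pruning, the final generalization ability is largely insensitive to which initial weight matrix the loop above starts from.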

Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Pages: 1750-1755
Number of pages: 6
Volume: 3
Publication status: Published - 2001
Externally published: Yes
Event: International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC
Duration: 2001 Jul 15 - 2001 Jul 19



Keywords

  • Mackey-Glass time series
  • Pruning methods
  • Weight initialization

ASJC Scopus subject areas

  • Software

Cite this

Wan, W., Hirasawa, K., Furuzuki, T., & Murata, J. (2001). Relation between weight initialization of neural networks and pruning algorithms: case study on Mackey-Glass time series. In Proceedings of the International Joint Conference on Neural Networks (Vol. 3, pp. 1750-1755).

