New method to prune the neural network

Weishui Wan, Kotaro Hirasawa, Takayuki Furuzuki, Chunzhi Jin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Training neural networks with the backpropagation (BP) algorithm is standard practice in both theory and applications. However, BP yields a distributed weight representation: the weight matrix of the trained network is usually not sparse, which prevents its use for discovering rules about the functional relations between input and output data. Some form of structure optimization is therefore needed. With this in mind, this paper proposes a new method for pruning neural networks based on statistical quantities of the network. Compared with known pruning methods such as structural learning with forgetting (SLF) and the RPROP algorithm, the proposed method attains comparable or better results without a noticeable increase in computational load. Detailed simulations on the Iris data set support this claim.
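
This record does not spell out which statistical quantities the authors prune on, so the following Python sketch is only illustrative: it trains a small network on Iris with plain backpropagation (scikit-learn's MLPClassifier, an assumption, not the authors' implementation) and prunes connections using an assumed relevance statistic, the variance of each connection's contribution over the training set. The SLF and RPROP baselines are not reproduced here.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Load Iris and standardize the inputs so plain SGD/backpropagation behaves.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Train a small network with plain backpropagation (SGD).
net = MLPClassifier(hidden_layer_sizes=(5,), solver="sgd",
                    max_iter=2000, random_state=0)
net.fit(X_train, y_train)
print("test accuracy before pruning:", net.score(X_test, y_test))

# Assumed relevance statistic (NOT the paper's criterion): the variance of
# each input-to-hidden connection's contribution w_ij * x_i over the
# training set. Connections whose contribution barely varies are pruned.
W1 = net.coefs_[0]                                   # shape (4, 5)
relevance = np.var(X_train[:, :, None] * W1[None, :, :], axis=0)

# Zero out the least relevant 30% of connections and re-check accuracy.
threshold = np.quantile(relevance, 0.3)
net.coefs_[0] = np.where(relevance < threshold, 0.0, W1)
print("test accuracy after pruning:", net.score(X_test, y_test))

In this kind of pruning loop the threshold quantile is the main knob: too aggressive a cut degrades accuracy, while a mild cut sparsifies the weight matrix enough to make input-output rules easier to read off.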

Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Place of publication: Piscataway, NJ, United States
Publisher: IEEE
Pages: 449-454
Number of pages: 6
Volume: 6
Publication status: Published - 2000
Externally published: Yes
Event: International Joint Conference on Neural Networks (IJCNN'2000) - Como, Italy
Duration: 2000 Jul 24 - 2000 Jul 27

Other

Other: International Joint Conference on Neural Networks (IJCNN'2000)
City: Como, Italy
Period: 00/7/24 - 00/7/27

Fingerprint

  • Backpropagation algorithms
  • Neural networks

ASJC Scopus subject areas

  • Software

Cite this

Wan, W., Hirasawa, K., Furuzuki, T., & Jin, C. (2000). New method to prune the neural network. In Proceedings of the International Joint Conference on Neural Networks (Vol. 6, pp. 449-454). Piscataway, NJ, United States: IEEE.

@inproceedings{9543e586c71d4ad0b374f77bf8c1b60c,
title = "New method to prune the neural network",
abstract = "Training neural networks with the backpropagation (BP) algorithm is standard practice in both theory and applications. However, BP yields a distributed weight representation: the weight matrix of the trained network is usually not sparse, which prevents its use for discovering rules about the functional relations between input and output data. Some form of structure optimization is therefore needed. With this in mind, this paper proposes a new method for pruning neural networks based on statistical quantities of the network. Compared with known pruning methods such as structural learning with forgetting (SLF) and the RPROP algorithm, the proposed method attains comparable or better results without a noticeable increase in computational load. Detailed simulations on the Iris data set support this claim.",
author = "Weishui Wan and Kotaro Hirasawa and Takayuki Furuzuki and Chunzhi Jin",
year = "2000",
language = "English",
volume = "6",
pages = "449--454",
booktitle = "Proceedings of the International Joint Conference on Neural Networks",
publisher = "IEEE",

}

TY - GEN

T1 - New method to prune the neural network

AU - Wan, Weishui

AU - Hirasawa, Kotaro

AU - Furuzuki, Takayuki

AU - Jin, Chunzhi

PY - 2000

Y1 - 2000

N2 - Training neural networks with the backpropagation (BP) algorithm is standard practice in both theory and applications. However, BP yields a distributed weight representation: the weight matrix of the trained network is usually not sparse, which prevents its use for discovering rules about the functional relations between input and output data. Some form of structure optimization is therefore needed. With this in mind, this paper proposes a new method for pruning neural networks based on statistical quantities of the network. Compared with known pruning methods such as structural learning with forgetting (SLF) and the RPROP algorithm, the proposed method attains comparable or better results without a noticeable increase in computational load. Detailed simulations on the Iris data set support this claim.

AB - Training neural networks with the backpropagation (BP) algorithm is standard practice in both theory and applications. However, BP yields a distributed weight representation: the weight matrix of the trained network is usually not sparse, which prevents its use for discovering rules about the functional relations between input and output data. Some form of structure optimization is therefore needed. With this in mind, this paper proposes a new method for pruning neural networks based on statistical quantities of the network. Compared with known pruning methods such as structural learning with forgetting (SLF) and the RPROP algorithm, the proposed method attains comparable or better results without a noticeable increase in computational load. Detailed simulations on the Iris data set support this claim.

UR - http://www.scopus.com/inward/record.url?scp=0033703422&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033703422&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0033703422

VL - 6

SP - 449

EP - 454

BT - Proceedings of the International Joint Conference on Neural Networks

PB - IEEE

CY - Piscataway, NJ, United States

ER -