Multiple descent cost competitive learning

Batch and successive self-organization with excitatory and inhibitory connections

Yasuo Matsuyama

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Novel general algorithms for multiple-descent cost-competitive learning are presented. These algorithms self-organize neural networks and possess the following features: optimal grouping of applied vector inputs, product form of neurons, neural topologies, excitatory and inhibitory connections, fair competitive bias, oblivion, winner-take-quota rule, stochastic update, and applicability to a wide class of costs. Both batch and successive training algorithms are given. Each type has its own merits. However, these two classes are equivalent, since a problem solved in the batch mode can be computed successively, and vice versa. The algorithms cover a class of combinatorial optimizations besides traditional standard pattern set design.
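The abstract contrasts batch and successive (online) training modes of competitive learning. As a rough, assumption-based orientation only, the Python sketch below shows generic squared-error competitive learning in those two modes; it does not implement the paper's specific mechanisms (product-form neurons, winner-take-quota rule, excitatory/inhibitory connections, oblivion, fair competitive bias). The function names, learning rate, and data are hypothetical.

```python
# Illustrative sketch only: plain squared-error competitive learning
# (vector quantization) in batch and successive (online) modes.
# It does NOT reproduce the paper's multiple-descent cost features such
# as the winner-take-quota rule or excitatory/inhibitory connections.
import numpy as np

def batch_step(codebook, data):
    """One batch update: assign every input to its nearest unit (the
    winner), then move each unit to the centroid of the inputs it won."""
    # Pairwise squared distances, shape (n_samples, n_units).
    d = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    winners = d.argmin(axis=1)
    new_codebook = codebook.copy()
    for j in range(codebook.shape[0]):
        members = data[winners == j]
        if len(members) > 0:
            new_codebook[j] = members.mean(axis=0)
    return new_codebook

def successive_step(codebook, x, lr=0.05):
    """One successive (online) update: the winning unit for a single
    input x is pulled a small step toward x."""
    d = ((codebook - x) ** 2).sum(axis=1)
    j = d.argmin()
    codebook[j] += lr * (x - codebook[j])
    return codebook

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=(200, 2))
    codebook = rng.normal(size=(4, 2))   # 4 competing units

    for _ in range(10):                  # batch training
        codebook = batch_step(codebook, data)

    for x in data:                       # successive training
        codebook = successive_step(codebook, x)
```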

Original language: English
Title of host publication: IJCNN. International Joint Conference on Neural Networks
Place of publication: Piscataway, NJ, United States
Publisher: Publ by IEEE
Pages: 299-306
Number of pages: 8
Publication status: Published - 1990
Externally published: Yes
Event: 1990 International Joint Conference on Neural Networks - IJCNN 90 Part 3 (of 3) - San Diego, CA, USA
Duration: 1990 Jun 17 - 1990 Jun 21

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Matsuyama, Y. (1990). Multiple descent cost competitive learning: Batch and successive self-organization with excitatory and inhibitory connections. In IJCNN. International Joint Conference on Neural Networks (pp. 299-306). Piscataway, NJ, United States: Publ by IEEE.
