Multiple descent cost competitive learning: Batch and successive self-organization with excitatory and inhibitory connections

Yasuo Matsuyama*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Novel general algorithms for multiple-descent cost-competitive learning are presented. These algorithms self-organize neural networks and possess the following features: optimal grouping of applied vector inputs, product form of neurons, neural topologies, excitatory and inhibitory connections, fair competitive bias, oblivion, a winner-take-quota rule, stochastic update, and applicability to a wide class of costs. Both batch and successive training algorithms are given. Each type has its own merits. However, the two classes are equivalent, since a problem solved in the batch mode can be computed successively, and vice versa. The algorithms cover a class of combinatorial optimization problems in addition to traditional standard-pattern-set design.
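The winner-take-quota rule and the batch/successive duality described in the abstract can be illustrated with a generic competitive-learning sketch. The Python code below is a minimal illustration under assumed choices, not the paper's algorithm: the squared-error cost, the learning rate `eta`, and the rank-based quota split are assumptions, and the paper's excitatory/inhibitory connections, fair competitive bias, and oblivion terms are omitted.

```python
import numpy as np

def successive_update(weights, x, eta=0.1, quota=2):
    """One online (successive) step: the `quota` nearest neurons
    share the update (a winner-take-quota rule; winner-take-all
    is the special case quota=1). The 1/(rank+1) split is an
    illustrative assumption, not the paper's formulation."""
    d = np.linalg.norm(weights - x, axis=1)
    winners = np.argsort(d)[:quota]
    for rank, j in enumerate(winners):
        weights[j] += eta * (x - weights[j]) / (rank + 1)
    return weights

def batch_update(weights, X, quota=1):
    """One batch pass: assign every input to its nearest neuron(s),
    then move each neuron to the centroid of its assigned inputs
    (a k-means-style descent step on the squared-error cost)."""
    buckets = [[] for _ in weights]
    for x in X:
        d = np.linalg.norm(weights - x, axis=1)
        for j in np.argsort(d)[:quota]:
            buckets[j].append(x)
    for j, bucket in enumerate(buckets):
        if bucket:
            weights[j] = np.mean(bucket, axis=0)
    return weights

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))        # applied vector inputs
W = rng.normal(size=(4, 2))          # four competing neurons
for x in X:                          # successive (online) training
    W = successive_update(W, x)
for _ in range(10):                  # batch training on the same data
    W = batch_update(W, X)
```

As the abstract notes, the two modes solve the same grouping problem: each full sweep of `successive_update` over the data approximates, for small `eta`, the descent step that `batch_update` takes at once.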

Original language: English
Title of host publication: IJCNN. International Joint Conference on Neural Networks
Place of publication: Piscataway, NJ, United States
Publisher: IEEE
Pages: 299-306
Number of pages: 8
Publication status: Published - 1990
Externally published: Yes
Event: 1990 International Joint Conference on Neural Networks - IJCNN 90 Part 3 (of 3), San Diego, CA, USA
Duration: 1990 Jun 17 - 1990 Jun 21

ASJC Scopus subject areas

  • Engineering (all)
