Multiple descent cost competitive learning: Batch and successive self-organization with excitatory and inhibitory connections

Yasuo Matsuyama*

*Corresponding author for this work

Research output: Conference contribution

1 citation (Scopus)

Abstract

Novel general algorithms for multiple-descent cost-competitive learning are presented. These algorithms self-organize neural networks and possess the following features: optimal grouping of applied vector inputs, product form of neurons, neural topologies, excitatory and inhibitory connections, fair competitive bias, oblivion, winner-take-quota rule, stochastic update, and applicability to a wide class of costs. Both batch and successive training algorithms are given. Each type has its own merits. However, these two classes are equivalent, since a problem solved in the batch mode can be computed successively, and vice versa. The algorithms cover a class of combinatorial optimizations besides traditional standard pattern set design.
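The abstract's claim that batch and successive (online) training are interchangeable can be illustrated with a minimal winner-take-all sketch. This is not the paper's algorithm (which adds product-form neurons, fair bias, oblivion, and the winner-take-quota rule); it is a simplified squared-error instance showing the two update schedules, with all function names and parameters chosen here for illustration.

```python
def nearest(x, centers):
    """Index of the center closest to x under the squared Euclidean cost."""
    return min(range(len(centers)),
               key=lambda j: sum((x[d] - centers[j][d]) ** 2
                                 for d in range(len(x))))

def successive_cl(data, centers, lr=0.1, epochs=20):
    """Successive (online) competitive learning: each input immediately
    pulls its winning center toward itself by a small step."""
    centers = [c[:] for c in centers]
    for _ in range(epochs):
        for x in data:
            w = nearest(x, centers)
            for d in range(len(x)):
                centers[w][d] += lr * (x[d] - centers[w][d])
    return centers

def batch_cl(data, centers, epochs=20):
    """Batch competitive learning: group all inputs first, then move each
    center to the mean of its group (a k-means-style descent step)."""
    centers = [c[:] for c in centers]
    for _ in range(epochs):
        groups = [[] for _ in centers]
        for x in data:
            groups[nearest(x, centers)].append(x)
        for j, g in enumerate(groups):
            if g:  # leave a center untouched if it won no inputs
                centers[j] = [sum(x[d] for x in g) / len(g)
                              for d in range(len(g[0]))]
    return centers
```

On well-separated data both schedules converge to essentially the same grouping, which is the equivalence the abstract asserts; the online form trades exact per-step optimality for incremental applicability.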

Original language: English
Host publication title: IJCNN. International Joint Conference on Neural Networks
Place of publication: Piscataway, NJ, United States
Publisher: Publ by IEEE
Pages: 299-306
Number of pages: 8
Publication status: Published - 1990
Externally published: Yes
Event: 1990 International Joint Conference on Neural Networks - IJCNN 90 Part 3 (of 3) - San Diego, CA, USA
Duration: Jun 17, 1990 - Jun 21, 1990


ASJC Scopus subject areas

  • Engineering (all)

