Abstract
Novel general algorithms for multiple-descent cost-competitive learning are presented. These algorithms self-organize neural networks and possess the following features: optimal grouping of applied vector inputs, product form of neurons, neural topologies, excitatory and inhibitory connections, fair competitive bias, oblivion, winner-take-quota rule, stochastic update, and applicability to a wide class of costs. Both batch and successive training algorithms are given. Each type has its own merits. However, these two classes are equivalent, since a problem solved in the batch mode can be computed successively, and vice versa. The algorithms cover a class of combinatorial optimizations besides traditional standard pattern set design.
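The abstract's "winner-take-quota" rule generalizes winner-take-all competitive learning: for each applied input, the nearest few neurons (a quota, rather than a single winner) update toward it. The following is a minimal sketch of that idea in the successive (per-sample) mode, not the paper's exact algorithm; the function name, the Euclidean cost, and the learning-rate schedule are illustrative assumptions.

```python
import numpy as np

def winner_take_quota_step(weights, x, quota=2, lr=0.1):
    """One successive (online) update: the `quota` nearest neurons
    move toward the input x; all other neurons are left unchanged.
    (Illustrative sketch; the paper's cost class is more general.)"""
    dists = np.linalg.norm(weights - x, axis=1)      # assumed cost: Euclidean distance
    winners = np.argsort(dists)[:quota]              # winner-take-quota, not winner-take-all
    weights[winners] += lr * (x - weights[winners])  # excitatory move toward the input
    return weights

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 2))       # applied vector inputs
weights = rng.normal(size=(5, 2))      # 5 neurons (reference vectors)
for x in data:                         # successive training mode
    weights = winner_take_quota_step(weights, x, quota=2, lr=0.05)
```

A batch variant would accumulate the assignments over the whole input set before updating, which is the equivalence between the two modes that the abstract notes.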
Original language | English |
---|---|
Title of host publication | IJCNN. International Joint Conference on Neural Networks |
Place of publication | Piscataway, NJ, United States |
Publisher | IEEE |
Pages | 299-306 |
Number of pages | 8 |
Publication status | Published - 1990 |
Externally published | Yes |
Event | 1990 International Joint Conference on Neural Networks - IJCNN 90 Part 3 (of 3), San Diego, CA, USA, 1990 Jun 17 → 1990 Jun 21 |
ASJC Scopus subject areas
- Engineering (all)