Growing network that optimizes between undertraining and overtraining

Goutam Chakraborty*, Mitsuru Murakami, Norio Shiratori, Shoichi Noguchi

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    3 Citations (Scopus)

    Abstract

    A feedforward neural network classifier trained on a finite set of available samples tries to properly estimate the boundaries between classes in the input feature space. This enables the network to classify new, unknown samples with some confidence. A new method is proposed for determining the network size that maximizes generalization as well as correct classification, together with an algorithm that grows the network to that size.
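    The paper's own growth algorithm is not reproduced in this record, but the idea it describes, growing a network and selecting the size at which validation (generalization) error is best, can be sketched as below. The toy data set, the one-hidden-layer sigmoid network, the training routine, and the size-selection loop are all illustrative assumptions, not the authors' method.

```python
import math, random

random.seed(0)

def sig(z):
    # numerically safe logistic function
    z = max(-60.0, min(60.0, z))
    return 1.0 / (1.0 + math.exp(-z))

def make_data(n):
    # toy 1-D two-class problem: class 1 inside the interval (0.3, 0.7)
    xs = [random.random() for _ in range(n)]
    ys = [1.0 if 0.3 < x < 0.7 else 0.0 for x in xs]
    return xs, ys

def train_mlp(hidden, xs, ys, epochs=1000, lr=0.5):
    # one-hidden-layer MLP, sigmoid units, plain stochastic gradient descent
    w1 = [random.uniform(-1, 1) for _ in range(hidden)]
    b1 = [random.uniform(-1, 1) for _ in range(hidden)]
    w2 = [random.uniform(-1, 1) for _ in range(hidden)]
    b2 = random.uniform(-1, 1)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            h = [sig(w1[j] * x + b1[j]) for j in range(hidden)]
            o = sig(sum(w2[j] * h[j] for j in range(hidden)) + b2)
            d = o - y  # output delta for sigmoid + cross-entropy loss
            for j in range(hidden):
                dh = d * w2[j] * h[j] * (1.0 - h[j])
                w2[j] -= lr * d * h[j]
                w1[j] -= lr * dh * x
                b1[j] -= lr * dh
            b2 -= lr * d
    def predict(x):
        h = [sig(w1[j] * x + b1[j]) for j in range(hidden)]
        return sig(sum(w2[j] * h[j] for j in range(hidden)) + b2)
    return predict

def error_rate(predict, xs, ys):
    return sum(round(predict(x)) != y for x, y in zip(xs, ys)) / len(xs)

def grow(train_set, valid_set, max_hidden=6):
    # grow the hidden layer one unit at a time; keep the size whose
    # error on held-out data is lowest, trading off undertraining
    # (too few units) against overtraining (too many)
    best_err, best_size = 1.0, 0
    for hidden in range(1, max_hidden + 1):
        net = train_mlp(hidden, *train_set)
        err = error_rate(net, *valid_set)
        if err < best_err:
            best_err, best_size = err, hidden
    return best_size, best_err

train_set = make_data(60)
valid_set = make_data(40)
size, err = grow(train_set, valid_set)
print("chosen hidden units:", size, "validation error:", err)
```

    In this sketch the held-out set stands in for generalization ability; the paper's contribution is a principled way of choosing the target size rather than this brute-force scan.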

    Original language: English
    Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
    Place of publication: Piscataway, NJ, United States
    Publisher: IEEE
    Pages: 1116-1120
    Number of pages: 5
    Volume: 2
    Publication status: Published - 1995
    Event: Proceedings of the 1995 IEEE International Conference on Neural Networks. Part 1 (of 6) - Perth, Australia
    Duration: 1995 Nov 27 - 1995 Dec 1


    ASJC Scopus subject areas

    • Software

