Growing network that optimizes between undertraining and overtraining

Goutam Chakraborty, Mitsuru Murakami, Norio Shiratori, Shoichi Noguchi

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    3 Citations (Scopus)

    Abstract

    A feedforward neural network classifier trained on a finite set of available samples tries to properly estimate the different class boundaries in the input feature space. This enables the network to classify new, unknown samples with some confidence. A new method is proposed for ascertaining the network size that maximizes generalization as well as correct classification, along with an algorithm that grows the network to that size.
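    The abstract's idea of growing a network until generalization stops improving can be sketched in a simplified form. The sketch below is not the authors' algorithm: it uses a hypothetical toy problem, a random-feature hidden layer (random input weights, least-squares output fit) as a stand-in for full training, and held-out validation error as the stopping criterion. It only illustrates the growth loop: add hidden units one at a time and keep the size at which validation error bottoms out.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def make_data(n):
        # Toy two-class problem: points inside vs. outside a circle.
        X = rng.uniform(-1, 1, size=(n, 2))
        y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.5).astype(float)
        return X, y

    def fit_output(X, y, W, b):
        # Hidden layer with fixed random tanh units; output weights by least squares.
        H = np.tanh(X @ W + b)
        w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
        return w_out

    def error(X, y, W, b, w_out):
        pred = (np.tanh(X @ W + b) @ w_out) > 0.5
        return np.mean(pred != y)

    def grow_network(X_tr, y_tr, X_val, y_val, max_hidden=30, patience=3):
        """Add hidden units one at a time; keep the size with the
        lowest validation error, stopping after `patience` non-improving steps."""
        W = rng.normal(size=(2, 0))   # input-to-hidden weights, grown column by column
        b = np.zeros(0)               # hidden biases
        best_err, best, stall = np.inf, None, 0
        for _ in range(max_hidden):
            W = np.hstack([W, rng.normal(size=(2, 1))])  # grow: one new hidden unit
            b = np.append(b, rng.normal())
            w_out = fit_output(X_tr, y_tr, W, b)
            err = error(X_val, y_val, W, b, w_out)
            if err < best_err:
                best_err, best, stall = err, (W.copy(), b.copy(), w_out), 0
            else:
                stall += 1
                if stall >= patience:  # validation error no longer improving: stop growing
                    break
        return best, best_err

    X, y = make_data(300)
    (W, b, w_out), val_err = grow_network(X[:200], y[:200], X[200:], y[200:])
    print(W.shape[1], "hidden units, validation error", val_err)
    ```

    The validation-based stopping rule here is one common way to trade off undertraining against overtraining; the paper's own criterion for the proper network size may differ.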

    Original language: English
    Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
    Place of publication: Piscataway, NJ, United States
    Publisher: IEEE
    Pages: 1116-1120
    Number of pages: 5
    Volume: 2
    Publication status: Published - 1995
    Event: Proceedings of the 1995 IEEE International Conference on Neural Networks. Part 1 (of 6) - Perth, Australia
    Duration: 1995 Nov 27 - 1995 Dec 1



    ASJC Scopus subject areas

    • Software

    Cite this

    Chakraborty, G., Murakami, M., Shiratori, N., & Noguchi, S. (1995). Growing network that optimizes between undertraining and overtraining. In IEEE International Conference on Neural Networks - Conference Proceedings (Vol. 2, pp. 1116-1120). Piscataway, NJ, United States: IEEE.

