Improving Generalization Performance of Natural Gradient Learning Using Optimized Regularization by NIC

Hyeyoung Park, Noboru Murata, Shun-ichi Amari

Research output: Article, peer-reviewed

13 Citations (Scopus)

Abstract

Natural gradient learning is known to be efficient in escaping plateaus, which are a main cause of the slow learning speed of neural networks. An adaptive natural gradient learning method for practical implementation has also been developed, and its advantage in real-world problems has been confirmed. In this letter, we deal with the generalization performance of the natural gradient method. Since natural gradient learning makes parameters fit the training data quickly, overfitting may easily occur, resulting in poor generalization performance. To solve this problem, we introduce a regularization term into natural gradient learning and propose an efficient method for optimizing the scale of regularization by using a generalized Akaike information criterion (network information criterion, NIC). We discuss the properties of the regularization strength optimized by NIC through theoretical analysis as well as computer simulations. We confirm the computational efficiency and generalization performance of the proposed method through experiments on real-world benchmark problems.
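The update rule sketched in the abstract preconditions the gradient of a regularized loss by the inverse Fisher information matrix. The snippet below is a minimal illustration only, assuming a linear regression model with Gaussian noise and a fixed regularization strength `lam` (the paper selects this strength via NIC; that selection step is not reproduced here):

```python
import numpy as np

def natural_gradient_step(theta, X, y, lr=0.5, lam=1e-3, eps=1e-8):
    """One natural-gradient step on an L2-regularized squared-error loss.

    `lam` plays the role of the regularization strength that the paper
    optimizes via NIC; here it is simply fixed for illustration.
    """
    n = len(X)
    resid = X @ theta - y
    grad = X.T @ resid / n + lam * theta          # gradient of regularized loss
    G = X.T @ X / n + eps * np.eye(len(theta))    # Fisher matrix under Gaussian noise
    return theta - lr * np.linalg.solve(G, grad)  # precondition by G^{-1}

# Toy data: 3-parameter linear model with small observation noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)

theta = np.zeros(3)
for _ in range(50):
    theta = natural_gradient_step(theta, X, y)
print(theta)  # close to true_w, shrunk slightly toward zero by the penalty
```

Because the Fisher matrix here coincides with the Gauss-Newton curvature, the iteration converges in a few steps; the penalty term `lam * theta` is what would be tuned by the NIC-based procedure the paper proposes.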

Original language: English
Pages (from-to): 355-382
Number of pages: 28
Journal: Neural Computation
Volume: 16
Issue number: 2
DOI
Publication status: Published - 2004 Feb 1

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
