Abstract
This paper studies how the generalization ability of models of dynamic systems can be improved by taking advantage of the second-order derivatives of the network outputs with respect to the external inputs. The proposed method can be regarded as a direct implementation of the well-known regularization technique using the higher-order derivatives of Universal Learning Networks (ULNs). ULNs consist of a number of interconnected nodes, where each node may contain any continuously differentiable nonlinear function, and each pair of nodes can be connected by multiple branches with arbitrary time delays. A generalized learning algorithm has been derived for ULNs, in which both the first-order derivatives (gradients) and the higher-order derivatives are incorporated. First, the method for computing the second-order derivatives of ULNs is discussed. Then, a new method for implementing the regularization term is presented. Finally, simulation studies on the identification of a nonlinear dynamic system with noise are carried out to demonstrate the effectiveness of the proposed method. Simulation results show that the proposed method can significantly improve the generalization ability of neural networks, in particular in that (1) a robust network is obtained even when branches of the trained ULN are destroyed, and (2) the resulting performance does not depend on the initial parameter values.
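To make the regularization idea concrete, below is a minimal sketch in JAX of a loss that penalizes the squared second-order derivatives of a network output with respect to its external inputs. This is not the paper's ULN formulation (ULNs allow arbitrary node functions and multiple delayed branches between nodes); the feedforward network `net`, its shapes, and the penalty weight `lam` are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

# Hypothetical static feedforward model standing in for a ULN node graph.
def net(params, x):
    W1, b1, w2, b2 = params
    h = jnp.tanh(W1 @ x + b1)   # hidden layer
    return jnp.dot(w2, h) + b2  # scalar output

def loss(params, xs, ys, lam):
    # Data-fitting term: mean squared error over the batch.
    preds = jax.vmap(lambda x: net(params, x))(xs)
    mse = jnp.mean((preds - ys) ** 2)

    # Regularization term: squared second derivatives (Hessian entries)
    # of the output with respect to the external inputs.
    def curvature(x):
        H = jax.hessian(lambda z: net(params, z))(x)
        return jnp.sum(H ** 2)

    reg = jnp.mean(jax.vmap(curvature)(xs))
    return mse + lam * reg

# Example usage with random parameters and data.
key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = (jax.random.normal(k1, (8, 3)), jnp.zeros(8),
          jax.random.normal(k2, (8,)), 0.0)
xs = jax.random.normal(k3, (32, 3))
ys = jnp.sin(xs[:, 0])
grads = jax.grad(loss)(params, xs, ys, lam=1e-3)  # feed to any optimizer
```

Penalizing input-space curvature in this way discourages sharply bending response surfaces, which is one way to read the paper's claim of improved robustness to noise and to destroyed branches.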
Original language | English
---|---
Pages | 1203-1208
Number of pages | 6
Publication status | Published - 2001 Jan 1
Externally published | Yes
Event | International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States
Duration | 2001 Jul 15 → 2001 Jul 19
Conference
Conference | International Joint Conference on Neural Networks (IJCNN'01)
---|---
Country/Territory | United States
City | Washington, DC
Period | 2001 Jul 15 → 2001 Jul 19
ASJC Scopus subject areas
- Software
- Artificial Intelligence