### Abstract

This paper studies how the generalization ability of models of dynamic systems can be improved by taking advantage of the second-order derivatives of the network outputs with respect to the external inputs. The proposed method can be regarded as a direct implementation of the well-known regularization technique using the higher-order derivatives of Universal Learning Networks (ULNs). ULNs consist of a number of interconnected nodes, where each node may contain any continuously differentiable nonlinear function, and each pair of nodes can be connected by multiple branches with arbitrary time delays. A generalized learning algorithm has been derived for ULNs in which both the first-order derivatives (gradients) and the higher-order derivatives are incorporated. First, the method for computing the second-order derivatives of ULNs is discussed. Then a new method for implementing the regularization term is presented. Finally, simulation studies on the identification of a nonlinear dynamic system with noise are carried out to demonstrate the effectiveness of the proposed method. The results show that the proposed method can significantly improve the generalization ability of neural networks, in particular in that (1) a robust network can be obtained even when branches of the trained ULNs are removed, and (2) the resulting performance does not depend on the initial parameter values.
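The core idea of the abstract, penalizing the second-order derivatives of a network's output with respect to its external inputs, can be illustrated in miniature. The following is a hedged sketch, not the authors' ULN implementation: a toy one-hidden-unit network whose loss adds the squared second derivative of the output with respect to the input, computed analytically. All names and the weight `lam` are illustrative assumptions.

```python
import math

def net(w1, b1, w2, b2, x):
    """Toy one-hidden-unit network standing in for a ULN node function."""
    return w2 * math.tanh(w1 * x + b1) + b2

def d2_net_dx2(w1, b1, w2, b2, x):
    """Analytic second derivative of the output w.r.t. the external input x.
    With u = w1*x + b1:  d/dx tanh(u) = sech^2(u)*w1,
    so  d2/dx2 tanh(u) = -2*tanh(u)*sech^2(u)*w1^2."""
    u = w1 * x + b1
    sech2 = 1.0 - math.tanh(u) ** 2
    return w2 * (-2.0 * math.tanh(u) * sech2) * w1 ** 2

def loss(w1, b1, w2, b2, x, y, lam=1e-2):
    """Squared prediction error plus the second-order-derivative
    regularization term from the abstract (lam is a hypothetical weight)."""
    err = net(w1, b1, w2, b2, x) - y
    return err ** 2 + lam * d2_net_dx2(w1, b1, w2, b2, x) ** 2
```

Minimizing such a loss discourages high curvature of the input-output map, which is the smoothing effect the regularization technique relies on for better generalization; in the paper this is done for full ULNs with multiple delayed branches rather than this single-node toy.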

Original language | English |
---|---|
Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
Pages | 1203-1208 |
Number of pages | 6 |
Volume | 2 |
Publication status | Published - 2001 |
Externally published | Yes |
Event | International Joint Conference on Neural Networks (IJCNN'01), Washington, DC. Duration: 2001 Jul 15 → 2001 Jul 19 |

### Other

Other | International Joint Conference on Neural Networks (IJCNN'01) |
---|---|
City | Washington, DC |
Period | 2001 Jul 15 → 2001 Jul 19 |

### ASJC Scopus subject areas

- Software

### Cite this

Kim, S., Hirasawa, K., & Furuzuki, T. (2001). **Improvement of generalization ability for identifying dynamic systems by using universal learning networks.** In *Proceedings of the International Joint Conference on Neural Networks* (Vol. 2, pp. 1203-1208). International Joint Conference on Neural Networks (IJCNN'01), Washington, DC, 2001 Jul 15.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Improvement of generalization ability for identifying dynamic systems by using universal learning networks

AU - Kim, S.

AU - Hirasawa, K.

AU - Furuzuki, Takayuki

PY - 2001

Y1 - 2001


UR - http://www.scopus.com/inward/record.url?scp=0034878580&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0034878580&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0034878580

VL - 2

SP - 1203

EP - 1208

BT - Proceedings of the International Joint Conference on Neural Networks

ER -