### Abstract

The Universal Learning Network (ULN), a superset of supervised learning networks, has already been proposed. Parameters in a ULN are trained to optimize a criterion function, as in conventional neural networks, and after training they are used as constants. In this paper, a new method that alters the parameters depending on the network flows is presented to enhance the representation abilities of networks. The proposed method uses two kinds of networks: a basic network that includes varying parameters, and a second network that calculates the optimal varying parameters from the network flows of the basic network. It is also proposed that any type of network, such as a fuzzy inference network, a radial basis function network, or a neural network, can serve as either the basic network or the parameter-calculation network. Simulations in which the parameters of a neural network are altered by a fuzzy inference network show that, for the same number of varying parameters, the proposed networks have higher representation abilities than conventional networks.
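The core idea — a basic network whose weights are not constants but are recomputed from its own flow by a second, trained network — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`param_net`, `basic_net`), the choice of a small MLP as the parameter-calculation network (the paper uses a fuzzy inference network), and all dimensions are assumptions for the sake of the example.

```python
import numpy as np

def param_net(x, V1, V2):
    """Hypothetical parameter-calculation network: a small MLP that maps
    the basic network's current flow (here, its input) to a weight vector."""
    h = np.tanh(V1 @ x)
    return V2 @ h  # varying weights handed to the basic network

def basic_net(x, b, V1, V2):
    """Basic network: its weights are not fixed after training but are
    produced anew by param_net for every input it processes."""
    w = param_net(x, V1, V2)
    return np.tanh(w @ x + b)

rng = np.random.default_rng(0)
V1 = rng.standard_normal((4, 3))   # parameter-network weights (these are
V2 = rng.standard_normal((3, 4))   # what training would actually adjust)
x = rng.standard_normal(3)         # an input (network flow)
y = basic_net(x, 0.1, V1, V2)      # output computed with input-dependent weights
```

The point of the construction is that the effective weight vector `w` differs from input to input, so a network with the same nominal parameter count can realize a richer family of mappings than one with constant weights.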

| Original language | English |
|---|---|
| Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
| Place of publication | United States |
| Publisher | IEEE |
| Pages | 1302-1307 |
| Number of pages | 6 |
| Volume | 2 |
| Publication status | Published - 1999 |
| Externally published | Yes |
| Event | International Joint Conference on Neural Networks (IJCNN'99), Washington, DC, USA, July 10 → July 16, 1999 |

### Other

| Other | International Joint Conference on Neural Networks (IJCNN'99) |
|---|---|
| City | Washington, DC, USA |
| Period | July 10, 1999 → July 16, 1999 |


### ASJC Scopus subject areas

- Software

### Cite this

**Universal Learning Networks with varying parameters.** / Hirasawa, Kotaro; Furuzuki, Takayuki; Murata, Junichi; Jin, Chunzhi; Etoh, Hironobu; Katagiri, Hironobu.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

In *Proceedings of the International Joint Conference on Neural Networks* (Vol. 2, pp. 1302-1307). IEEE, 1999.

TY - GEN

T1 - Universal Learning Networks with varying parameters

AU - Hirasawa, Kotaro

AU - Furuzuki, Takayuki

AU - Murata, Junichi

AU - Jin, Chunzhi

AU - Etoh, Hironobu

AU - Katagiri, Hironobu

PY - 1999

Y1 - 1999

N2 - The Universal Learning Network (ULN), a superset of supervised learning networks, has already been proposed. Parameters in a ULN are trained to optimize a criterion function, as in conventional neural networks, and after training they are used as constants. In this paper, a new method that alters the parameters depending on the network flows is presented to enhance the representation abilities of networks. The proposed method uses two kinds of networks: a basic network that includes varying parameters, and a second network that calculates the optimal varying parameters from the network flows of the basic network. It is also proposed that any type of network, such as a fuzzy inference network, a radial basis function network, or a neural network, can serve as either the basic network or the parameter-calculation network. Simulations in which the parameters of a neural network are altered by a fuzzy inference network show that, for the same number of varying parameters, the proposed networks have higher representation abilities than conventional networks.

AB - The Universal Learning Network (ULN), a superset of supervised learning networks, has already been proposed. Parameters in a ULN are trained to optimize a criterion function, as in conventional neural networks, and after training they are used as constants. In this paper, a new method that alters the parameters depending on the network flows is presented to enhance the representation abilities of networks. The proposed method uses two kinds of networks: a basic network that includes varying parameters, and a second network that calculates the optimal varying parameters from the network flows of the basic network. It is also proposed that any type of network, such as a fuzzy inference network, a radial basis function network, or a neural network, can serve as either the basic network or the parameter-calculation network. Simulations in which the parameters of a neural network are altered by a fuzzy inference network show that, for the same number of varying parameters, the proposed networks have higher representation abilities than conventional networks.

UR - http://www.scopus.com/inward/record.url?scp=0033351399&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033351399&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0033351399

VL - 2

SP - 1302

EP - 1307

BT - Proceedings of the International Joint Conference on Neural Networks

PB - IEEE

CY - United States

ER -