### Abstract

Universal Learning Networks (ULNs), a superset of supervised learning networks, have already been proposed. They consist of a number of interconnected nodes, where each node may contain any continuously differentiable nonlinear function; most of the functions used are sigmoidal. The main disadvantages of existing ULNs are long training times, a large number of nodes in the hidden layers, and so on. In this paper, special ULNs with multiplication neurons (M neurons) are proposed, which have M neurons in the hidden layer and normal neurons with sigmoidal functions in the output layer. The computational power of network models with multiplication neurons is compared with that of ULNs with existing neurons. In particular, it is proved that ULNs with multiplication neurons are, with regard to the number of neurons needed, computationally more powerful than ULNs with normal sigmoidal functions.
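The abstract describes a two-layer arrangement: M neurons in the hidden layer feeding a sigmoidal neuron in the output layer. The paper itself does not give code, and the exact M-neuron formula is not stated here; the sketch below assumes a common product-unit form (output is the product of inputs raised to learnable exponents), so the function names and parameters are illustrative assumptions, not the authors' implementation.

```python
import math

def sigmoid_neuron(inputs, weights, bias):
    # Standard summation neuron: weighted sum passed through a sigmoid.
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))

def multiplication_neuron(inputs, exponents):
    # Hypothetical M neuron (product-unit form, an assumption):
    # the output is the product of inputs, each raised to a
    # trainable exponent, rather than a weighted sum.
    out = 1.0
    for x, p in zip(inputs, exponents):
        out *= x ** p
    return out

def uln_forward(x, hidden_exponents, out_weights, out_bias):
    # Architecture from the abstract: M neurons in the hidden layer,
    # a normal sigmoidal neuron in the output layer.
    hidden = [multiplication_neuron(x, e) for e in hidden_exponents]
    return sigmoid_neuron(hidden, out_weights, out_bias)

# Example: two inputs, two hidden M neurons, one sigmoidal output.
y = uln_forward([0.5, 2.0], [[1.0, 1.0], [2.0, 0.5]], [0.3, -0.2], 0.1)
```

Because a single product unit can realize a multiplicative interaction that a summation neuron would need several units to approximate, this kind of hidden layer is one plausible reading of the paper's claim about needing fewer neurons.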

Original language | English |
---|---|

Title of host publication | Proceedings of the International Joint Conference on Neural Networks |

Pages | 150-155 |

Number of pages | 6 |

Volume | 1 |

Publication status | Published - 2001 |

Externally published | Yes |

Event | International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC. Duration: 2001 Jul 15 → 2001 Jul 19 |

### Other

Other | International Joint Conference on Neural Networks (IJCNN'01) |
---|---|

City | Washington, DC |

Period | 01/7/15 → 01/7/19 |

### ASJC Scopus subject areas

- Software

### Cite this

Li, D., Hirasawa, K., Furuzuki, T., & Murata, J. (2001). **Universal learning networks with multiplication neurons and its representation ability.** In *Proceedings of the International Joint Conference on Neural Networks* (Vol. 1, pp. 150-155). International Joint Conference on Neural Networks (IJCNN'01), Washington, DC, 2001 Jul 15 → 2001 Jul 19.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Universal learning networks with multiplication neurons and its representation ability

AU - Li, D.

AU - Hirasawa, K.

AU - Furuzuki, Takayuki

AU - Murata, J.

PY - 2001

Y1 - 2001

N2 - Universal Learning Networks (ULNs), a superset of supervised learning networks, have already been proposed. They consist of a number of interconnected nodes, where each node may contain any continuously differentiable nonlinear function; most of the functions used are sigmoidal. The main disadvantages of existing ULNs are long training times, a large number of nodes in the hidden layers, and so on. In this paper, special ULNs with multiplication neurons (M neurons) are proposed, which have M neurons in the hidden layer and normal neurons with sigmoidal functions in the output layer. The computational power of network models with multiplication neurons is compared with that of ULNs with existing neurons. In particular, it is proved that ULNs with multiplication neurons are, with regard to the number of neurons needed, computationally more powerful than ULNs with normal sigmoidal functions.

AB - Universal Learning Networks (ULNs), a superset of supervised learning networks, have already been proposed. They consist of a number of interconnected nodes, where each node may contain any continuously differentiable nonlinear function; most of the functions used are sigmoidal. The main disadvantages of existing ULNs are long training times, a large number of nodes in the hidden layers, and so on. In this paper, special ULNs with multiplication neurons (M neurons) are proposed, which have M neurons in the hidden layer and normal neurons with sigmoidal functions in the output layer. The computational power of network models with multiplication neurons is compared with that of ULNs with existing neurons. In particular, it is proved that ULNs with multiplication neurons are, with regard to the number of neurons needed, computationally more powerful than ULNs with normal sigmoidal functions.

UR - http://www.scopus.com/inward/record.url?scp=0034849789&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0034849789&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0034849789

VL - 1

SP - 150

EP - 155

BT - Proceedings of the International Joint Conference on Neural Networks

ER -