### Abstract

This paper proposes the application of neural networks with multiplication units to the parity-N problem, the mirror symmetry problem, and a function approximation problem. It is clear that higher-order terms in neural networks, such as sigma-pi units, can considerably improve their computational power, but how real neurons achieve this is still unclear. We use one multiplication unit to construct the full higher-order term of all the inputs, which proved very efficient for the parity-N problem. Our earlier work on applying multiplication units to other problems suffered from the drawback of gradient-based algorithms, such as backpropagation, which easily become stuck at local minima due to the complexity of the network. To overcome this problem, we consider a novel random search, RasID, for training neural networks with multiplication units: it performs an intensified search where good solutions are easy to find locally and a diversified search to escape from local minima, under a pure random search scheme. The method shows its advantage in training neural networks with multiplication units.
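The key construction in the abstract, a single multiplication unit forming the full higher-order term of all inputs, can be illustrated on parity-N: if the binary inputs are encoded as +1/-1, the product of all inputs directly encodes the parity of the bit pattern, so one unit suffices where a sum-only network needs many hidden units. The following is a minimal sketch; the bipolar encoding and helper names are illustrative assumptions, not taken from the paper:

```python
import itertools

def mult_unit(x):
    # A single multiplication unit: the product of all inputs,
    # i.e. the full higher-order term x1 * x2 * ... * xN.
    p = 1
    for xi in x:
        p *= xi
    return p

def parity_via_product(bits):
    # Encode bits {0, 1} as bipolar values {+1, -1}; the product is
    # +1 for an even number of ones and -1 for an odd number.
    x = [1 - 2 * b for b in bits]
    return 0 if mult_unit(x) == 1 else 1  # 1 means odd parity

# Sanity check against the usual parity definition for small N.
for n in (2, 3, 4):
    for bits in itertools.product((0, 1), repeat=n):
        assert parity_via_product(list(bits)) == sum(bits) % 2
```

This shows why the construction scales to any N with a single unit; the training difficulty the abstract mentions arises when the unit's weights must be learned rather than fixed, which is where the RasID random search comes in.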

| Original language | English |
| --- | --- |
| Title of host publication | ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing: Computational Intelligence for the E-Age |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 75-79 |
| Number of pages | 5 |
| Volume | 1 |
| ISBN (Print) | 9810475241, 9789810475246 |
| DOIs | https://doi.org/10.1109/ICONIP.2002.1202134 |
| Publication status | Published - 2002 |
| Externally published | Yes |
| Event | 9th International Conference on Neural Information Processing, ICONIP 2002 - Singapore, Singapore. Duration: 2002 Nov 18 → 2002 Nov 22 |

### Other

| Other | 9th International Conference on Neural Information Processing, ICONIP 2002 |
| --- | --- |
| Country | Singapore |
| City | Singapore |
| Period | 02/11/18 → 02/11/22 |


### ASJC Scopus subject areas

- Computer Networks and Communications
- Information Systems
- Signal Processing

### Cite this

*ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing: Computational Intelligence for the E-Age* (Vol. 1, pp. 75-79). [1202134] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICONIP.2002.1202134

**Multiplication units in feedforward neural networks and its training.** / Li, Dazi; Hirasawa, K.; Furuzuki, Takayuki; Murata, J.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

*ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing: Computational Intelligence for the E-Age.* Vol. 1, 1202134, Institute of Electrical and Electronics Engineers Inc., pp. 75-79, 9th International Conference on Neural Information Processing, ICONIP 2002, Singapore, Singapore, 02/11/18. https://doi.org/10.1109/ICONIP.2002.1202134


TY - GEN

T1 - Multiplication units in feedforward neural networks and its training

AU - Li, Dazi

AU - Hirasawa, K.

AU - Furuzuki, Takayuki

AU - Murata, J.

PY - 2002

Y1 - 2002

N2 - This paper proposes the application of neural networks with multiplication units to the parity-N problem, the mirror symmetry problem, and a function approximation problem. It is clear that higher-order terms in neural networks, such as sigma-pi units, can considerably improve their computational power, but how real neurons achieve this is still unclear. We use one multiplication unit to construct the full higher-order term of all the inputs, which proved very efficient for the parity-N problem. Our earlier work on applying multiplication units to other problems suffered from the drawback of gradient-based algorithms, such as backpropagation, which easily become stuck at local minima due to the complexity of the network. To overcome this problem, we consider a novel random search, RasID, for training neural networks with multiplication units: it performs an intensified search where good solutions are easy to find locally and a diversified search to escape from local minima, under a pure random search scheme. The method shows its advantage in training neural networks with multiplication units.

AB - This paper proposes the application of neural networks with multiplication units to the parity-N problem, the mirror symmetry problem, and a function approximation problem. It is clear that higher-order terms in neural networks, such as sigma-pi units, can considerably improve their computational power, but how real neurons achieve this is still unclear. We use one multiplication unit to construct the full higher-order term of all the inputs, which proved very efficient for the parity-N problem. Our earlier work on applying multiplication units to other problems suffered from the drawback of gradient-based algorithms, such as backpropagation, which easily become stuck at local minima due to the complexity of the network. To overcome this problem, we consider a novel random search, RasID, for training neural networks with multiplication units: it performs an intensified search where good solutions are easy to find locally and a diversified search to escape from local minima, under a pure random search scheme. The method shows its advantage in training neural networks with multiplication units.

UR - http://www.scopus.com/inward/record.url?scp=84965025652&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84965025652&partnerID=8YFLogxK

U2 - 10.1109/ICONIP.2002.1202134

DO - 10.1109/ICONIP.2002.1202134

M3 - Conference contribution

AN - SCOPUS:84965025652

SN - 9810475241

SN - 9789810475246

VL - 1

SP - 75

EP - 79

BT - ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing: Computational Intelligence for the E-Age

PB - Institute of Electrical and Electronics Engineers Inc.

ER -