### Abstract

Neural networks are widely known to provide a method of approximating nonlinear functions. In order to clarify their approximation ability, a new theorem on an integral transform of ridge functions is presented. Using this theorem, an approximation bound is obtained which quantifies the relationship between the approximation accuracy and the number of elements in the hidden layer. This result shows that the approximation accuracy depends on the smoothness of the target function. It also shows that approximation methods based on ridge functions are free from the 'curse of dimensionality'.
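As an illustrative sketch only (not the paper's construction): the quantitative relationship described in the abstract, between approximation accuracy and the number of hidden elements, can be observed numerically by fitting a three-layered network whose hidden-layer parameters are drawn at random (in the spirit of the "random coding" keyword) and whose output weights are solved by least squares. All function names and parameter choices below are assumptions made for the demonstration.

```python
import numpy as np

def ridge_features(x, a, b):
    # Each hidden element computes a ridge function tanh(a*x + b);
    # rows index input points, columns index hidden elements.
    return np.tanh(np.outer(x, a) + b)

def fit_error(n_hidden, rng):
    # Target: a smooth one-dimensional function on [-3, 3].
    x = np.linspace(-3.0, 3.0, 400)
    y = np.sin(x)
    # "Random coding": draw the hidden-layer parameters at random,
    # then solve only the output weights by least squares.
    a = rng.uniform(-2.0, 2.0, n_hidden)
    b = rng.uniform(-3.0, 3.0, n_hidden)
    H = ridge_features(x, a, b)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    # Root-mean-square approximation error of the fitted network.
    return np.sqrt(np.mean((H @ w - y) ** 2))

rng = np.random.default_rng(0)
err_small = fit_error(5, rng)
err_large = fit_error(50, rng)
print(err_small, err_large)  # the error typically shrinks as hidden elements are added
```

This only demonstrates the qualitative trend for a smooth target; the paper's contribution is the precise bound and its independence of input dimension.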

| Original language | English |
|---|---|
| Pages (from-to) | 947-956 |
| Number of pages | 10 |
| Journal | Neural Networks |
| Volume | 9 |
| Issue number | 6 |
| DOIs | 10.1016/0893-6080(96)00000-7 |
| Publication status | Published - 1996 Aug |
| Externally published | Yes |

### Keywords

- approximation bound
- curse of dimensionality
- integral transform
- random coding
- ridge function
- three-layered network

### ASJC Scopus subject areas

- Artificial Intelligence
- Neuroscience (all)

### Cite this

**An integral representation of functions using three-layered networks and their approximation bounds.** / Murata, Noboru.

Research output: Contribution to journal › Article


TY - JOUR

T1 - An integral representation of functions using three-layered networks and their approximation bounds

AU - Murata, Noboru

PY - 1996/8

Y1 - 1996/8

N2 - Neural networks are widely known to provide a method of approximating nonlinear functions. In order to clarify their approximation ability, a new theorem on an integral transform of ridge functions is presented. Using this theorem, an approximation bound is obtained which quantifies the relationship between the approximation accuracy and the number of elements in the hidden layer. This result shows that the approximation accuracy depends on the smoothness of the target function. It also shows that approximation methods based on ridge functions are free from the 'curse of dimensionality'.

AB - Neural networks are widely known to provide a method of approximating nonlinear functions. In order to clarify their approximation ability, a new theorem on an integral transform of ridge functions is presented. Using this theorem, an approximation bound is obtained which quantifies the relationship between the approximation accuracy and the number of elements in the hidden layer. This result shows that the approximation accuracy depends on the smoothness of the target function. It also shows that approximation methods based on ridge functions are free from the 'curse of dimensionality'.

KW - approximation bound

KW - curse of dimensionality

KW - integral transform

KW - random coding

KW - ridge function

KW - three-layered network

UR - http://www.scopus.com/inward/record.url?scp=0030221021&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0030221021&partnerID=8YFLogxK

U2 - 10.1016/0893-6080(96)00000-7

DO - 10.1016/0893-6080(96)00000-7

M3 - Article

VL - 9

SP - 947

EP - 956

JO - Neural Networks

JF - Neural Networks

SN - 0893-6080

IS - 6

ER -