### Abstract

We extend AdaBoost to U-Boost, within the paradigm of building a stronger classification machine from a set of weak learning machines. A geometric understanding of the Bregman divergence defined by a generic convex function U leads to the U-Boost method in the framework of information geometry, extended to the space of finite measures over a label set. We propose two versions of the U-Boost learning algorithm, according to whether the domain is restricted to the space of probability functions. In the sequential step, we observe that the two adjacent classifiers and the initial classifier are associated with a right triangle in the scale given by the Bregman divergence, called the Pythagorean relation. This leads to a mild convergence property of the U-Boost algorithm, as seen in the expectation-maximization algorithm. Statistical discussions of consistency and robustness elucidate the properties of the U-Boost methods under a stochastic assumption on the training data.
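The abstract's central object is the Bregman divergence generated by a generic convex function U. As a minimal numerical sketch (not the paper's formulation — the helper names and the example measures below are invented for illustration), the divergence between two finite measures over a label set can be computed pointwise from U and its derivative; choosing U(t) = t log t − t recovers the generalized Kullback-Leibler divergence, while U(t) = t² recovers the squared Euclidean distance:

```python
import numpy as np

def bregman_divergence(U, dU, p, q):
    """D_U(p, q) = sum_i [ U(p_i) - U(q_i) - U'(q_i) * (p_i - q_i) ],
    which is nonnegative whenever U is convex."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(U(p) - U(q) - dU(q) * (p - q)))

# U(t) = t log t - t yields the generalized KL divergence on finite measures
U_kl  = lambda t: t * np.log(t) - t
dU_kl = lambda t: np.log(t)

# U(t) = t^2 yields the squared Euclidean distance
U_sq  = lambda t: t ** 2
dU_sq = lambda t: 2.0 * t

# Two example measures over a three-element label set (illustrative values)
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])

d_kl = bregman_divergence(U_kl, dU_kl, p, q)   # generalized KL, >= 0
d_sq = bregman_divergence(U_sq, dU_sq, p, q)   # equals ||p - q||^2
```

The divergence vanishes exactly when p = q, which is the property underlying the Pythagorean relation mentioned in the abstract.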

Original language | English |
---|---|
Pages (from-to) | 1437-1481 |
Number of pages | 45 |
Journal | Neural Computation |
Volume | 16 |
Issue number | 7 |
DOI | 10.1162/089976604323057452 |
Publication status | Published - Jul 2004 |

### ASJC Scopus subject areas

- Control and Systems Engineering
- Artificial Intelligence
- Neuroscience (all)

### Cite this

**Information geometry of U-Boost and Bregman divergence.** / Murata, Noboru; Takenouchi, Takashi; Kanamori, Takafumi; Eguchi, Shinto.

Research output: Article

*Neural Computation*, Vol. 16, No. 7, pp. 1437-1481. https://doi.org/10.1162/089976604323057452

TY - JOUR

T1 - Information geometry of U-Boost and Bregman divergence

AU - Murata, Noboru

AU - Takenouchi, Takashi

AU - Kanamori, Takafumi

AU - Eguchi, Shinto

PY - 2004/7

Y1 - 2004/7

N2 - We aim at an extension of AdaBoost to U-Boost, in the paradigm to build a stronger classification machine from a set of weak learning machines. A geometric understanding of the Bregman divergence defined by a generic convex function U leads to the U-Boost method in the framework of information geometry extended to the space of the finite measures over a label set. We propose two versions of U-Boost learning algorithms by taking account of whether the domain is restricted to the space of probability functions. In the sequential step, we observe that the two adjacent and the initial classifiers are associated with a right triangle in the scale via the Bregman divergence, called the Pythagorean relation. This leads to a mild convergence property of the U-Boost algorithm as seen in the expectation-maximization algorithm. Statistical discussions for consistency and robustness elucidate the properties of the U-Boost methods based on a stochastic assumption for training data.

UR - http://www.scopus.com/inward/record.url?scp=2942627097&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=2942627097&partnerID=8YFLogxK

U2 - 10.1162/089976604323057452

DO - 10.1162/089976604323057452

M3 - Article

C2 - 15165397

AN - SCOPUS:2942627097

VL - 16

SP - 1437

EP - 1481

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 7

ER -