Riemannian Stochastic recursive gradient algorithm with retraction and vector transport and its convergence analysis

Hiroyuki Kasai, Hiroyuki Sato, Bamdev Mishra

Research output: Conference contribution

3 Citations (Scopus)

Abstract

Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large, but finite number of loss functions on a Riemannian manifold. The present paper proposes a Riemannian stochastic recursive gradient algorithm (R-SRG), which does not require the inverse of retraction between two distant iterates on the manifold. Convergence analyses of R-SRG are performed on both retraction-convex and non-convex functions under computationally efficient retraction and vector transport operations. The key challenge is the analysis of the influence of vector transport along the retraction curve. Numerical evaluations reveal that R-SRG competes well with state-of-the-art Riemannian batch and stochastic gradient algorithms.
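The recursive gradient idea behind R-SRG can be illustrated with a minimal sketch. The code below is a hypothetical toy implementation on the unit sphere, not the paper's actual algorithm: `proj`, `retract`, `transport`, and `rsrg_sphere` are illustrative names, the retraction is simple normalization, and the vector transport is projection onto the new tangent space. The loss is assumed to be the average of functions f_i(x) = -x^T A_i x over unit vectors x, so that neither the retraction nor its inverse between distant iterates is ever needed, matching the property the abstract highlights.

```python
import numpy as np

def proj(x, g):
    """Project a Euclidean gradient g onto the tangent space of the sphere at x."""
    return g - (x @ g) * x

def retract(x, v):
    """Retraction on the sphere: step along v, then normalize back to unit norm."""
    y = x + v
    return y / np.linalg.norm(y)

def transport(y, v):
    """Vector transport by projection onto the tangent space at the new point y."""
    return v - (y @ v) * y

def rsrg_sphere(A_list, x0, step=0.1, epochs=10, seed=0):
    """Illustrative R-SRG-style loop for f(x) = mean_i -x^T A_i x on the sphere."""
    rng = np.random.default_rng(seed)
    x = x0 / np.linalg.norm(x0)
    n = len(A_list)
    egrad = lambda A, x: -(A + A.T) @ x  # Euclidean gradient of -x^T A x
    for _ in range(epochs):
        # Outer step: full Riemannian gradient anchors the recursion.
        v = proj(x, sum(egrad(A, x) for A in A_list) / n)
        x_prev, v_prev = x, v
        x = retract(x, -step * v)
        for _ in range(n):
            # Inner step: recursive gradient estimate, with the previous
            # gradient and estimate transported along the retraction curve.
            i = rng.integers(n)
            g_new = proj(x, egrad(A_list[i], x))
            g_old = proj(x_prev, egrad(A_list[i], x_prev))
            v = g_new - transport(x, g_old) + transport(x, v_prev)
            x_prev, v_prev = x, v
            x = retract(x, -step * v)
    return x
```

With symmetric matrices A_i, this toy objective is minimized at the leading eigenvector of the average matrix, so the loop behaves like a stochastic power-type iteration; the projection-based transport keeps every estimate in the current tangent space without evaluating an inverse retraction.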

Original language: English
Title of host publication: 35th International Conference on Machine Learning, ICML 2018
Editors: Jennifer Dy, Andreas Krause
Publisher: International Machine Learning Society (IMLS)
Pages: 3912-3935
Number of pages: 24
ISBN (Electronic): 9781510867963
Publication status: Published - 2018
Externally published: Yes
Event: 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden
Duration: 10 Jul 2018 - 15 Jul 2018

Publication series

Name: 35th International Conference on Machine Learning, ICML 2018
Volume: 6

Conference

Conference: 35th International Conference on Machine Learning, ICML 2018
Country: Sweden
City: Stockholm
Period: 18/7/10 - 18/7/15

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software
