Riemannian Stochastic recursive gradient algorithm with retraction and vector transport and its convergence analysis

Hiroyuki Kasai, Hiroyuki Sato, Bamdev Mishra

Research output: Conference contribution

2 Citations (Scopus)

Abstract

Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large, but finite, number of loss functions on a Riemannian manifold. The present paper proposes a Riemannian stochastic recursive gradient algorithm (R-SRG), which does not require the inverse of a retraction between two distant iterates on the manifold. Convergence analyses of R-SRG are performed for both retraction-convex and non-convex functions under computationally efficient retraction and vector transport operations. The key challenge is analyzing the influence of the vector transport along the retraction curve. Numerical evaluations reveal that R-SRG competes well with state-of-the-art Riemannian batch and stochastic gradient algorithms.
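To make the recursive-gradient idea concrete, below is a minimal sketch (not code from the paper) of an R-SRG-style loop on the unit sphere, using the metric-projection retraction and a projection-based vector transport. The toy least-squares objective, step size, and loop lengths are illustrative assumptions; the point is that past gradients are reused via vector transport rather than an inverse retraction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative, not from the paper): minimize the average of
# f_i(x) = ||x - a_i||^2 over the unit sphere S^{d-1}.
d, n = 5, 50
A = rng.normal(size=(n, d))

def loss(x):
    return np.mean(np.sum((x - A) ** 2, axis=1))

def rgrad(x, i):
    g = 2.0 * (x - A[i])          # Euclidean gradient of f_i
    return g - (x @ g) * x        # project onto the tangent space at x

def retract(x, v):
    y = x + v                     # metric-projection retraction:
    return y / np.linalg.norm(y)  # move in the tangent direction, renormalize

def transport(y, v):
    return v - (y @ v) * y        # vector transport: project v onto T_y

def rsrg(x, step=0.02, outer=30, inner=n):
    for _ in range(outer):
        # Each outer loop is anchored by a full Riemannian gradient.
        v = np.mean([rgrad(x, i) for i in range(n)], axis=0)
        x_prev, x = x, retract(x, -step * v)
        for _ in range(inner):
            i = rng.integers(n)
            # Recursive estimator: no inverse retraction between iterates;
            # quantities at x_prev are carried to x by the vector transport.
            v = rgrad(x, i) - transport(x, rgrad(x_prev, i)) + transport(x, v)
            x_prev, x = x, retract(x, -step * v)
    return x

x0 = np.ones(d) / np.sqrt(d)
x_star = rsrg(x0)
```

For this objective the minimizer is the normalized mean of the data points, so the iterate should align with `A.mean(axis=0)` while staying on the sphere, which the retraction guarantees by construction.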

Original language: English
Title of host publication: 35th International Conference on Machine Learning, ICML 2018
Editors: Jennifer Dy, Andreas Krause
Publisher: International Machine Learning Society (IMLS)
Pages: 3912-3935
Number of pages: 24
ISBN (electronic): 9781510867963
Publication status: Published - 2018
Externally published: Yes
Event: 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden
Duration: 10 Jul 2018 - 15 Jul 2018

Publication series

Name: 35th International Conference on Machine Learning, ICML 2018
Volume: 6


ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software

Cite this

Kasai, H., Sato, H., & Mishra, B. (2018). Riemannian stochastic recursive gradient algorithm with retraction and vector transport and its convergence analysis. In J. Dy, & A. Krause (Eds.), 35th International Conference on Machine Learning, ICML 2018 (pp. 3912-3935). (35th International Conference on Machine Learning, ICML 2018; Vol. 6). International Machine Learning Society (IMLS).