Riemannian stochastic variance reduced gradient

Hiroyuki Sato, Hiroyuki Kasai, Bamdev Mishra

Research output: Contribution to journal › Article › peer-review

Abstract

Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large but finite number of loss functions. In this paper, we propose R-SVRG, a novel Riemannian extension of the Euclidean stochastic variance reduced gradient algorithm to a manifold search space. The key challenges of averaging, adding, and subtracting multiple gradients, which live in different tangent spaces, are addressed with retraction and vector transport. Under some natural assumptions, we present a global convergence analysis of the proposed algorithm with a decaying step size and a local convergence rate analysis with a fixed step size. The proposed algorithm is applied to problems on the Grassmann manifold, such as principal component analysis, low-rank matrix completion, and computation of the Karcher mean of subspaces, and it outperforms the standard Riemannian stochastic gradient descent algorithm in each case.
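
To make the inner update described above concrete, the following is a minimal sketch of an SVRG-style iteration on the unit sphere: gradients computed at the snapshot point live in a different tangent space, so they are carried over by vector transport before being combined, and the step is taken with a retraction. This is a hypothetical illustration, not the paper's implementation; the paper's experiments use the Grassmann manifold, and the toy problem, step size, and helper names (egrad, rgrad, retract, transport, rsvrg) are assumptions made here for clarity.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy problem (not from the paper's experiments): find the
    # leading eigenvector of A = (1/n) * sum_i z_i z_i^T by minimizing
    # f(x) = (1/n) * sum_i f_i(x) on the sphere, with f_i(x) = -(z_i^T x)^2.
    n, d = 500, 20
    Z = rng.standard_normal((n, d))

    def egrad(x, i):
        # Euclidean gradient of f_i at x.
        return -2.0 * (Z[i] @ x) * Z[i]

    def proj(x, v):
        # Orthogonal projection of v onto the tangent space at x.
        return v - (x @ v) * x

    def rgrad(x, i):
        # Riemannian gradient: projection of the Euclidean gradient.
        return proj(x, egrad(x, i))

    def retract(x, xi):
        # Retraction on the sphere: step in the tangent direction, renormalize.
        y = x + xi
        return y / np.linalg.norm(y)

    def transport(y, xi):
        # Projection-based vector transport into the tangent space at y.
        return proj(y, xi)

    def rsvrg(step=0.01, outer=30, inner=n):
        x = rng.standard_normal(d)
        x /= np.linalg.norm(x)
        for _ in range(outer):
            x_ref = x.copy()
            # Full Riemannian gradient at the snapshot (reference) point.
            full = sum(rgrad(x_ref, i) for i in range(n)) / n
            for _ in range(inner):
                i = rng.integers(n)
                # Variance-reduced direction: snapshot gradients are
                # transported to the current tangent space before mixing.
                v = rgrad(x, i) - transport(x, rgrad(x_ref, i)) \
                    + transport(x, full)
                x = retract(x, -step * v)
        return x

    x_hat = rsvrg()
    # Sanity check against the leading eigenvector of the sample covariance.
    A = Z.T @ Z / n
    w, V = np.linalg.eigh(A)
    print("alignment with top eigenvector:", abs(V[:, -1] @ x_hat))

Note that when x coincides with x_ref, the direction v reduces to the transported full gradient, which is the mechanism behind the variance reduction.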

Original language: English
Journal: Unknown Journal
Publication status: Published - 2017 Feb 18
Externally published: Yes

Keywords

  • Matrix completion
  • Retraction
  • Riemannian optimization
  • Stochastic variance reduced gradient
  • Vector transport

ASJC Scopus subject areas

  • General
