TY - JOUR
T1 - A Riemannian gossip approach to subspace learning on Grassmann manifold
AU - Mishra, Bamdev
AU - Kasai, Hiroyuki
AU - Jawanpuria, Pratik
AU - Saroop, Atul
N1 - Publisher Copyright:
Copyright © 2017, The Authors. All rights reserved.
PY - 2017/5/1
Y1 - 2017/5/1
N2 - In this paper, we focus on subspace learning problems on the Grassmann manifold. Interesting applications in this setting include low-rank matrix completion and low-dimensional multivariate regression, among others. Motivated by privacy concerns, we aim to solve such problems in a decentralized setting where multiple agents have access to (and solve) only a part of the whole optimization problem. The agents communicate with each other to arrive at a consensus, i.e., agree on a common quantity, via the gossip protocol. We propose a novel cost function for subspace learning on the Grassmann manifold, which is a weighted sum of several sub-problems (each solved by an agent) and the communication cost among the agents. The cost function has a finite-sum structure. In the proposed modeling approach, different agents learn individual local subspaces but they achieve asymptotic consensus on the global learned subspace. The approach is scalable and parallelizable. Numerical experiments show the efficacy of the proposed decentralized algorithms on various matrix completion and multivariate regression benchmarks.
AB - In this paper, we focus on subspace learning problems on the Grassmann manifold. Interesting applications in this setting include low-rank matrix completion and low-dimensional multivariate regression, among others. Motivated by privacy concerns, we aim to solve such problems in a decentralized setting where multiple agents have access to (and solve) only a part of the whole optimization problem. The agents communicate with each other to arrive at a consensus, i.e., agree on a common quantity, via the gossip protocol. We propose a novel cost function for subspace learning on the Grassmann manifold, which is a weighted sum of several sub-problems (each solved by an agent) and the communication cost among the agents. The cost function has a finite-sum structure. In the proposed modeling approach, different agents learn individual local subspaces but they achieve asymptotic consensus on the global learned subspace. The approach is scalable and parallelizable. Numerical experiments show the efficacy of the proposed decentralized algorithms on various matrix completion and multivariate regression benchmarks.
UR - http://www.scopus.com/inward/record.url?scp=85093021460&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85093021460&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:85093021460
JO - Nuclear Physics A
JF - Nuclear Physics A
SN - 0375-9474
ER -
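
Illustrative note (not part of the Scopus record): the abstract describes a cost that is a weighted sum of per-agent sub-problems plus a communication (consensus) cost with a finite-sum structure. A minimal LaTeX sketch of such a cost, under assumed notation (K agents, local losses f_k, a Riemannian distance d on the Grassmann manifold Gr(r, m), a consensus weight rho, and a chaining of consecutive agents chosen purely for illustration), could read:

\min_{\mathbf{U}_1, \ldots, \mathbf{U}_K \in \mathrm{Gr}(r, m)}
  \sum_{k=1}^{K} f_k(\mathbf{U}_k)
  \;+\; \frac{\rho}{2} \sum_{k=1}^{K-1} d^2\!\left(\mathbf{U}_k, \mathbf{U}_{k+1}\right)

Here each agent k would handle only its own term f_k, the pairwise distance terms stand in for the gossip communication cost between agents, and increasing rho pushes the local subspaces toward the asymptotic consensus mentioned in the abstract; the exact form used in the paper may differ.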