This chapter discusses a Riemannian approach to the low-rank tensor learning problem. The low-rank constraint is modeled as a fixed-rank Tucker decomposition of tensors. We endow the manifold arising from the Tucker decomposition with a Riemannian structure based on a specific metric or inner product. This allows us to use the versatile framework of Riemannian optimization on quotient manifolds to develop optimization algorithms. The Riemannian framework conceptually translates a structured constrained problem into an unconstrained problem over a Riemannian manifold. We employ a nonlinear conjugate gradient algorithm for optimization. To this end, concrete matrix expressions of various Riemannian optimization-related ingredients are discussed. Numerical comparisons on problems of low-rank tensor completion, tensor regression, and multilinear multitask learning suggest that the proposed Riemannian approach performs well across different synthetic and real-world datasets.
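To make the fixed-rank Tucker model concrete, the following is a minimal NumPy sketch of a Tucker decomposition computed via truncated HOSVD; this is an illustrative baseline for the low-rank model only, not the chapter's Riemannian conjugate gradient algorithm, and the function names (`unfold`, `hosvd`, `tucker_reconstruct`) are assumptions for this example.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    # Truncated higher-order SVD: each factor matrix holds the leading
    # left singular vectors of the corresponding mode unfolding.
    factors = []
    for n, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, n), full_matrices=False)
        factors.append(U[:, :r])
    # Project the tensor onto the factor subspaces to obtain the core.
    core = T
    for n, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, n, 0), axes=1), 0, n)
    return core, factors

def tucker_reconstruct(core, factors):
    # Multiply the core by each factor matrix along its mode.
    T = core
    for n, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(U, np.moveaxis(T, n, 0), axes=1), 0, n)
    return T
```

For a tensor whose multilinear rank does not exceed the target ranks, truncated HOSVD recovers it exactly; for general tensors it gives a quasi-optimal low-rank approximation, which is the kind of fixed-rank structure the Riemannian approach optimizes over directly.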
|Host publication title||Tensors for Data Processing|
|Host publication subtitle||Theory, Methods, and Applications|
|Publication status||Published - Jan 1 2021|
ASJC Scopus subject areas
- Computer Science (all)