A Riemannian approach to low-rank tensor learning

Hiroyuki Kasai, Pratik Jawanpuria, Bamdev Mishra

Research output: Chapter

Abstract

This chapter discusses a Riemannian approach to the low-rank tensor learning problem. The low-rank constraint is modeled as a fixed-rank Tucker decomposition of tensors. We endow the manifold arising from the Tucker decomposition with a Riemannian structure based on a specific metric or inner product. This allows us to use the versatile framework of Riemannian optimization on quotient manifolds to develop optimization algorithms. The Riemannian framework conceptually translates a problem with structured constraints into an unconstrained problem over a Riemannian manifold. We employ a nonlinear conjugate gradient algorithm for optimization. To this end, concrete matrix expressions of the various Riemannian optimization-related ingredients are discussed. Numerical comparisons on problems of low-rank tensor completion, tensor regression, and multilinear multitask learning suggest that the proposed Riemannian approach performs well across different synthetic and real-world datasets.
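As a point of reference, below is a minimal NumPy sketch of the fixed-rank Tucker format that the chapter's Riemannian approach optimizes over: a tensor is represented by a small core multiplied by a factor matrix along each mode. This only illustrates the decomposition structure, not the authors' Riemannian conjugate gradient algorithm, and the function names (mode_product, tucker_reconstruct), the dimensions, and the ranks are placeholders chosen for this example.

    # Sketch of a fixed-rank Tucker representation X ~= G x_1 U1 x_2 U2 x_3 U3.
    # Illustration only; not the chapter's Riemannian optimization algorithm.
    import numpy as np

    def mode_product(tensor, matrix, mode):
        """Multiply a multiway tensor by a matrix along the given mode."""
        # Move the chosen mode to the front, flatten, multiply, restore shape.
        t = np.moveaxis(tensor, mode, 0)
        shape = t.shape
        t = matrix @ t.reshape(shape[0], -1)
        return np.moveaxis(t.reshape((matrix.shape[0],) + shape[1:]), 0, mode)

    def tucker_reconstruct(core, factors):
        """Assemble the full tensor from a Tucker core and factor matrices."""
        out = core
        for mode, U in enumerate(factors):
            out = mode_product(out, U, mode)
        return out

    # Example: a random Tucker tensor of size 5 x 6 x 4 with multilinear rank (2, 3, 2).
    rng = np.random.default_rng(0)
    ranks, dims = (2, 3, 2), (5, 6, 4)
    core = rng.standard_normal(ranks)
    factors = [np.linalg.qr(rng.standard_normal((d, r)))[0]   # orthonormal factors
               for d, r in zip(dims, ranks)]
    X = tucker_reconstruct(core, factors)
    print(X.shape)  # (5, 6, 4)

Because the factor matrices are only determined up to invertible transformations absorbed into the core, the set of such decompositions for a fixed multilinear rank naturally forms a quotient manifold, which is the structure the chapter's Riemannian framework exploits.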

Original language: English
Host publication title: Tensors for Data Processing
Host publication subtitle: Theory, Methods, and Applications
Publisher: Elsevier
Pages: 91-119
Number of pages: 29
ISBN (electronic): 9780128244470
ISBN (print): 9780323859653
DOI
Publication status: Published - 1 January 2021

ASJC Scopus subject areas

  • Engineering (all)
  • Computer Science (all)
