Learning Scale and Shift-Invariant Dictionary for Sparse Representation

Toshimitsu Aritake, Noboru Murata

Research output: Conference contribution

Abstract

Sparse representation is a signal model that represents signals as linear combinations of a small number of prototype signals called atoms; a set of atoms is called a dictionary. Designing the dictionary is a fundamental problem in sparse representation. However, when signals contain features that appear at different scales or positions, unstructured dictionary models cannot extract such features. In this paper, we propose a structured dictionary model that is scale and shift-invariant, so that it can extract features that commonly appear at several scales and locations. To achieve both scale and shift invariance, we assume that the atoms of the dictionary are generated from vectors called ancestral atoms by scaling and shift operations, and we propose an algorithm to learn these ancestral atoms.
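The abstract describes a dictionary whose atoms are generated from ancestral atoms by scaling and shift operations. The following is a minimal sketch of that idea, not the paper's algorithm: it assumes scaling is done by resampling an ancestral atom to several lengths and shifting by placing the resampled atom at every admissible position in a zero vector, with a plain orthogonal matching pursuit used for the sparse coding step. The function names (`scale_atom`, `build_dictionary`, `omp`) and the toy Hanning-window ancestor are illustrative only.

```python
# Minimal sketch (assumptions noted above): build a scale- and shift-invariant
# dictionary from ancestral atoms, then sparse-code a signal with OMP.
import numpy as np

def scale_atom(ancestor, length):
    """Resample an ancestral atom to the given length (assumed scaling operation)."""
    old = np.linspace(0.0, 1.0, ancestor.size)
    new = np.linspace(0.0, 1.0, length)
    atom = np.interp(new, old, ancestor)
    return atom / np.linalg.norm(atom)            # keep atoms unit-norm

def build_dictionary(ancestors, signal_len, scales):
    """Generate all scaled and shifted copies of the ancestral atoms as columns."""
    atoms = []
    for a in ancestors:
        for s in scales:
            v = scale_atom(a, s)
            for shift in range(signal_len - s + 1):   # every admissible shift
                col = np.zeros(signal_len)
                col[shift:shift + s] = v
                atoms.append(col)
    return np.stack(atoms, axis=1)

def omp(D, x, n_nonzero):
    """Plain orthogonal matching pursuit for the sparse coding step."""
    residual, support = x.copy(), []
    for _ in range(n_nonzero):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    code = np.zeros(D.shape[1])
    code[support] = coef
    return code

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ancestor = np.hanning(16)                      # one toy ancestral atom
    D = build_dictionary([ancestor], signal_len=64, scales=[8, 16, 32])
    # Synthesize a signal from two dictionary atoms and recover a 2-sparse code.
    x = 1.5 * D[:, 10] - 0.7 * D[:, 100] + 0.01 * rng.standard_normal(64)
    code = omp(D, x, n_nonzero=2)
    print("recovered support:", np.nonzero(code)[0])
```

Note that in this sketch only the ancestral atoms would need to be learned; the full dictionary is regenerated from them, which is what makes the model structured rather than a free collection of atoms.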

Original language: English
Title of host publication: Machine Learning, Optimization, and Data Science - 5th International Conference, LOD 2019, Proceedings
Editors: Giuseppe Nicosia, Panos Pardalos, Renato Umeton, Giovanni Giuffrida, Vincenzo Sciacca
Publisher: Springer
Pages: 472-483
Number of pages: 12
ISBN (print): 9783030375980
DOI
Publication status: Published - 2019
Event: 5th International Conference on Machine Learning, Optimization, and Data Science, LOD 2019 - Siena, Italy
Duration: 10 Sep 2019 - 13 Sep 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11943 LNCS
ISSN (print): 0302-9743
ISSN (electronic): 1611-3349

Conference

Conference: 5th International Conference on Machine Learning, Optimization, and Data Science, LOD 2019
Country: Italy
City: Siena
Period: 19/9/10 - 19/9/13

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

