Feature hallucination via Maximum A Posteriori for few-shot learning

Jiaying Wu, Ning Dong, Fan Liu, Sai Yang, Jinglu Hu*

*Corresponding author for this work

Research output: Article › peer-review

Abstract

Few-shot learning aims to train an effective classifier in a small data regime. Due to the scarcity of training samples (often as few as 1 or 5 per class), traditional deep learning solutions tend to overfit. To address this issue, an intuitive idea is to augment, or hallucinate, sufficient training data. For this purpose, we propose a simple yet effective method to build a model for novel categories from few samples. Specifically, we assume that each category in the base set follows a Gaussian distribution, so that we can employ Maximum A Posteriori (MAP) estimation to infer the distribution of a novel category from as little as one example. To this end, we first apply a power transformation to bring each base category into Gaussian form for MAP estimation. We then estimate the Gaussian mean of the novel category under this Gaussian prior, given the few samples available from it. Finally, each novel category is represented by a unique Gaussian distribution, from which sufficient features can be sampled to train a highly accurate classifier for final predictions. Experimental results on four few-shot benchmarks show that the proposed method significantly outperforms baseline methods on both 1-shot and 5-shot tasks. Extensive results on cross-domain tasks and visualization of the estimated feature distributions further demonstrate its effectiveness.
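The abstract outlines a three-step pipeline: power-transform base-class features toward Gaussian form, MAP-estimate the mean of each novel class under a Gaussian prior, then sample ("hallucinate") features from the estimated distribution to train a classifier. The sketch below is a minimal illustrative reading of those steps in Python, not the authors' implementation: the transform exponent, the prior built from nearest base-class means, the shared within-class variance, and all function names are assumptions introduced here for clarity.

```python
# Illustrative sketch of the abstract's pipeline; all hyperparameters and the
# prior construction are assumptions, not the paper's actual choices.
import numpy as np
from sklearn.linear_model import LogisticRegression

def power_transform(features, beta=0.5, eps=1e-6):
    """Power transform to make non-negative features more Gaussian.
    beta=0.5 is an assumed value; the paper may tune it differently."""
    return np.power(features + eps, beta)

def map_estimate_mean(support, prior_mean, prior_var, lik_var):
    """MAP estimate of a Gaussian mean under an isotropic Gaussian prior:
    mu_MAP = (lik_var * mu_0 + n * prior_var * x_bar) / (lik_var + n * prior_var)."""
    n = support.shape[0]
    x_bar = support.mean(axis=0)
    return (lik_var * prior_mean + n * prior_var * x_bar) / (lik_var + n * prior_var)

def hallucinate_and_classify(base_features, base_labels, support_x, support_y,
                             query_x, n_sampled=300, beta=0.5):
    """Few-shot classification by sampling extra features from MAP-estimated
    Gaussians. The prior (from the 2 nearest base-class means) and the shared
    variance are illustrative assumptions."""
    # 1) Power-transform all features so per-class distributions are closer to Gaussian.
    base_features = power_transform(base_features, beta)
    support_x = power_transform(support_x, beta)
    query_x = power_transform(query_x, beta)

    # Per-class statistics of the base set.
    classes = np.unique(base_labels)
    base_means = np.stack([base_features[base_labels == c].mean(axis=0) for c in classes])
    lik_var = float(np.mean([base_features[base_labels == c].var(axis=0).mean()
                             for c in classes]))  # assumed shared within-class variance

    X_train, y_train = [], []
    for c in np.unique(support_y):
        support_c = support_x[support_y == c]

        # 2) Gaussian prior from the nearest base-class means (assumption: 2 neighbours).
        d = np.linalg.norm(base_means - support_c.mean(axis=0), axis=1)
        nearest = np.argsort(d)[:2]
        prior_mean = base_means[nearest].mean(axis=0)
        prior_var = float(base_means[nearest].var(axis=0).mean()) + 1e-6

        # MAP estimate of the novel-class mean.
        mu_map = map_estimate_mean(support_c, prior_mean, prior_var, lik_var)

        # 3) Hallucinate extra training features from N(mu_map, lik_var * I).
        sampled = np.random.normal(mu_map, np.sqrt(lik_var),
                                   size=(n_sampled, support_x.shape[1]))
        X_train.append(np.vstack([support_c, sampled]))
        y_train.append(np.full(len(support_c) + n_sampled, c))

    clf = LogisticRegression(max_iter=1000).fit(np.vstack(X_train), np.concatenate(y_train))
    return clf.predict(query_x)
```

One reason such a scheme helps in the 1-shot case: the MAP estimate shrinks the noisy single-sample mean toward the base-set prior, so the hallucinated features are less likely to be centred on an outlier support example.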

Original language: English
Article number: 107129
Journal: Knowledge-Based Systems
Volume: 225
DOI
Publication status: Published - Aug 5, 2021

ASJC Scopus subject areas

  • Management Information Systems
  • Software
  • Information Systems and Management
  • Artificial Intelligence

