Adversarial knowledge distillation for a compact generator

Hideki Tsunashima*, Hirokatsu Kataoka, Junji Yamato, Qiu Chen, Shigeo Morishima

*Corresponding author of this work

Research output: Conference contribution

Abstract

In this paper, we propose memory-efficient Generative Adversarial Nets (GANs) based on knowledge distillation. Most existing GANs suffer from a large number of model parameters and low processing speed. To tackle this problem, we propose Adversarial Knowledge Distillation for Generative models (AKDG), which yields highly efficient GANs for unconditional generation. With AKDG, model size and processing time are substantially reduced. Through adversarial training with a distillation discriminator, a student generator successfully mimics a teacher generator with fewer layers, fewer parameters, and higher processing speed. Moreover, AKDG is network-architecture-agnostic. A comparison of AKDG-applied models with vanilla models shows that AKDG achieves scores closer to the teacher generator and more efficient performance than a baseline method with respect to Inception Score (IS) and Fréchet Inception Distance (FID). In CIFAR-10 experiments, AKDG improves IS/FID by 1.17pt/55.19pt, and in LSUN bedroom experiments it improves FID by 71.1pt compared with the conventional distillation method for GANs.
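The training scheme described above can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the paper's implementation: the teacher, student, and distillation discriminator are stand-in toy networks, and all sizes and hyperparameters are assumptions. The key idea it shows is that the frozen teacher generator's outputs play the "real" role for a distillation discriminator, while the compact student generator is trained adversarially to fool it.

```python
# Illustrative sketch of adversarial knowledge distillation for GANs
# (AKDG-style). All architectures and hyperparameters are assumptions,
# chosen only to make the training loop concrete and runnable.
import torch
import torch.nn as nn

torch.manual_seed(0)
Z = 16  # latent dimension (assumption)

# Large frozen teacher generator vs. compact student generator.
teacher = nn.Sequential(nn.Linear(Z, 256), nn.ReLU(), nn.Linear(256, 32))
student = nn.Sequential(nn.Linear(Z, 32), nn.ReLU(), nn.Linear(32, 32))
for p in teacher.parameters():
    p.requires_grad_(False)

# Distillation discriminator: separates teacher samples from student samples.
disc = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))

bce = nn.BCEWithLogitsLoss()
opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)

for step in range(100):
    z = torch.randn(64, Z)
    with torch.no_grad():
        real = teacher(z)  # teacher outputs act as the "real" distribution
    fake = student(z)

    # Discriminator update: label teacher samples 1, student samples 0.
    d_loss = bce(disc(real), torch.ones(64, 1)) + \
             bce(disc(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Student update: fool the distillation discriminator.
    g_loss = bce(disc(student(z)), torch.ones(64, 1))
    opt_s.zero_grad(); g_loss.backward(); opt_s.step()

n_t = sum(p.numel() for p in teacher.parameters())
n_s = sum(p.numel() for p in student.parameters())
print(n_t, n_s)  # the student has far fewer parameters than the teacher
```

Because the teacher is frozen and only its samples are needed, the student can be distilled without access to the original training images, which is what makes the scheme architecture-agnostic.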

Original language: English
Host publication title: Proceedings of ICPR 2020 - 25th International Conference on Pattern Recognition
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 10636-10643
Number of pages: 8
ISBN (electronic): 9781728188089
DOI
Publication status: Published - 2020
Event: 25th International Conference on Pattern Recognition, ICPR 2020 - Virtual, Milan, Italy
Duration: 10 Jan 2021 - 15 Jan 2021

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (print): 1051-4651

Conference

Conference: 25th International Conference on Pattern Recognition, ICPR 2020
Country/Territory: Italy
City: Virtual, Milan
Period: 21/1/10 - 21/1/15

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
