TY - GEN
T1 - MEMORY-EFFICIENT LEARNED IMAGE COMPRESSION WITH PRUNED HYPERPRIOR MODULE
AU - Luo, Ao
AU - Sun, Heming
AU - Liu, Jinming
AU - Katto, Jiro
N1 - Funding Information:
This work was supported by JST, PRESTO Grant Number JPMJPR19M5, Japan; NICT, Grant Number 03801, Japan; Japan Society for the Promotion of Science (JSPS), Grant Number 21K17770; Kenjiro Takayanagi Foundation.
Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Learned Image Compression (LIC) has attracted increasing attention in recent years. Hyperprior-module-based LIC models have achieved remarkable rate-distortion performance. However, the memory cost of these LIC models is too large for practical deployment on many devices, especially portable or edge devices. The number of parameters is directly linked to memory cost. In our research, we found that the hyperprior module is not only highly over-parameterized but that its latent representation also contains redundant information. Therefore, we propose a novel pruning method named ERHP to efficiently reduce the memory cost of the hyperprior module while improving network performance. Experiments show that our method is effective, reducing the total number of model parameters by at least 22.6% while achieving better rate-distortion performance.
AB - Learned Image Compression (LIC) has attracted increasing attention in recent years. Hyperprior-module-based LIC models have achieved remarkable rate-distortion performance. However, the memory cost of these LIC models is too large for practical deployment on many devices, especially portable or edge devices. The number of parameters is directly linked to memory cost. In our research, we found that the hyperprior module is not only highly over-parameterized but that its latent representation also contains redundant information. Therefore, we propose a novel pruning method named ERHP to efficiently reduce the memory cost of the hyperprior module while improving network performance. Experiments show that our method is effective, reducing the total number of model parameters by at least 22.6% while achieving better rate-distortion performance.
KW - Hyperprior Module
KW - Learned Image Compression
KW - Model Pruning
UR - http://www.scopus.com/inward/record.url?scp=85146669793&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85146669793&partnerID=8YFLogxK
U2 - 10.1109/ICIP46576.2022.9897854
DO - 10.1109/ICIP46576.2022.9897854
M3 - Conference contribution
AN - SCOPUS:85146669793
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 3061
EP - 3065
BT - 2022 IEEE International Conference on Image Processing, ICIP 2022 - Proceedings
PB - IEEE Computer Society
T2 - 29th IEEE International Conference on Image Processing, ICIP 2022
Y2 - 16 October 2022 through 19 October 2022
ER -