Paper · 12 June 2020
MetaAMC: meta learning and AutoML for model compression
Chaopeng Zhang, Yuesheng Zhu, Zhiqiang Bai
Proceedings Volume 11519, Twelfth International Conference on Digital Image Processing (ICDIP 2020); 115191U (2020) https://doi.org/10.1117/12.2573095
Event: Twelfth International Conference on Digital Image Processing, 2020, Osaka, Japan
Abstract
Model compression reduces the computational cost of an over-parameterized network without significant loss of performance, and channel pruning is among the predominant approaches for compressing deep neural networks. In this paper, we propose a novel approach called MetaAMC, which combines meta learning and AutoML for automatic channel pruning of very deep neural networks. It leverages meta learning and reinforcement learning to produce the model compression policy. Compared with both the original AMC and MetaPruning as well as state-of-the-art pruning methods, MetaAMC demonstrates superior performance on MobileNet V1/V2 and ResNet-50.
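For readers unfamiliar with channel pruning, the sketch below illustrates the general technique the abstract refers to: removing whole output channels from a convolutional layer so the compressed network is structurally smaller. This is a minimal magnitude-based example for illustration only, not the MetaAMC algorithm (which learns the per-layer pruning policy with meta learning and reinforcement learning); the function name and the keep ratio are hypothetical.

```python
import torch
import torch.nn as nn

def prune_conv_channels(conv: nn.Conv2d, keep_ratio: float) -> nn.Conv2d:
    """Keep the output channels of `conv` with the largest L1 filter norms."""
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    # L1 norm of each output filter: shape (out_channels,)
    norms = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    keep = torch.topk(norms, n_keep).indices.sort().values

    # Build a structurally smaller layer and copy the surviving filters.
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()
    return pruned

# Example: prune a single layer to 50% of its output channels.
layer = nn.Conv2d(64, 128, kernel_size=3, padding=1)
smaller = prune_conv_channels(layer, keep_ratio=0.5)
print(smaller)  # Conv2d(64, 64, kernel_size=(3, 3), ...)
```

In a full network the input channels of the following layer must be shrunk to match, and methods such as AMC replace the fixed keep ratio with a per-layer ratio chosen by a reinforcement-learning agent.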
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Chaopeng Zhang, Yuesheng Zhu, and Zhiqiang Bai "MetaAMC: meta learning and AutoML for model compression", Proc. SPIE 11519, Twelfth International Conference on Digital Image Processing (ICDIP 2020), 115191U (12 June 2020); https://doi.org/10.1117/12.2573095
CITATIONS
Cited by 1 patent.
KEYWORDS
Performance modeling
Convolution
Image classification
Neural networks
Convolutional neural networks
Electrochemical etching
Image compression