Cross-Layer Group Regularization for Deep Neural Network Pruning | SC18 Proceedings

The International Conference for High Performance Computing, Networking, Storage, and Analysis

Cross-Layer Group Regularization for Deep Neural Network Pruning

Authors: Shuang Gao (Nvidia Corporation), Xin Liu (Nvidia Corporation)

Abstract: Increasing weight sparsity is a common strategy for deep neural network pruning. Most existing methods use regularizers that only consider structural sparsity within an individual layer. In this paper, we propose a cross-layer group regularization that takes into account statistics from multiple layers. For residual networks, we use this approach to align kernel sparsity across layers that are tied to each other through element-wise operations: the ith kernels of these layers are placed in one regularization group, so they are either kept or removed together during pruning. In this way, both the computational cost and the parameter storage cost can be significantly reduced. Experimental results show that this method not only improves weight sparsity but also aligns kernel sparsity across related layers. Our method is able to prune up to 90.4% of ResNet's parameters and achieve a 1.5x runtime speedup, without loss of accuracy.
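The cross-layer grouping described above can be sketched as a group-lasso-style penalty in which group i pools the ith output-channel (kernel) slice from every layer tied by a residual element-wise add. The function below is a minimal NumPy illustration under our own assumptions; the name `cross_layer_group_penalty` and the exact penalty form are not taken from the paper.

```python
import numpy as np

def cross_layer_group_penalty(weights):
    """Hypothetical sketch of a cross-layer group regularizer.

    `weights` is a list of conv weight arrays shaped (out_ch, in_ch, kH, kW)
    for layers tied through an element-wise residual operation, so they must
    share the same number of output channels. Group i concatenates the ith
    kernel's weights across all tied layers; an L2 norm per group encourages
    whole groups (i.e., aligned kernels) to shrink to zero together.
    """
    n_kernels = weights[0].shape[0]
    assert all(w.shape[0] == n_kernels for w in weights)
    penalty = 0.0
    for i in range(n_kernels):
        # Pool the ith kernel across all tied layers into one group.
        group = np.concatenate([w[i].ravel() for w in weights])
        penalty += np.linalg.norm(group)  # L2 norm of the whole group
    return penalty

# Example: two conv layers whose outputs meet in a residual add.
rng = np.random.default_rng(0)
w_a = rng.normal(size=(4, 3, 3, 3))  # (out_ch, in_ch, kH, kW)
w_b = rng.normal(size=(4, 8, 3, 3))
print(cross_layer_group_penalty([w_a, w_b]))
```

Because the penalty is an L2 norm over each cross-layer group rather than over individual weights, gradient-based training drives entire groups toward zero at once, so the ith kernel of every tied layer can be pruned simultaneously, which is what keeps the residual add shape-consistent after pruning.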

Best Poster Finalist (BP): yes

Poster: PDF
Poster summary: PDF
Reproducibility Description Appendix: PDF
