Poster: Cross-Layer Group Regularization for Deep Neural Network Pruning
Event Type: Poster
Registration Categories: TP, EX
Time: Thursday, November 15th, 8:30am - 5pm
Description: Improving weight sparsity is a common strategy for deep neural network pruning. Most existing methods use regularizations that only consider structural sparsity within an individual layer. In this paper, we propose a cross-layer group regularization that takes statistics from multiple layers into account. For residual networks, we use this approach to align kernel sparsity across layers that are tied to each other through element-wise operations: the i-th kernels of these layers are placed into one regularization group, so they are either kept or removed simultaneously during pruning. In this way, the computational and parameter-storage costs can be significantly reduced. Experimental results show that this method not only improves weight sparsity but also aligns kernel sparsity across related layers. Our method is able to prune up to 90.4% of the parameters of ResNet and achieve a 1.5x runtime speedup, without loss of accuracy.
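The abstract describes the grouping but includes no code. As a rough illustration only, the sketch below shows one way such a cross-layer group-lasso penalty could be written in PyTorch; the function name `cross_layer_group_penalty` and the weight `lam` are hypothetical, and this is a minimal sketch of the general technique, not the authors' implementation. It assumes the tied layers (e.g., the last convolutions of residual blocks feeding the same skip connection) have the same number of output channels, so their i-th kernels can form one group.

```python
import torch

def cross_layer_group_penalty(weights):
    """Group-lasso penalty over kernels grouped across tied layers.

    `weights` is a list of conv weight tensors, each of shape
    (out_channels, in_channels, kH, kW), from layers whose outputs
    are combined element-wise. The i-th output kernel of every tied
    layer goes into one group, so the L2 group norm drives these
    kernels toward zero together; zeroed groups can then be pruned
    from all tied layers simultaneously.
    """
    penalty = weights[0].new_zeros(())
    num_kernels = weights[0].shape[0]  # shared output-channel count
    for i in range(num_kernels):
        # Concatenate the i-th kernel from every tied layer into one group.
        group = torch.cat([w[i].reshape(-1) for w in weights])
        penalty = penalty + group.norm(p=2)  # L2 norm of the whole group
    return penalty

# Usage sketch during fine-tuning (lam is a hypothetical strength):
# loss = task_loss + lam * cross_layer_group_penalty([conv_a.weight, conv_b.weight])
```

Because the penalty is the sum of L2 norms of cross-layer groups rather than per-layer groups, entire kernel indices become zero in every tied layer at once, which is what lets the corresponding channels be removed without breaking the element-wise additions in the residual connections.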