A Survey on Deep Neural Network Pruning-Taxonomy, Comparison, Analysis, and Recommendations
Modern deep neural networks, particularly recent large language models, come with massive model sizes that require significant computational and storage resources.
Tags: Paper and LLMs, Adversarial Robustness, Network Pruning
Pricing Type
- Pricing Type: Free
- Price Range Start($):
GitHub Link
The GitHub link is https://github.com/hrcheng1066/awesome-pruning
Introduction
The GitHub repository “awesome-pruning” is a comprehensive collection of neural network pruning research and open-source code. It covers static and dynamic pruning, learning-based pruning strategies, and applications in computer vision, natural language processing, and audio signal processing. Pruning methods are organized by criteria such as when pruning is applied and the specific technique used, with an extensive list of relevant papers and associated resources. The repository aims to serve as a practical reference for researchers and practitioners in neural network pruning.
Content
Taxonomy: In our survey, we provide a comprehensive review of the state of the art in deep neural network pruning, which we categorize along five orthogonal axes: Universal/Specific Speedup, When to Prune, Pruning Criteria, Learn to Prune, and Fusion of Pruning and Other Techniques.
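To make the Pruning Criteria axis concrete, here is a minimal sketch of one of the simplest and most widely used criteria: layer-wise magnitude (L1) pruning, which zeroes out the smallest-magnitude weights. The PyTorch code below is only an illustration; the `magnitude_prune` helper and the toy model are hypothetical and are not taken from the survey or the repository.

```python
# A minimal sketch of layer-wise magnitude (L1) pruning, assuming a PyTorch model.
# `magnitude_prune` and the toy model are illustrative, not the survey's code.
import torch
import torch.nn as nn


def magnitude_prune(model: nn.Module, sparsity: float = 0.5) -> None:
    """Zero out the smallest-magnitude weights of every Linear layer, in place."""
    for module in model.modules():
        if isinstance(module, nn.Linear):
            weight = module.weight.data
            # Threshold chosen so that roughly `sparsity` of the weights fall below it.
            threshold = torch.quantile(weight.abs().flatten(), sparsity)
            # Keep weights above the threshold, zero the rest.
            mask = (weight.abs() > threshold).to(weight.dtype)
            weight.mul_(mask)


if __name__ == "__main__":
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
    magnitude_prune(model, sparsity=0.5)
    weights = [m.weight for m in model.modules() if isinstance(m, nn.Linear)]
    total = sum(w.numel() for w in weights)
    zeros = sum((w == 0).sum().item() for w in weights)
    print(f"weight sparsity after pruning: {zeros / total:.2%}")
```

In practice, a one-shot criterion like this is typically paired with fine-tuning or iterative prune-retrain cycles, which is where the When to Prune and Learn to Prune axes of the taxonomy come into play.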








