One-step pruning implements the EagleEye algorithm. EagleEye introduces a simple yet effective evaluation component, adaptive batch normalization, whose scores correlate strongly with the accuracy a pruned model reaches after fine-tuning. This lets you identify the subnetwork with the highest potential accuracy without fine-tuning every candidate. In short, the one-step pruning method searches over a set of subnetworks (that is, generated pruned models) that meet the required model size, selects the most promising one, and retrains only that subnetwork to recover accuracy.
The pruning steps are as follows:
- Search for subnetworks that meet the required pruning ratio.
- Select the most promising subnetwork with the adaptive-BN-based evaluation component.
- Fine-tune the selected subnetwork to recover accuracy (see the sketch at the end of this section).
Figure 1. One-step Pruning Workflow
Note: Bailin Li et al., "EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning," arXiv:2007.02491.
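The snippet below is a minimal PyTorch sketch of the evaluation step, not the toolkit's actual implementation: it recalibrates each candidate's batch-normalization statistics on a few training batches (adaptive BN) and keeps the candidate with the best validation accuracy. Producing the pruned candidates and the final fine-tuning are assumed to happen elsewhere; `candidates` is any iterable of pruned models.

```python
import torch

def adaptive_bn_recalibrate(model, train_loader, num_batches=30):
    """Re-estimate BatchNorm running statistics on a few training batches
    (the adaptive-BN evaluation idea from EagleEye)."""
    for m in model.modules():
        if isinstance(m, torch.nn.modules.batchnorm._BatchNorm):
            m.reset_running_stats()   # drop the statistics inherited from the full model
            m.momentum = None         # use a cumulative moving average instead
    model.train()  # BN layers update running statistics only in train mode
    with torch.no_grad():
        for i, (x, _) in enumerate(train_loader):
            if i >= num_batches:
                break
            model(x)

@torch.no_grad()
def top1_accuracy(model, val_loader):
    """Top-1 accuracy on a (possibly sub-sampled) validation set."""
    model.eval()
    correct = total = 0
    for x, y in val_loader:
        correct += (model(x).argmax(dim=1) == y).sum().item()
        total += y.numel()
    return correct / total

def select_best_subnet(candidates, train_loader, val_loader):
    """Score each pruned candidate without fine-tuning and return the most
    promising one; only this subnetwork is then fine-tuned."""
    best_acc, best = -1.0, None
    for candidate in candidates:
        adaptive_bn_recalibrate(candidate, train_loader)
        acc = top1_accuracy(candidate, val_loader)
        if acc > best_acc:
            best_acc, best = acc, candidate
    return best
```

Recalibrating BN statistics is what makes the no-fine-tuning scores meaningful: a pruned model still carries the full model's running mean and variance, which no longer match its activations, so evaluating it directly would understate its potential accuracy.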