One-step Pruning

Vitis AI Optimizer User Guide (UG1333)

Document ID: UG1333
Release Date: 2023-01-12
Version: 3.0 English

One-step pruning implements the EagleEye [1] algorithm. Its key component is a simple yet efficient evaluation step, adaptive batch normalization, which yields a strong positive correlation between a pruned model's evaluated accuracy and its accuracy after fine-tuning. This lets you identify the subnetwork with the highest potential accuracy without actually fine-tuning each candidate. In short, the one-step pruning method searches over a set of subnetworks (that is, generated pruned models) that meet the required model size and selects the most promising one. The selected subnetwork is then retrained to recover accuracy.
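The adaptive batch normalization idea can be illustrated with a minimal, framework-free sketch (the class and function names here are illustrative, not part of the Vitis AI Optimizer API): after pruning, a layer's stored BN running statistics no longer match its activation distribution, so a few training-mode forward passes refresh those statistics before the candidate is scored, with no weight updates involved.

```python
import random

class BatchNorm1d:
    """Minimal 1-D batch norm that tracks running statistics (used at inference)."""
    def __init__(self, momentum=0.1):
        self.momentum = momentum
        self.running_mean = 0.0
        self.running_var = 1.0

    def observe(self, batch):
        # Training-mode forward: update running stats from the current batch.
        n = len(batch)
        mean = sum(batch) / n
        var = sum((x - mean) ** 2 for x in batch) / n
        self.running_mean += self.momentum * (mean - self.running_mean)
        self.running_var += self.momentum * (var - self.running_var)

def adapt_bn(bn, batches):
    """Adaptive BN: a few training-mode passes refresh statistics; weights stay frozen."""
    for batch in batches:
        bn.observe(batch)

random.seed(0)
bn = BatchNorm1d()
# After pruning, activations are distributed around 5.0, but the stale stats say 0.0.
batches = [[random.gauss(5.0, 1.0) for _ in range(64)] for _ in range(50)]
adapt_bn(bn, batches)
print(round(bn.running_mean, 1))  # converges toward 5.0
```

The point of the sketch is that recalibrating statistics is far cheaper than fine-tuning weights, which is why many candidates can be scored this way.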

The pruning steps are as follows:

  1. Search for subnetworks that meet the required pruning ratio.
  2. Select the most promising subnetwork from the candidates using the adaptive batch normalization evaluation component.
  3. Fine-tune the pruned model.
Figure 1. One-step Pruning Workflow
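The three steps above can be sketched as a search loop. Everything here is a toy stand-in under stated assumptions, not the Vitis AI Optimizer API: the pruning-strategy sampler and the scoring function are placeholders for the real subnetwork generation and the adaptive-BN evaluation on a small validation set.

```python
import random

def random_pruning_strategy(num_layers, target_ratio, rng):
    # Step 1: sample per-layer pruning ratios whose average meets the target.
    ratios = [rng.uniform(0.0, 2 * target_ratio) for _ in range(num_layers)]
    scale = target_ratio * num_layers / sum(ratios)
    return [min(0.9, r * scale) for r in ratios]

def evaluate_with_adaptive_bn(strategy, rng):
    # Step 2 (stand-in): apply the strategy, recalibrate BN statistics,
    # then score on a small evaluation set. This toy score simply prefers
    # evenly spread pruning, plus a little evaluation noise.
    mean = sum(strategy) / len(strategy)
    spread = sum((r - mean) ** 2 for r in strategy) / len(strategy)
    return -spread + rng.gauss(0.0, 1e-3)

def one_step_prune(num_layers=8, target_ratio=0.5, candidates=20, seed=0):
    rng = random.Random(seed)
    scored = []
    for _ in range(candidates):                       # Step 1: search subnetworks
        s = random_pruning_strategy(num_layers, target_ratio, rng)
        scored.append((evaluate_with_adaptive_bn(s, rng), s))
    best_score, best = max(scored)                    # Step 2: select the best
    return best                                       # Step 3: fine-tune `best` (omitted)

best = one_step_prune()
print(len(best), round(sum(best) / len(best), 2))
```

Only the single selected subnetwork is then fine-tuned, which is what makes the method "one-step" compared with iterative prune-and-retrain loops.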

Note:
  1. Bailin Li et al., EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning, arXiv:2007.02491