One-Step Pruning

Vitis AI User Guide (UG1414)

Document ID: UG1414
Release Date: 2023-09-28
Version: 3.5 English

One-step pruning implements the EagleEye algorithm (see the note below). It exploits a strong positive correlation between the accuracy of a pruned model measured with a simple yet efficient evaluation component, adaptive batch normalization, and that model's accuracy after fine-tuning. This lets you identify the subnetwork with the highest potential accuracy without fine-tuning every candidate. In short, the one-step pruning method searches for a set of subnetworks (that is, generated pruned models) that meet the required model size and selects the most promising one. The selected subnetwork is then retrained to recover accuracy.
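
The following is a minimal, illustrative sketch in plain PyTorch (not the Vitis AI pruning API) of the adaptive batch-normalization evaluation idea described above: for each candidate subnetwork, the BN running statistics are recalibrated on a small amount of training data, and the candidate is then scored on a held-out set without any fine-tuning. The function and variable names (recalibrate_bn, select_best_subnet, candidates, loaders) are hypothetical.

```python
import copy
import torch


def recalibrate_bn(model, calib_loader, num_batches=100, device="cuda"):
    """Reset BN running statistics and re-estimate them with a few forward passes."""
    for m in model.modules():
        if isinstance(m, torch.nn.modules.batchnorm._BatchNorm):
            m.reset_running_stats()
    model.train()  # BN updates its running statistics only in train mode
    with torch.no_grad():
        for i, (images, _) in enumerate(calib_loader):
            if i >= num_batches:
                break
            model(images.to(device))


@torch.no_grad()
def evaluate(model, val_loader, device="cuda"):
    """Top-1 accuracy on a held-out set."""
    model.eval()
    correct = total = 0
    for images, labels in val_loader:
        preds = model(images.to(device)).argmax(dim=1).cpu()
        correct += (preds == labels).sum().item()
        total += labels.numel()
    return correct / total


def select_best_subnet(candidates, calib_loader, val_loader, device="cuda"):
    """Adaptive-BN evaluation: rank candidate pruned models without fine-tuning."""
    best_model, best_acc = None, -1.0
    for subnet in candidates:  # candidates: pruned models meeting the target ratio
        subnet = copy.deepcopy(subnet).to(device)
        recalibrate_bn(subnet, calib_loader, device=device)
        acc = evaluate(subnet, val_loader, device=device)
        if acc > best_acc:
            best_model, best_acc = subnet, acc
    return best_model, best_acc  # the winner is then fine-tuned to recover accuracy
```

In practice, the Vitis AI one-step pruner performs this search, calibration, and selection internally; the sketch only shows why a short BN recalibration pass makes the candidates' raw accuracies a reliable proxy for their fine-tuned accuracies.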

The pruning steps are as follows:

  1. Search for subnetworks that meet the required pruning ratio.
  2. Select a potential network from several subnetworks with an evaluation component.
  3. Fine-tune the pruned model.
Figure 1. One-step Pruning Workflow

Note: Bailin Li et al., EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning, arXiv:2007.02491