After training is completed, you can run an evolutionary search based on the neural-network twins to find a subnetwork with the best trade-off between FLOPs and accuracy within a minimum and maximum FLOPs range.
pareto_global = ofa_pruner.run_evolutionary_search(ofa_model, calibration_fn, (train_loader,), eval_fn, (val_loader,), 'acc1', 'max', min_flops=230, max_flops=250)
ofa_pruner.save_subnet_config(pareto_global, 'pareto_global.txt')
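The call above expects two user-supplied callables: calibration_fn, which is run on the training loader to re-calibrate BatchNorm statistics for each candidate subnetwork, and eval_fn, which is run on the validation loader and returns the metric named in the search call ('acc1'). A minimal sketch of these helpers is shown below; the exact signatures are assumptions and should be adapted to your training setup.

import torch

def calibration_fn(model, dataloader, number_forward=16):
    # Re-estimate BatchNorm running statistics on a few training batches
    # before each candidate subnetwork is evaluated (assumed signature).
    device = next(model.parameters()).device
    model.train()
    with torch.no_grad():
        for index, (images, _) in enumerate(dataloader):
            model(images.to(device))
            if index >= number_forward:
                break

def eval_fn(model, dataloader):
    # Return top-1 accuracy on the validation set; the metric name 'acc1'
    # passed to the search call is assumed to refer to this value.
    device = next(model.parameters()).device
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for images, targets in dataloader:
            preds = model(images.to(device)).argmax(dim=1)
            correct += (preds == targets.to(device)).sum().item()
            total += targets.size(0)
    return 100.0 * correct / total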
The search result looks like the following:
{
  "230": {
    "net_id": "net_evo_0_crossover_0",
    "mode": "evaluate",
    "acc1": 69.04999542236328,
    "flops": 228.356192,
    "params": 3.096728,
    "subnet_setting": [...]
  },
  "240": {
    "net_id": "net_evo_0_mutate_1",
    "mode": "evaluate",
    "acc1": 69.22000122070312,
    "flops": 243.804128,
    "params": 3.114,
    "subnet_setting": [...]
  }
}
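Each top-level key is the FLOPs bucket that the corresponding candidate falls into within the specified search range. To deploy one of the found subnetworks, the saved configuration can be loaded back and turned into a static model. The snippet below assumes load_subnet_config and get_static_subnet counterparts to save_subnet_config; check the API of your pruner version.

# Load the saved Pareto front and materialize the entry for the "240" bucket
# (helper names are assumptions; adjust them to your pruner version).
pareto_global = ofa_pruner.load_subnet_config('pareto_global.txt')
static_subnet, static_subnet_config, flops, params = ofa_pruner.get_static_subnet(
    ofa_model, pareto_global['240']['subnet_setting'])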