The Vitis™ AI Library reads model parameters from a per-model configuration file, which enables uniform configuration management across models. The configuration file is located at /usr/share/vitis_ai_library/models/[model_name]/[model_name].prototxt.
```
model {
  name: "yolov3_voc"
  kernel {
    name: "yolov3_voc"
    mean: 0.0
    mean: 0.0
    mean: 0.0
    scale: 0.00390625
    scale: 0.00390625
    scale: 0.00390625
  }
  model_type : YOLOv3
  yolo_v3_param {
    …
  }
  is_tf: false
}
```
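When an application creates a model by name, the library looks up this prototxt (and the compiled model files in the same directory) automatically. The following C++ sketch illustrates that flow, modeled on the Vitis AI Library YOLOv3 sample; the header, class, and result fields shown are assumptions based on that sample API rather than definitions taken from this section.

```cpp
// Minimal sketch: creating a model by name so that the library picks up
// /usr/share/vitis_ai_library/models/yolov3_voc/yolov3_voc.prototxt.
// The header, class, and result-field names follow the YOLOv3 sample API
// and should be treated as assumptions here.
#include <iostream>
#include <opencv2/opencv.hpp>
#include <vitis/ai/yolov3.hpp>

int main(int argc, char* argv[]) {
  // "yolov3_voc" matches the model name in the prototxt above.
  auto model = vitis::ai::YOLOv3::create("yolov3_voc");
  cv::Mat image = cv::imread(argv[1]);

  // The mean/scale preprocessing defined in the prototxt is applied internally.
  auto results = model->run(image);
  for (const auto& box : results.bboxes) {
    std::cout << "label " << box.label << " score " << box.score << std::endl;
  }
  return 0;
}
```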
| Model/Kernel | Parameter Type | Description |
|---|---|---|
| model | name | Same as ${MODEL_NAME}. |
| model | model_type | Type of the model used; it must be one of the model types supported by the library (YOLOv3 in the sample above). |
| kernel | name | The result of your DNNC compile. This may have an extra postfix _0. Include the postfix with the name, for example, inception_v1_0. |
| kernel | mean | Three lines corresponding to the mean values of the B, G, and R channels, which are predefined in the model. They are listed in BGR order. |
| kernel | scale | Three lines corresponding to the per-channel normalization scale, listed in BGR order. If the model had no scale in the training stage, this value should be 1. |
| model | is_tf | Boolean type. If your model is trained with TensorFlow, set the value to TRUE. It can be left blank in the prototxt or set to FALSE if the model is a Caffe or PyTorch model. |
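The mean and scale entries describe a simple per-channel affine transform applied to the input image before inference: output = (input − mean) × scale, evaluated in B, G, R channel order. The sketch below only illustrates this arithmetic with the values from the yolov3_voc example (0.00390625 is 1/256); the helper function and its signature are hypothetical and not part of the library API.

```cpp
// Hedged sketch (not a library API): per-channel normalization as described
// by the mean and scale entries in the prototxt.
#include <array>
#include <cstdint>
#include <vector>

// Hypothetical helper: normalizes an interleaved 8-bit BGR image.
std::vector<float> normalize_bgr(const std::vector<std::uint8_t>& bgr_pixels,
                                 const std::array<float, 3>& mean,   // B, G, R
                                 const std::array<float, 3>& scale)  // B, G, R
{
  std::vector<float> out(bgr_pixels.size());
  for (std::size_t i = 0; i < bgr_pixels.size(); ++i) {
    const std::size_t channel = i % 3;  // 0 = B, 1 = G, 2 = R
    out[i] = (static_cast<float>(bgr_pixels[i]) - mean[channel]) * scale[channel];
  }
  return out;
}

int main() {
  // Values from the yolov3_voc sample: mean 0.0 and scale 0.00390625 (1/256)
  // for each channel, so each 8-bit pixel is simply divided by 256.
  const std::array<float, 3> mean{0.0f, 0.0f, 0.0f};
  const std::array<float, 3> scale{0.00390625f, 0.00390625f, 0.00390625f};
  const std::vector<std::uint8_t> pixels{0, 128, 255};  // one BGR pixel
  auto normalized = normalize_bgr(pixels, mean, scale);
  return 0;
}
```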