The following table lists the supported operations and APIs for vai_q_tensorflow2.
Layer Types | Supported Layers | Description |
---|---|---|
Core | tf.keras.layers.InputLayer | |
Core | tf.keras.layers.Dense | |
Core | tf.keras.layers.Activation | If 'activation' is 'relu' or 'linear', the layer is quantized. If 'activation' is 'sigmoid' or 'swish', it is converted to hard-sigmoid or hard-swish, respectively, and then quantized by default. Otherwise, the layer is not quantized. |
Convolution | tf.keras.layers.Conv2D | |
Convolution | tf.keras.layers.DepthwiseConv2D | |
Convolution | tf.keras.layers.Conv2DTranspose | |
Pooling | tf.keras.layers.AveragePooling2D | |
Pooling | tf.keras.layers.MaxPooling2D | |
Pooling | tf.keras.layers.GlobalAveragePooling2D | |
Normalization | tf.keras.layers.BatchNormalization | By default, BatchNormalization layers are fused with the preceding convolution layers. If they cannot be fused, they are converted to depthwise convolutions. In QAT mode, BatchNormalization layers are pseudo-fused if train_with_bn is set to TRUE; they are fully fused when the get_deploy_model function is called. |
Regularization | tf.keras.layers.Dropout | By default, Dropout layers are removed. In QAT mode, Dropout layers are retained if remove_dropout is set to FALSE; they are removed when the get_deploy_model function is called. |
Reshaping | tf.keras.layers.Reshape | |
Reshaping | tf.keras.layers.Flatten | |
Reshaping | tf.keras.layers.UpSampling2D | |
Reshaping | tf.keras.layers.ZeroPadding2D | |
Merging | tf.keras.layers.Concatenate | |
Merging | tf.keras.layers.Add | |
Merging | tf.keras.layers.Multiply | |
Activation | tf.keras.layers.ReLU | |
Activation | tf.keras.layers.Softmax | The input for the Softmax layer is quantized. It can run on the standalone Softmax IP for acceleration. |
Activation | tf.keras.layers.LeakyReLU | Only 'alpha'=0.1 is supported on the DPU. For other values, the layer is not quantized and is mapped to the CPU. |
Hard_sigmoid | tf.keras.layers.ReLU(6.)(x + 3.) * (1. / 6.) | The supported hard_sigmoid is the MobileNetV3 variant. tf.keras.activations.hard_sigmoid is not supported yet and will not be quantized. |
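The hard-sigmoid and hard-swish substitutions referenced in the table follow the MobileNetV3 formulation, relu6(x + 3) * (1/6). A minimal plain-Python sketch of that arithmetic (the function names here are illustrative, not part of the vai_q_tensorflow2 API):

```python
def relu6(x):
    # Clip the input to the range [0, 6], as tf.keras.layers.ReLU(6.) does.
    return max(0.0, min(6.0, x))

def hard_sigmoid(x):
    # MobileNetV3-style hard_sigmoid: relu6(x + 3) * (1/6).
    # This is the piecewise-linear approximation the quantizer
    # substitutes for 'sigmoid' activations by default.
    return relu6(x + 3.0) * (1.0 / 6.0)

def hard_swish(x):
    # hard_swish(x) = x * hard_sigmoid(x); substituted for 'swish'.
    return x * hard_sigmoid(x)

print(hard_sigmoid(0.0))  # → 0.5
print(hard_sigmoid(3.0))  # → 1.0 (saturates above +3)
print(hard_sigmoid(-3.0)) # → 0.0 (saturates below -3)
```

Because both functions are piecewise linear, they quantize cleanly to fixed-point arithmetic, which is why the quantizer prefers them over the exact sigmoid/swish curves.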
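The BatchNormalization fusion noted in the table folds the BN scale and shift into the weights and bias of the preceding convolution, so the deployed graph contains a single convolution. A NumPy sketch of the standard folding arithmetic (fold_bn_into_conv is an illustrative helper, not a vai_q_tensorflow2 function):

```python
import numpy as np

def fold_bn_into_conv(w, b, gamma, beta, mean, var, eps=1e-3):
    """Fold BatchNormalization(gamma, beta, mean, var) into a
    convolution with weights w and bias b.

    w is assumed to have its output-channel axis last, matching
    the per-channel BN statistics.
    """
    # Per-output-channel scale derived from the BN statistics.
    scale = gamma / np.sqrt(var + eps)
    # Scale the weights and re-center the bias so that
    # conv(x; w_f, b_f) == bn(conv(x; w, b)).
    w_folded = w * scale
    b_folded = (b - mean) * scale + beta
    return w_folded, b_folded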