| Category | Operations | Performance Benefit |
|---|---|---|
| Activation | ReLU, PReLU, GeLU, Tanh, Sigmoid, Swish | Eliminates separate activation pass |
| Scaling | Scale, Clip | Fuses quantization/normalization |
| Addition | Bias, Matrix Add | Combines common DNN operations |
| Multiplication | Matrix Multiply | Enables element-wise scaling |
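To make the benefit concrete, the sketch below contrasts an unfused pipeline (matrix multiply, then bias add, then ReLU, each a separate pass over the intermediate tensor) with a fused version that applies the bias and activation while each output element is still being computed. This is a minimal NumPy/pure-Python illustration of the idea, not an actual fused GPU kernel; the function names are hypothetical.

```python
import numpy as np

def separate_passes(x, w, b):
    """Unfused: each operation reads and writes the full intermediate tensor."""
    y = x @ w                # pass 1: matrix multiply
    y = y + b                # pass 2: bias addition
    y = np.maximum(y, 0.0)   # pass 3: ReLU activation
    return y

def fused_kernel(x, w, b):
    """Fused sketch: bias and ReLU are applied as an epilogue while each
    output element is still in a local accumulator, so the pre-activation
    intermediate tensor is never materialized in memory."""
    m, k = x.shape
    _, n = w.shape
    out = np.empty((m, n))
    for i in range(m):
        for j in range(n):
            acc = 0.0
            for p in range(k):
                acc += x[i, p] * w[p, j]   # Multiplication: matrix multiply
            acc += b[j]                    # Addition: bias
            out[i, j] = max(acc, 0.0)      # Activation: ReLU
    return out
```

Both functions produce identical results; the fused form saves the extra memory traffic of writing and re-reading the intermediate tensor, which is where the performance benefit in the table comes from.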