#include "xf_data_analytics/classification/decision_tree_predict.hpp"
template <typename MType,
          unsigned int WD,
          unsigned int MAX_FEA_NUM,
          unsigned int MAX_TREE_DEPTH = 20,
          unsigned MAX_CAT_BITS = 8>
void decisionTreePredict(hls::stream<ap_uint<WD> > dstrm_batch[MAX_FEA_NUM],
                         hls::stream<bool>& estrm_batch,
                         hls::stream<ap_uint<512> >& treeStrm,
                         hls::stream<bool>& treeTag,
                         hls::stream<ap_uint<MAX_CAT_BITS> >& predictionsStrm,
                         hls::stream<bool>& predictionsTag)
decisionTreePredict: top function of Decision Tree Predict.
This function first loads the decision tree (via the corresponding function getTree) from treeStrm. It then reads samples one by one from dstrm_batch and writes each sample's category id to the predictionsStrm stream.
Note that treeStrm is a 512-bit stream and each 512-bit word carries two nodes. Within each word, range(0, 71) is node[i].nodeInfo and range(256, 327) is node[i+1].nodeInfo; range(192, 255) is node[i].threshold and range(448, 511) is node[i+1].threshold. For details of the Node struct, refer to "decision_tree.hpp". Samples in the input sample stream should be converted from MType into ap_uint<WD>.
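To make the layout concrete, below is a minimal packing sketch. It is illustrative only: packTwoNodes and sampleToBits are hypothetical helper names, the 72-bit nodeInfo and 64-bit threshold field widths are inferred from the bit ranges quoted above (the authoritative definition is in "decision_tree.hpp"), and the end-flag convention on treeTag is an assumption. The second helper shows one way to reinterpret a double sample as ap_uint<64> before streaming it in, assuming WD = 64.

#include <cstring>
#include <ap_int.h>
#include <hls_stream.h>

// Hypothetical helper: pack node[i] and node[i+1] into one 512-bit tree word
// using the bit ranges documented above. Field widths are inferred here, not
// taken from decision_tree.hpp.
void packTwoNodes(ap_uint<72> nodeInfo0, ap_uint<64> threshold0,
                  ap_uint<72> nodeInfo1, ap_uint<64> threshold1,
                  hls::stream<ap_uint<512> >& treeStrm,
                  hls::stream<bool>& treeTag) {
    ap_uint<512> w = 0;
    w.range(71, 0) = nodeInfo0;     // node[i].nodeInfo
    w.range(255, 192) = threshold0; // node[i].threshold
    w.range(327, 256) = nodeInfo1;  // node[i+1].nodeInfo
    w.range(511, 448) = threshold1; // node[i+1].threshold
    treeStrm.write(w);
    treeTag.write(false); // assumed convention: false per valid word, one final true closes the tree stream
}

// Hypothetical helper: reinterpret a double sample as ap_uint<64> (WD = 64)
// before writing it into dstrm_batch, as the note above requires.
ap_uint<64> sampleToBits(double v) {
    unsigned long long bits = 0;
    std::memcpy(&bits, &v, sizeof(v)); // bit-level reinterpretation, no value conversion
    return ap_uint<64>(bits);
}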
Parameters:
MType | The data type of the samples
WD | The bit width of data type MType, i.e. 8 * sizeof(MType) (for example, 64 when MType is double)
MAX_FEA_NUM | The maximum number of features the function can support
MAX_TREE_DEPTH | The maximum tree depth the function can support
MAX_CAT_BITS | The maximum number of bits used to encode a category id
dstrm_batch | Input sample data streams of ap_uint<WD>
estrm_batch | End flag stream for input data
treeStrm | Decision tree node stream (512-bit words)
treeTag | End flag stream for decision tree nodes
predictionsStrm | Output prediction streams
predictionsTag | End flag stream for output
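For reference, a sketch of how the template might be instantiated from a top-level wrapper is given below. The wrapper name dtPredictTop and the choices MType = double, WD = 64, MAX_FEA_NUM = 16 are illustrative assumptions, and the namespace qualification is assumed from the library's header layout; only MAX_TREE_DEPTH and MAX_CAT_BITS match the documented template defaults.

#include <ap_int.h>
#include <hls_stream.h>
#include "xf_data_analytics/classification/decision_tree_predict.hpp"

static const unsigned int WD = 64;          // bit width of one sample (double)
static const unsigned int MAX_FEA_NUM = 16; // assumed feature budget
static const unsigned int MAX_TREE_DEPTH = 20;
static const unsigned int MAX_CAT_BITS = 8;

// Hypothetical top-level wrapper that instantiates the predictor.
void dtPredictTop(hls::stream<ap_uint<WD> > dstrm_batch[MAX_FEA_NUM],
                  hls::stream<bool>& estrm_batch,
                  hls::stream<ap_uint<512> >& treeStrm,
                  hls::stream<bool>& treeTag,
                  hls::stream<ap_uint<MAX_CAT_BITS> >& predictionsStrm,
                  hls::stream<bool>& predictionsTag) {
    xf::data_analytics::classification::decisionTreePredict<double, WD, MAX_FEA_NUM, MAX_TREE_DEPTH, MAX_CAT_BITS>(
        dstrm_batch, estrm_batch, treeStrm, treeTag, predictionsStrm, predictionsTag);
}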