Vitis Model Composer Hub Block for Verification - 2022.2 English

Vitis Model Composer User Guide (UG1483)

Document ID
UG1483
Release Date
2023-01-13
Version
2022.2 English

To verify the AI Engine design, enable the Create testbench option on the Hub block and then choose from the available simulator options, as shown in the following figure.

Figure 1. Vitis Model Composer Hub: Testbench and Simulator Options

Enabling the Create testbench option by itself only logs the test data (the stimulus at the input of your design) and the simulation results as test vectors for later use as "golden" references. By default, no simulator is selected; to verify the design, you must also select the Run cycle-approximate AIE Simulation (SystemC) after code generation option.

The Simulation timeout value limits the execution to the specified number of cycles. This is necessary because the input data is finite: if no timeout value is specified, the AI Engine kernels are invoked repeatedly and the graph runs indefinitely. To avoid this situation, select the Run cycle-approximate AIE Simulation (SystemC) after code generation check box and specify the Simulation timeout value as shown in the following figure.

Figure 2. AIE Simulation: Timeout Value

The default timeout value is 50,000 cycles; the simulation terminates after the specified number of clock cycles.

When only the Create testbench option is enabled and code is generated, the target directory created by Model Composer contains the following files (not a comprehensive list) in addition to those listed in Code Generation.

Table 1. Target Directory
Directory/File Sub-directory/File Description
data/ input/ Contains files that capture the input stimuli from Simulink.
  reference_output/ Contains files that capture the simulation output from Simulink.

When the simulator option is enabled, the AI Engine code verification advances in three phases:

  1. Compiling the AI Engine graph design.
  2. Running simulation using the AI Engine simulator.
  3. Verifying the simulation results by comparing the output with the golden reference output.
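The first two phases can be pictured as invocations of the standalone AI Engine tools. The following is a minimal sketch of the command lines involved; the exact flags that the generated scripts pass are not documented here, so the options shown (`--target`, `--workdir`, `--pkg-dir`, `--simulation-cycle-timeout`) and the `graph.cpp` source name are assumptions based on the standalone aiecompiler and aiesimulator tools.

```python
# Hedged sketch of the compile and simulate phases as command lines.
# Flag names and the graph source filename are assumptions, not the
# literal contents of the scripts Model Composer generates.

def aiecompile_cmd(graph_src: str, work_dir: str = "Work_aiesim") -> list[str]:
    # Phase 1: compile the AI Engine graph into the work directory.
    return ["aiecompiler", "--target=hw", f"--workdir={work_dir}", graph_src]

def aiesim_cmd(work_dir: str = "Work_aiesim",
               timeout_cycles: int = 50_000) -> list[str]:
    # Phase 2: run the cycle-approximate SystemC simulator, bounding
    # execution with the Simulation timeout value (default 50,000 cycles).
    return ["aiesimulator", f"--pkg-dir={work_dir}",
            f"--simulation-cycle-timeout={timeout_cycles}"]
```

Phase 3, the comparison against the golden reference output, is performed by Model Composer after the simulator finishes.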

After clicking Generate, you can monitor the compilation, simulation, and verification progress of the AI Engine graph code from the Progress window (see the following figure).

Figure 3. Graph Code Progress

After successful compilation and simulation, Model Composer automatically compares the target output with the golden output and returns the following message in the Progress window (or in the corresponding simulation log files).

Comparing simulation results ...
Output data file : data/aiesimulator_output/Out1.txt.mod
Reference data file : data/reference_output/Out1.txt
Simulation results MATCH.
**********************************************************
Test PASSED
Verification Complete
Note: In some scenarios, the simulator output produces fewer samples than the golden output, or vice versa. In such cases, the test result still shows as 'PASS.' This indicates that the first 'n' lines of the simulation output match the first 'n' lines of the golden output, so the results are only partially matched. The .diff file in the corresponding simulator output directory captures any differences from the reference output.
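The partial-match behavior described in the note can be illustrated with a short sketch. This is not Model Composer's actual implementation; it is a hypothetical `compare_outputs` helper that mirrors the described semantics: only the first 'n' overlapping lines are compared (where 'n' is the length of the shorter file), and any mismatches are written to a .diff file next to the simulator output.

```python
# Minimal sketch (assumed behavior, not the tool's real code) of the
# golden-output comparison step: compare only the overlapping lines,
# and record any mismatches in a .diff file.
from pathlib import Path

def compare_outputs(output_file: str, reference_file: str) -> bool:
    """Return True ("Test PASSED") if all overlapping lines match."""
    out_lines = Path(output_file).read_text().splitlines()
    ref_lines = Path(reference_file).read_text().splitlines()
    n = min(len(out_lines), len(ref_lines))  # only the overlap is compared

    diffs = [
        f"line {i + 1}: got {out!r}, expected {ref!r}"
        for i, (out, ref) in enumerate(zip(out_lines[:n], ref_lines[:n]))
        if out != ref
    ]
    # Differences, if any, are captured next to the simulator output.
    Path(output_file + ".diff").write_text("\n".join(diffs))
    return not diffs
```

Note that with this scheme, a reference file longer than the simulator output still passes as long as the overlapping lines agree, which matches the 'PASS' behavior described above.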
Table 2. Target Directories
Directory/File Sub-directory/File Description
data/ aiesimulator_output/ Contains .txt files that capture the output of the AI Engine simulation. In addition to each actual output file, a .diff file is generated to capture any differences from the golden result.

In addition, the aiecompiler writes various configuration and binary files to the Work_aiesim directory. For more information on the structure and contents of this compilation-specific directory, refer to the AI Engine Tools and Flows User Guide (UG1076).