TEE/GlobalPlatform - 2024.1 English

AI Engine System Software Driver Reference Manual (UG1642)

Document ID: UG1642
Release: 2024.1 English

When the AI Engine is configured as secure and a non-secure application needs to request services from it, the GlobalPlatform standard can be used. GlobalPlatform is the standard for secure digital services and devices.

GlobalPlatform is a non-profit, member-driven technical association. It standardizes frameworks that ensure devices are secure enough to protect against threats and attacks, and it enables the AI Engine to deliver secure digital services to end users. TEE is a project under GlobalPlatform.

A trusted execution environment (TEE) is an environment that runs alongside a rich operating system and provides security services to that rich environment. A TEE provides an execution environment with secure capabilities, which are either available to trusted applications running inside the TEE or exposed externally to client applications.

The TEE provides shared memory between the client application and the trusted application, as well as interfaces for client application authentication, atomic field access from the client application, and client application multi-threading. The following diagram shows the TEE client API architecture.

Figure 1. TEE Client API Architecture
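As a sketch of the client-side flow in the architecture above, the following C fragment shows the canonical GlobalPlatform TEE Client API call sequence: initialize a context, open a session with a trusted application, invoke a command, and tear down in reverse order. The TA UUID and command ID are placeholders, and minimal stand-ins for the libteec types and functions are defined locally so the sketch is self-contained; on a real system these come from `<tee_client_api.h>` (for example, the OP-TEE client library).

```c
#include <stdint.h>
#include <stdio.h>

/* Minimal stand-ins for the GlobalPlatform TEE Client API types.
 * On a real system these come from <tee_client_api.h> (libteec). */
typedef uint32_t TEEC_Result;
#define TEEC_SUCCESS      0x00000000
#define TEEC_LOGIN_PUBLIC 0x00000000
typedef struct { int fd; } TEEC_Context;
typedef struct { int id; } TEEC_Session;
typedef struct {
    uint32_t timeLow; uint16_t timeMid, timeHiAndVersion;
    uint8_t  clockSeqAndNode[8];
} TEEC_UUID;

/* Stub implementations so the sketch runs stand-alone; the real calls
 * enter the kernel TEE framework and cross into the secure world. */
static TEEC_Result TEEC_InitializeContext(const char *name, TEEC_Context *ctx)
{ (void)name; ctx->fd = 0; return TEEC_SUCCESS; }
static TEEC_Result TEEC_OpenSession(TEEC_Context *ctx, TEEC_Session *sess,
                                    const TEEC_UUID *dest, uint32_t method,
                                    const void *data, void *op, uint32_t *origin)
{ (void)ctx; (void)dest; (void)method; (void)data; (void)op; (void)origin;
  sess->id = 1; return TEEC_SUCCESS; }
static TEEC_Result TEEC_InvokeCommand(TEEC_Session *sess, uint32_t cmd,
                                      void *op, uint32_t *origin)
{ (void)sess; (void)cmd; (void)op; (void)origin; return TEEC_SUCCESS; }
static void TEEC_CloseSession(TEEC_Session *sess) { (void)sess; }
static void TEEC_FinalizeContext(TEEC_Context *ctx) { (void)ctx; }

/* Placeholder UUID for the trusted application (illustrative only). */
static const TEEC_UUID ta_uuid = { 0x12345678, 0x0000, 0x0000, { 0 } };

/* Canonical client application lifecycle. */
TEEC_Result run_client_sequence(void)
{
    TEEC_Context ctx;
    TEEC_Session sess;
    uint32_t origin = 0;
    TEEC_Result res;

    /* 1. Connect to the TEE. */
    res = TEEC_InitializeContext(NULL, &ctx);
    if (res != TEEC_SUCCESS)
        return res;
    /* 2. Open a session with the trusted application. */
    res = TEEC_OpenSession(&ctx, &sess, &ta_uuid, TEEC_LOGIN_PUBLIC,
                           NULL, NULL, &origin);
    if (res != TEEC_SUCCESS) {
        TEEC_FinalizeContext(&ctx);
        return res;
    }
    /* 3. Invoke a TA-defined command (command ID 0 is a placeholder). */
    res = TEEC_InvokeCommand(&sess, 0, NULL, &origin);
    /* 4. Tear down in reverse order. */
    TEEC_CloseSession(&sess);
    TEEC_FinalizeContext(&ctx);
    return res;
}
```

The function signatures above follow the GlobalPlatform TEE Client API specification; only the stub bodies are simplifications for illustration.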

In the case of an AI Engine application, some AI Engine operations need to run in a secure environment because they access privileged registers or system policies. Currently, EEMI provides the interface for a non-secure application to raise requests to the PLM. TEE provides a more generic solution for a non-secure client application to use services from a secure trusted application, regardless of where the trusted application runs: the trusted application can run outside of the PLM, on APU secure EL0, the RPU, or the PSM.

The Linux kernel provides a TEE driver framework that enables users to manage TEE contexts, TEE sessions within a context, shared memory between the trusted application and the client application, and remote procedure calls (RPC) from the trusted application. The open-source OP-TEE implementation is available for reference.
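Of the objects the kernel TEE framework manages, shared memory is worth a closer look: a client registers one of its own buffers so the trusted application can read command payloads from it and write results back. The following hedged sketch shows the shape of that registration using the GlobalPlatform shared-memory API; the stand-in types and stub body are defined locally for illustration, while the real definitions live in `<tee_client_api.h>`.

```c
#include <stdint.h>
#include <stddef.h>

/* Minimal stand-ins for the shared-memory part of the GlobalPlatform
 * TEE Client API; the real definitions live in <tee_client_api.h>. */
typedef uint32_t TEEC_Result;
#define TEEC_SUCCESS    0x00000000
#define TEEC_MEM_INPUT  0x00000001  /* client -> TA direction */
#define TEEC_MEM_OUTPUT 0x00000002  /* TA -> client direction */
typedef struct { int fd; } TEEC_Context;
typedef struct {
    void    *buffer;
    size_t   size;
    uint32_t flags;
} TEEC_SharedMemory;

/* Stub: the real call asks the kernel TEE framework to make the
 * buffer visible to the trusted application. */
static TEEC_Result TEEC_RegisterSharedMemory(TEEC_Context *ctx,
                                             TEEC_SharedMemory *shm)
{ (void)ctx; return (shm->buffer != NULL && shm->size > 0) ? TEEC_SUCCESS : 1; }
static void TEEC_ReleaseSharedMemory(TEEC_SharedMemory *shm)
{ shm->buffer = NULL; shm->size = 0; }

/* Register an existing client buffer for use by the trusted
 * application in both directions (command payload and result area). */
TEEC_Result share_buffer(TEEC_Context *ctx, void *buf, size_t len,
                         TEEC_SharedMemory *shm)
{
    shm->buffer = buf;
    shm->size   = len;
    shm->flags  = TEEC_MEM_INPUT | TEEC_MEM_OUTPUT;
    return TEEC_RegisterSharedMemory(ctx, shm);
}
```

When the buffer is no longer needed, the client releases it with `TEEC_ReleaseSharedMemory`, mirroring the registration.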

In the case of an AI Engine application, you can use the FPGA manager to open a TEE context for each FPGA region. Each FPGA region can contain an AI Engine-only partition, an AI Engine + PL partition, or a PL-only partition. Each AI Engine application can open sessions with the AI Engine TEE context. With the AI Engine TEE sessions, an AI Engine application can send commands to the trusted application to request AI Engine services at runtime. The AI Engine driver needs to run in the trusted application to interact with the AI Engine hardware and serve client application requests. An AI Engine client driver is required in the non-secure context, such as the Linux userspace, to pass service requests from the application to the AI Engine driver running in the secure environment.
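The per-application runtime flow described above can be sketched as follows. Note that the command IDs, names, and dispatch function here are hypothetical illustrations of the client-driver-to-trusted-application split, not the actual AI Engine driver API; in a real client driver, `aie_ta_invoke` would be a `TEEC_InvokeCommand` call on the application's session.

```c
#include <stdint.h>

/* Hypothetical command IDs an AI Engine client driver might define
 * for the AI Engine trusted application (illustrative only). */
enum aie_ta_cmd {
    AIE_TA_CMD_INIT_PARTITION = 0,
    AIE_TA_CMD_RUN_GRAPH      = 1,
    AIE_TA_CMD_READ_STATUS    = 2,
};

/* Stand-in for a TEE session handle (the real one is TEEC_Session). */
typedef struct { int open; } aie_tee_session;

/* Stub for TEEC_InvokeCommand-style dispatch: in the real flow the
 * request crosses into the secure world, where the AI Engine driver
 * running in the trusted application touches the hardware. */
static int aie_ta_invoke(aie_tee_session *s, enum aie_ta_cmd cmd)
{
    if (!s->open)
        return -1;           /* no session with the AI Engine TA */
    switch (cmd) {           /* the TA-side handler would run here */
    case AIE_TA_CMD_INIT_PARTITION:
    case AIE_TA_CMD_RUN_GRAPH:
    case AIE_TA_CMD_READ_STATUS:
        return 0;
    }
    return -1;
}

/* Each AI Engine application opens its own session with the AI Engine
 * TEE context and sends commands at runtime through it. */
int aie_app_run(aie_tee_session *s)
{
    if (aie_ta_invoke(s, AIE_TA_CMD_INIT_PARTITION))
        return -1;
    if (aie_ta_invoke(s, AIE_TA_CMD_RUN_GRAPH))
        return -1;
    return aie_ta_invoke(s, AIE_TA_CMD_READ_STATUS);
}
```

Because every request goes through the session, applications that lack a session with the AI Engine trusted application cannot reach the hardware, which is what provides the isolation between non-trusted clients.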

The following are benefits to using TEE:

  • The solution follows the GlobalPlatform specification. GlobalPlatform has been used in IoT, connected cars, smartphones, tablets, and more. Following the standard specification makes the solution more portable across different platforms and OSes.
  • The solution provides a safe way for non-trusted clients to request service from trusted applications.
  • The solution provides isolation between non-trusted clients.
  • It provides a single solution for dynamic compute resource allocation between different applications, even when they run on different processors or VMs.

The following are disadvantages to using TEE:

  • The solution is complex. AMD Versal™ developers need to learn new components, and client developers might need to learn them as well.
  • The solution requires maintenance.
  • The solution has runtime overhead: extra context switches are required even for a simple single-user case.