Introduction

Automatic Digital Pre-distortion Design Generation for AI Engine (XAPP1391)

Document ID: XAPP1391
Release Date: 2023-05-24
Revision: 1.0 English

In 4G long-term evolution (LTE) and 5G new radio (NR) base stations, power amplifiers typically account for 70% of the total power consumption of a remote radio system. The following figure shows the input and output power levels of a typical power amplifier (PA), whose operating region spans input powers from 0 to Pi1, over which the relationship between input and output power is almost linear. Though the output power can increase to Po2, the intermodulation products and other distortions caused by the non-linearity of the PA would violate the requirements on adjacent channel interference and the spectral emission mask. Digital pre-distortion (DPD) is a technology widely adopted in the industry to improve output power by applying a set of non-linear filters to the digital signal to compensate for these distortions. As a result, the linear region of the PA extends from Po1 to Po2.
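The linearization idea can be illustrated with a toy memoryless model: if the PA compresses its input, a pre-distorter that expands the input by the inverse characteristic makes the cascade approximately linear. The cubic PA model and its third-order inverse below are made-up placeholders for illustration only, not measured PA behavior from this application note.

```python
import numpy as np

# Toy memoryless PA: gain compression modeled by a cubic term.
# The coefficient a = 0.1 is an assumed, illustrative value.
def pa(x, a=0.1):
    return x - a * x**3

# Third-order pre-distorter that approximately inverts the PA,
# so that pa(dpd(x)) is close to x over the operating range.
def dpd(x, a=0.1):
    return x + a * x**3

x = np.linspace(0.0, 0.8, 9)          # input drive levels
err_raw = np.max(np.abs(pa(x) - x))       # deviation without DPD
err_dpd = np.max(np.abs(pa(dpd(x)) - x))  # deviation with DPD
```

With these assumed values, the residual error of the pre-distorted chain is roughly an order of magnitude below the raw compression error, which is the effect pictured in Figure 1.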

Figure 1. Input and Output Power of Amplifiers

A widely adopted DPD algorithm is based on the generalized memory polynomial (GMP) architecture proposed in A generalized memory polynomial model for digital pre-distortion of RF power amplifiers [4].

Figure 2. DPD GMP Model

The last equality combines the GMP terms of the same delay into a polynomial f(|x|) that can be pre-calculated and stored in a look-up table (LUT). The complexity of the DPD is proportional to the number of LUTs, which are indexed by the ordered pairs (d, m).
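The LUT-based evaluation can be sketched as follows, under the common GMP formulation y[n] = Σ over (d, m) of x[n-d] · f_{d,m}(|x[n-d-m]|). This is a minimal NumPy model, not the generated AI Engine code; the term set, polynomial coefficients, and LUT depth are made-up placeholders, and the delays are circular for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
terms = [(0, 0), (1, 0), (0, 1)]   # example (d, m) pairs selected by the user
# One polynomial f(|x|) per (d, m) pair; coefficients stored low-to-high order.
poly_coef = {t: rng.normal(size=4) * 0.1 for t in terms}

def build_luts(poly_coef, depth=256, max_mag=1.0):
    # Pre-compute each f(|x|) on a uniform |x| grid, one LUT per (d, m).
    grid = np.linspace(0.0, max_mag, depth)
    return {t: np.polyval(c[::-1], grid) for t, c in poly_coef.items()}, grid

def gmp_dpd(x, luts, grid):
    y = np.zeros_like(x)
    for (d, m), lut in luts.items():
        xd = np.roll(x, d)                  # x[n-d] (circular delay)
        mag = np.abs(np.roll(x, d + m))     # |x[n-d-m]| indexes the LUT
        idx = np.minimum((mag / grid[-1] * (len(grid) - 1)).astype(int),
                         len(grid) - 1)     # clamp to the table range
        y += xd * lut[idx]
    return y

x = rng.normal(size=64) * 0.3 + 1j * rng.normal(size=64) * 0.3
luts, grid = build_luts(poly_coef)
y = gmp_dpd(x, luts, grid)
```

Each (d, m) pair costs one table lookup and one complex multiply per sample, which is why the DPD complexity scales with the number of LUTs rather than with the polynomial order.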

The DPD sample rate should be three to five times the instantaneous signal bandwidth. As new frequency bands are proposed for 5G and beyond, the instantaneous bandwidth is expected to exceed 500 MHz, so it becomes a challenge for programmable logic (PL) to support a 2 GSPS or higher sample rate. AMD recently announced the AI Engine for compute-intensive functions such as beamforming, FFT, and DPD. See AI Engines and Their Applications (WP506) for more information. Although the AI Engine is programmed in C++, there is a learning curve for first-time users. For ease of use, a set of MATLAB® scripts is provided to generate the AI Engine design automatically based on the Volterra series terms selected by the user. The design is ready for hardware validation, with the PL kernels, C drivers, and v++ scripts also generated automatically. From the user's point of view, the AI Engine becomes a customizable hardware accelerator that can be created in seconds without writing a single line of code.
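The sample-rate claim above is simple arithmetic worth making explicit: with a 3x to 5x expansion factor, a 500 MHz instantaneous bandwidth already requires a 1.5 to 2.5 GSPS DPD sample rate. A one-line check:

```python
# Required DPD sample rate in GSPS for a given instantaneous
# bandwidth (MHz) and expansion factor (3x-5x per the text above).
def dpd_sample_rate_gsps(ibw_mhz, factor):
    return ibw_mhz * factor / 1000.0

rates = [dpd_sample_rate_gsps(500, f) for f in (3, 4, 5)]
# rates == [1.5, 2.0, 2.5], i.e. at or beyond the 2 GSPS PL challenge point
```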