Consider the signal waveform shown in Figure 3, where input samples are evenly spaced at an interval T. One desired output sample is located between x_{n-3} and x_{n-2} with a timing offset *u*, which can be any real number between 0 and 1. For all
finite impulse response (FIR) filters, the output can be written as a linear combination
of the input samples as:

y_n = Σ_{k=0}^{L-1} c_k · x_{n-k}

where {c_k} is the set of coefficients. In the case of ARF, to account for dynamic timing offsets of output samples, c_k becomes a function of the timing offset u:

c_k(u) = f(k + u)

where *f*(·) generalizes the discrete coefficients of a low-pass filter to a continuous function in the domain of real numbers. One method of implementing the continuous function *f*(·) is to prestore an array {*F_k*} in a memory, where

F_k = f(k/P),  k = 0, 1, ..., L·P

and P is the number of prestored phases per input interval. Then, approximate *c_k(u)* by a linear interpolation of the two nearest prestored values:

c_k(u) ≈ F_{[(k+u)P]} + ((k+u)P - [(k+u)P]) · G_{[(k+u)P]}

where [*x*] is the floor function that gives the largest integer less than or equal to *x*, and

G_k = F_{k+1} - F_k

{*F_k*} and {*G_k*} can be implemented by two look-up tables addressed by [(k+u)P]. The example in the following figure has L = 6 and P = 4.
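The table construction and coefficient interpolation above can be sketched in a few lines of Python. This is a minimal illustration, not the document's implementation: the prototype filter *f*(·) below is a made-up Hann-windowed sinc, and all names are illustrative. Only the structure — {F_k} and {G_k} tables addressed by [(k+u)P], plus one multiply per coefficient — follows the text.

```python
import math

L, P = 6, 4  # taps and prestored phases per tap, matching the L = 6, P = 4 example

def f(t):
    """Hypothetical continuous prototype low-pass filter: Hann-windowed sinc on [0, L]."""
    x = t - L / 2
    sinc = 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)
    window = 0.5 - 0.5 * math.cos(2 * math.pi * t / L) if 0 <= t <= L else 0.0
    return sinc * window

# Prestored tables: F[m] = f(m / P), with one extra entry so the
# difference table G[m] = F[m + 1] - F[m] covers the full range.
F = [f(m / P) for m in range(L * P + 1)]
G = [F[m + 1] - F[m] for m in range(L * P)]

def c(k, u):
    """Approximate c_k(u) = f(k + u) by linear interpolation of the two nearest entries."""
    a = (k + u) * P
    m = math.floor(a)            # look-up-table address [(k + u) * P]
    return F[m] + (a - m) * G[m]  # F table value plus fractional step along G
```

At u = 0 the address lands exactly on a table entry, so the approximation is exact; between entries the error is that of linear interpolation at step 1/P.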

For an ARF with a maximum interpolation ratio K, at most ceil(K) new output samples can be computed from one input. The following figure shows the case when K = 2 and either one or two outputs are computed from every new input sample.
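The one-or-two-outputs-per-input behavior can be sketched with a simple counter. This is an illustrative formulation, not from the document: it assumes a constant resampling ratio R (at most the maximum interpolation ratio K), with output sample k due at time k/R in units of the input interval T, and each input sample n owning the outputs that fall before time n + 1.

```python
def outputs_per_input(num_inputs, R):
    """Return how many output samples are computed from each new input sample,
    for a hypothetical constant resampling ratio R."""
    counts = []
    k = 0  # index of the next output sample; it is due at time k / R
    for n in range(num_inputs):
        produced = 0
        while k < (n + 1) * R:  # outputs due before the next input arrives
            produced += 1
            k += 1
        counts.append(produced)
    return counts
```

With R = K = 2 every input yields exactly two outputs; with a fractional ratio such as R = 1.75 the per-input count alternates between one and two, matching the "either one or two outputs" behavior described above, and never exceeds ceil(R).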

The complexity of an ARF is dominated by the computations of Figure 1 and Figure 2. The former needs 2L real-to-real multiplications for each output sample, and the latter needs an additional L real-to-real multiplications. Because the output sample rate is K times the input sample rate, the total number of real-to-real multiplications per second is 3·L·K·(input sample rate).

This gives an estimate for the minimum number of AI Engines required by the ARF.
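As a rough worked example of the estimate above — the per-engine multiplication throughput below is a hypothetical placeholder, not a figure from the document, and should be replaced by the real capability of the target AI Engine:

```python
import math

def arf_mults_per_second(L, K, input_rate_hz):
    """Total real-to-real multiplications per second for an ARF,
    per the 3*L*K*(input sample rate) count derived above."""
    return 3 * L * K * input_rate_hz

def min_engines(L, K, input_rate_hz, engine_mults_per_second):
    """Lower bound on the number of AI Engines needed to cover the multiply load.
    engine_mults_per_second is an assumed per-engine throughput."""
    return math.ceil(arf_mults_per_second(L, K, input_rate_hz) / engine_mults_per_second)
```

For example, with L = 6, K = 2, and a 100 MHz input, the filter needs 3.6 G real-to-real multiplications per second; dividing by an assumed per-engine rate and rounding up gives the minimum engine count.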