Traditionally in a wireless communication system, the analog signal from the RF section was demodulated into its I and Q components using analog techniques. In this age of software-defined radios, however, an analog-to-digital converter (ADC) usually digitizes the downconverted intermediate frequency (IF) signal and feeds it into the baseband section for demodulation and decoding. The baseband receiver can be tested independently of the RF section by generating an appropriate IF signal with a signal generator whose output frequency is set to the desired IF.
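As a point of reference for what the baseband section does with that digitized IF, the sketch below shows one common software approach to digital downconversion: mix the ADC samples with quadrature carriers at the IF, low-pass filter, and decimate to obtain complex baseband I/Q samples. It uses Python with NumPy/SciPy, and the sample rate, IF, and filter parameters are illustrative assumptions rather than values from any particular design.

    # Sketch of digital downconversion of a sampled IF signal to baseband I/Q.
    # The sample rate, IF, and filter settings below are illustrative only.
    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 122.88e6           # assumed ADC sample rate
    f_if = 30.72e6          # assumed intermediate frequency
    n = np.arange(100000)
    adc_samples = np.cos(2 * np.pi * f_if / fs * n)   # stand-in for captured ADC data

    # Mix with quadrature local oscillators at the IF
    mixed_i = adc_samples * np.cos(2 * np.pi * f_if / fs * n)
    mixed_q = adc_samples * -np.sin(2 * np.pi * f_if / fs * n)

    # Low-pass filter to remove the 2*f_if image, then decimate by 4
    lpf = firwin(numtaps=129, cutoff=10e6, fs=fs)
    i_bb = lfilter(lpf, 1.0, mixed_i)[::4]
    q_bb = lfilter(lpf, 1.0, mixed_q)[::4]
    iq = i_bb + 1j * q_bb   # complex baseband samples ready for demodulation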

Measuring the output of the ADC poses a challenge because the output is now digital, and the spectrum analyzers typically used for receiver testing make measurements on analog signals. One solution is to capture the ADC's digital output with a logic analyzer and analyze it directly. The difficult part of this approach is processing the captured data into a meaningful result, since most logic analyzer applications aren’t focused on generating RF metrics.
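As a rough illustration of what that post-processing involves, the sketch below converts a block of captured ADC codes (exported from a logic analyzer as integer sample values) into a power spectrum, from which metrics such as SNR or spurious levels could be read. The 14-bit width, sample rate, and window choice are assumptions for the example, not properties of any particular ADC or capture tool.

    # Sketch: turn captured ADC codes into a power spectrum in dBFS.
    # The 14-bit width, sample rate, and Hann window are assumptions.
    import numpy as np

    def adc_codes_to_spectrum(codes, n_bits=14, fs=122.88e6):
        """Return frequency bins (Hz) and power spectrum (dBFS) of ADC samples."""
        full_scale = 2 ** (n_bits - 1)
        x = np.asarray(codes, dtype=float) / full_scale        # normalize to +/-1.0
        window = np.hanning(len(x))
        spectrum = np.fft.rfft(x * window)
        # Scale so a full-scale sine reads ~0 dBFS; 1e-12 avoids log of zero
        power_dbfs = 20 * np.log10(2 * np.abs(spectrum) / np.sum(window) + 1e-12)
        freqs = np.fft.rfftfreq(len(x), d=1 / fs)
        return freqs, power_dbfs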

However, specialized software such as the Agilent 89601A Vector Signal Analysis software can help. Although it’s commonly run in spectrum analyzers for demodulating various modulation formats, such software can be run in a logic analyzer, too. The software offers a unique way to analyze the ADC performance because it can make traditional RF measurements directly on digital data. It also lets designers quantify the ADC contribution to the overall system performance and compare results to earlier RF measurements using the same measurement algorithms.

With Long-Term Evolution (LTE), the output of the ADC may be converted to a high-speed serial interface such as Common Public Radio Interface (CPRI) in the Evolved Node B (eNB, or base station) or DigRF in the user equipment. In some cases, the topology of the receiver may allow access to the digital data only via one of these industry-standard buses, complicating the data analysis process.

Industry-standard solutions that accept these high-speed serial data streams are available, though. For example, Agilent offers a DigRF analyzer that can either analyze the digital data (as a logic analyzer would) or pass the data to the Vector Signal Analysis software, enabling vector measurement of the demodulated data.

Baseband Demodulation

The digitized signals from the ADC are transferred to the baseband section where FPGAs or ASICs perform signal demodulation. Up to this point, the measurements have been relatively straightforward because test equipment has been used for both the generation and analysis of the LTE signals. Now the receiver in either the eNB or user equipment (UE) must demodulate the signal and indicate the result.

In testing the baseband section of the receiver, the physical delivery of test signals to the device under test (DUT) is a concern. Depending on where the receiver is in the development cycle, the test signal could be injected into the receiver as an RF, IF, analog IQ, or digital IQ signal in the baseband section. Most signal generators can create signals for testing each of the different sections of a receiver.

The digital outputs of signal generators are generally raw I and Q samples with highly configurable physical characteristics, including logic type, numeric format, number of bits, bit order, sample rate, and clock options. As noted above, most baseband LTE radio designs are expected to use a dedicated industry-standard digital interface such as CPRI or DigRF, so they will require specialized analysis tools. The bottom line is that the signal must be delivered to the receiver in the appropriate format.
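For raw digital I/Q (rather than CPRI or DigRF), the sketch below shows how those physical characteristics come together. The specific choices here (16-bit two's complement, I-then-Q interleaving, little-endian byte order) are arbitrary assumptions for illustration; the real format must match whatever the baseband input expects.

    # Sketch: pack floating-point I/Q samples into one possible digital format
    # (16-bit two's complement, I/Q interleaved, little-endian).
    import numpy as np

    def pack_iq(i_samples, q_samples, n_bits=16):
        full_scale = 2 ** (n_bits - 1) - 1
        i_fixed = np.clip(np.round(np.asarray(i_samples) * full_scale),
                          -full_scale - 1, full_scale).astype(np.int16)
        q_fixed = np.clip(np.round(np.asarray(q_samples) * full_scale),
                          -full_scale - 1, full_scale).astype(np.int16)
        interleaved = np.empty(2 * len(i_fixed), dtype=np.int16)
        interleaved[0::2] = i_fixed          # I sample first ...
        interleaved[1::2] = q_fixed          # ... then its Q sample
        return interleaved.astype('<i2').tobytes()   # little-endian byte stream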

Signal generators can provide timing signals or accept trigger signals to facilitate synchronization with either the UE or eNB receivers. To further aid synchronization, information about the LTE signal being generated can be preprogrammed into the DUT. For instance, when the UE is tested, it can be forced via preprogramming to use the physical-layer cell ID group and sector generated by the signal generator. It may be helpful to demodulate the test signal with a vector signal analyzer to verify that the test signal configuration matches the configuration expected by the receiver under test.
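For reference, the physical-layer cell identity that the DUT and signal generator must agree on is derived from the cell identity group (0 to 167) and the sector identity (0 to 2), as in this small sketch:

    # In LTE, the physical-layer cell identity is 3 * group + sector, where the
    # cell-identity group is 0..167 and the sector identity is 0..2. A
    # preprogrammed DUT and the signal generator must agree on these values.
    def physical_cell_id(group, sector):
        assert 0 <= group <= 167 and 0 <= sector <= 2
        return 3 * group + sector

    # Example: group 5, sector 1 -> physical-layer cell ID 16
    assert physical_cell_id(5, 1) == 16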

The demodulation and decoding algorithms in the baseband section can be verified with physical-layer coded LTE signals, which can be easily configured using a signal generator with an application such as Agilent Signal Studio for 3GPP LTE. The resource blocks configured at the channel and band edges are of particular interest because band and channel filters are likely to distort and attenuate part of the signal.
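One simple way to pick those edge resource blocks is from the number of resource blocks defined for each LTE channel bandwidth, as in the sketch below; the helper function and its defaults are illustrative.

    # LTE channel bandwidths map to these transmission-bandwidth RB counts.
    N_RB = {1.4e6: 6, 3e6: 15, 5e6: 25, 10e6: 50, 15e6: 75, 20e6: 100}

    def edge_resource_blocks(channel_bw_hz, n_edge=1):
        """Return the lowest and highest RB indices to target in edge tests."""
        n_rb = N_RB[channel_bw_hz]
        low = list(range(n_edge))                   # e.g. RB 0
        high = list(range(n_rb - n_edge, n_rb))     # e.g. RB 99 for 20 MHz
        return low, high

    # Example: a 10-MHz channel has 50 RBs, so RB 0 and RB 49 sit at the edges
    assert edge_resource_blocks(10e6) == ([0], [49])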

Although test signals are easily measured and interpreted using a vector signal analyzer with its multiple displays, the interface to a real LTE receiver is likely to be a simple terminal interface with proprietary commands and results. A useful feature of a receiver, then, is the ability to write the demodulated data from each channel to a file for post-analysis to ensure that the received bits match the transmitted bits.
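A minimal post-analysis script of that kind might look like the sketch below, assuming both the transmitted reference bits and the receiver's demodulated bits are written as whitespace-separated values; the file names and format are hypothetical.

    # Sketch: compare the receiver's demodulated bit file against the
    # transmitted reference bits. File names and the whitespace-separated
    # format are hypothetical.
    def compare_bit_files(reference_path, received_path):
        with open(reference_path) as f_ref, open(received_path) as f_rx:
            ref_bits = f_ref.read().split()
            rx_bits = f_rx.read().split()
        n = min(len(ref_bits), len(rx_bits))
        errors = sum(1 for a, b in zip(ref_bits[:n], rx_bits[:n]) if a != b)
        return errors, n

    errors, total = compare_bit_files("tx_bits.txt", "rx_bits.txt")
    print(f"{errors} bit errors in {total} compared bits")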

Basic demodulation is verified at the physical-layer level. Once this step is complete, the transport channel decoding algorithms can be verified. The specifications define fixed reference channels (FRCs) that are used as reference configurations for defining receiver requirements. These signals are a good starting point for initial verification of the transport channel decoding algorithms. As an example, Figure 1 illustrates an uplink FRC for testing the eNB using the Signal Studio software.

BER Testing

After verifying that the receiver is correctly demodulating and decoding the signal, the designer can perform bit error ratio (BER) and block error ratio (BLER) measurements. Figure 2 shows a simulation of UE receiver demodulation performance for a 64-state quadrature amplitude modulation (64QAM) downlink shared channel (DL-SCH) as a function of Eb/No, which is the energy per bit divided by the noise power spectral density. This provides a general idea of the relationships between uncoded BER, transport channel coded BER, and BLER.
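For orientation alongside curves like those in Figure 2, the standard textbook approximation for the uncoded BER of Gray-coded M-QAM in AWGN can be evaluated directly, as sketched below. This is not the simulation behind Figure 2, just a sanity-check formula.

    # Textbook approximation for the uncoded BER of Gray-coded M-QAM in AWGN.
    # Useful as a sanity check against simulated curves; it is not the
    # simulation behind Figure 2.
    import math

    def qfunc(x):
        return 0.5 * math.erfc(x / math.sqrt(2))

    def uncoded_qam_ber(ebno_db, M=64):
        k = math.log2(M)                       # bits per symbol (6 for 64QAM)
        ebno = 10 ** (ebno_db / 10)
        return (4 / k) * (1 - 1 / math.sqrt(M)) * qfunc(math.sqrt(3 * k * ebno / (M - 1)))

    for ebno_db in (10, 14, 18):
        print(f"Eb/No = {ebno_db} dB -> uncoded 64QAM BER ~ {uncoded_qam_ber(ebno_db):.2e}")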

The uncoded BER is a measure of BER at the physical layer before transport channel decoding, while the coded BER is a measure of BER after the transport channel decoding. Uncoded BER is a more sensitive measure of receiver performance than BLER or coded BER and is useful during early phases of receiver characterization. The receiver requirements use BLER as the metric for performance, which is expressed in the form of a throughput relative to the maximum throughput of the FRC. The transport channel coding with forward error correction clearly improves BER performance.
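Ignoring HARQ retransmissions, one simple reading of that throughput metric is shown below; the function name is illustrative.

    # One simple reading of the BLER-to-throughput relationship, ignoring HARQ
    # retransmissions: blocks that pass deliver their payload, failed blocks
    # deliver nothing.
    def relative_throughput(bler):
        return 1.0 - bler

    # Example: a BLER of 0.05 corresponds to 95% of the FRC maximum throughput
    assert abs(relative_throughput(0.05) - 0.95) < 1e-12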

BER measurement requires configuring a pseudorandom sequence for the payload data in the signal generator and making that sequence known to the receiver, which can then correlate to it and calculate BER. Some signal generators can calculate the BER if the demodulated and decoded signal is routed back to the signal generator as a transistor-transistor logic (TTL) or CMOS signal.
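If the loopback path to the signal generator isn't available, the same calculation can be done offline. The sketch below generates a PN9 sequence (x^9 + x^5 + 1, a common test-equipment choice), aligns the received bits against it, and computes BER; the polynomial, seed, and alignment approach are assumptions here and must match the actual generator configuration.

    # Sketch: generate a PN9 sequence (x^9 + x^5 + 1), align the received bits
    # to it, and compute BER. The polynomial and seed are assumptions and must
    # match the signal generator's actual configuration.
    def pn9_bits(length, state=0x1FF):
        bits = []
        for _ in range(length):
            bits.append(state & 1)
            feedback = (state ^ (state >> 4)) & 1   # taps for x^9 + x^5 + 1
            state = (state >> 1) | (feedback << 8)
        return bits

    def ber(reference, received):
        """Try every alignment of received against a longer reference, count errors."""
        best = min(
            sum(r != x for r, x in zip(reference[offset:], received))
            for offset in range(len(reference) - len(received) + 1)
        )
        return best / len(received)

    # Example: the receiver's decoded payload is compared against a long PN9 run
    reference = pn9_bits(10000)
    received = reference[123:123 + 2000]            # stand-in for decoded bits
    print(f"BER = {ber(reference, received):.3e}")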

Unlike the Universal Mobile Telecommunications System (UMTS) and earlier systems, LTE has no requirements based on BER, nor does it support the loopback mechanisms defined for measuring BER. All receiver conformance tests for both UE and eNB are based on BLER, and BER testing remains an R&D tool.

Summary

Open-loop testing using a vector signal generator and a vector signal analyzer is a convenient way to verify the ADC and baseband demodulation of an LTE baseband receiver. Complete testing of the UE and eNB BLER requirements, which are based on Hybrid Automatic Repeat Request (HARQ) retransmission, also will require closed-loop receiver testing.

Note: This article is derived from Chapter 6 of Agilent’s recent textbook, LTE and the Evolution to 4G Wireless.