While the market ultimately will decide which standards constitute the so-called “fourth generation” or 4G, presently IEEE’s WiMAX and 3GPP’s Long Term Evolution (LTE) are under scrutiny and development. Both are vying for the honor of being the mobile, broadband communications vehicle that will imbue handheld devices with an impressive range of features and services, such as high-speed Internet access, GPS, and telephony.

WiMAX comes to the market from the PC world, championed by Intel and a host of companies most associated with PCs and Wi-Fi. LTE, in contrast, evolved from the 2.5G and 3G efforts of the cell-phone carrier crowd. Anyone who has taken a serious look at the physical layer (PHY) of these candidate standards cannot help but come away feeling that they are converging toward a common-denominator set of technical features. In fact, it will probably be easier to build a common set of chipset solutions that encompass both WiMAX and LTE than one that encompasses, say, Wi-Fi and WiMAX.

Most experts say WiMAX is the more developed of the two standards. The 802.16d and 802.16e standards have been accepted, and basestation and client-system ICs, reference designs, and complete systems have been built and piloted. The missing ingredient, infrastructure, appeared to be the sticking point that hampered WiMAX’s progress in 2007 and 2008 in North America, but recent developments have brought renewed optimism.

Meanwhile, LTE has been chugging along. The latest standard version offers enough specifics to fuel basestation and handheld-device design. Along with those specifics, we’ve seen a marriage of instruments and specialized software to enable developers to create reference designs that meet LTE’s exacting requirements. For example, through a collaborative effort with LitePoint Corp., Tektronix now offers real-time spectrum-analysis tools coupled with LitePoint control and analysis software that together constitute the first purpose-built LTE development test solutions.

No Time To Be Flying Blind

LTE is every bit as complex a standard as WiMAX. To achieve high data rates without exceeding 20-MHz bandwidths, both standards use orthogonal frequency-division multiplexing (OFDM) with a large number of subcarriers to carry data from basestations to client systems. Frequency-stability requirements become more stringent as data rates increase and modulation moves from binary phase-shift keying (BPSK) toward 64-state quadrature amplitude modulation (64QAM). Both standards also impose strict limits on spectral power density, which are necessary to avoid interference and to allow multiple clients to share a basestation without excessive error rates.

When dealing with orthogonal subcarriers and a variety of modulation schemes, the number of individual tests can become overwhelming. Error-vector magnitude (EVM) measurement provides one substantive measure of modulation quality, and it is important to be able to apply EVM analysis to specific channels, for instance. The key is to gather sufficient insight into how well a design’s signals conform to a 4G standard, and to do so efficiently enough that comprehensive development testing does not bog down in test overhead.
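
To make the EVM idea concrete, here is a minimal sketch of how an RMS EVM figure can be computed from demodulated symbols and their ideal reference points. It is illustrative only: the QPSK constellation, symbol count, and normalization are our assumptions, not the RSALTE implementation.

```python
# Minimal EVM sketch (illustrative only -- not the RSALTE implementation).
# Assumes we already have demodulated symbols and their ideal reference points.
import numpy as np

def evm_percent(measured, reference):
    """RMS error-vector magnitude, as a percentage of the RMS reference power."""
    error = measured - reference                       # error vectors
    evm_rms = np.sqrt(np.mean(np.abs(error) ** 2) /
                      np.mean(np.abs(reference) ** 2))
    return 100.0 * evm_rms

# Hypothetical example: QPSK reference symbols with additive noise.
rng = np.random.default_rng(0)
ref = (rng.choice([-1, 1], 256) + 1j * rng.choice([-1, 1], 256)) / np.sqrt(2)
meas = ref + 0.05 * (rng.standard_normal(256) + 1j * rng.standard_normal(256))
print(f"EVM = {evm_percent(meas, ref):.2f}%")
```

Applying the same calculation per subcarrier or per channel is what lets a designer localize where modulation quality is degrading.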

During the development phase, it is critical that designers be able to identify signal failures and understand their root causes, such as frequency instability, I/Q mismatch, and the like. Thus, designers need tools that help them identify and isolate signal problems.

In terms of measurement bandwidth, if the channel being analyzed is, say, 20 MHz, the testing tools must have enough bandwidth to examine third- and fifth-order distortion products. If not, what looks like a reasonably “clean” signal may, in fact, have distortion products that pose interference problems. For the aforementioned 20-MHz channel, designers need up to 100 MHz of real-time capture plus sufficient dynamic range to characterize those fifth-order terms.
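
The 100-MHz figure follows from a common rule of thumb: nth-order nonlinear distortion products spread over roughly n times the original signal bandwidth. A quick back-of-the-envelope check, assuming that rule:

```python
# Rule-of-thumb capture bandwidth needed to see nth-order distortion products.
def required_capture_bw(channel_bw_mhz, order):
    # nth-order spectral regrowth spans roughly n x the channel bandwidth
    return order * channel_bw_mhz

for order in (3, 5):
    print(f"20-MHz channel, order {order}: "
          f"{required_capture_bw(20, order)} MHz of capture bandwidth")
# -> 60 MHz for third-order, 100 MHz for fifth-order products
```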

The dynamic generation of RF waveforms through digital signal processing (DSP) and the integration of digital and RF circuits, often on the same IC, create issues not seen in traditional RF transceiver designs. Simply passing conformance testing does not ensure a device will work properly. System behavior needs to be carefully and thoroughly observed, since software is continually changing system parameters. These software-controlled changes commonly cause glitches, intermittent interference, pulse aberrations, digital-to-RF coupling, software-dependent phase errors, and the like.

LTE system designers must fully analyze and characterize their systems. As parameters change over time, these designers must perform frequency-selective triggering to pinpoint the instant a transient event occurs. In addition, they must be able to perform multiple-domain time-correlated analysis to determine the specific causes of problems. Furthermore, by capturing the entire event seamlessly into memory, designers can perform subsequent in-depth analysis. One thing is certain, though. When transients occur, so does spectral leakage, and the result is frequency “splatter.”


LTE Fundamentals

WiMAX and LTE share some common fundamentals, but there are many differences as well. It is helpful to examine the LTE fundamentals to better understand what needs to be tested and how best to do it. At its core, LTE is a standard for high-speed data and media transport plus high-capacity voice support, encompassing multimedia unicast and broadcast services.

The PHY is meant to be an efficient means to convey data and control information between a basestation and users’ mobile systems. It is intended to support scalable bandwidths of 1.25, 2.5, 5.0, 10.0, 15.0, and 20.0 MHz, with peak data rates that scale with bandwidth. So for 20-MHz channels, the basestation peak rate is 100 Mbits/s, and the mobile-device peak rate is 50 Mbits/s.
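
If peak rate is taken to scale linearly with channel bandwidth, as the text implies, a rough estimate for the other bandwidths looks like the sketch below. The linear-scaling assumption is ours; actual rates depend on modulation, MIMO configuration, and protocol overhead.

```python
# Rough peak-rate estimate assuming linear scaling with bandwidth
# (an illustrative assumption; real rates depend on modulation, MIMO, overhead).
DL_PEAK_AT_20MHZ = 100.0   # Mbits/s, basestation peak from the text
UL_PEAK_AT_20MHZ = 50.0    # Mbits/s, mobile-device peak from the text

def peak_rate(bw_mhz, peak_at_20mhz):
    return peak_at_20mhz * bw_mhz / 20.0

for bw in (1.25, 2.5, 5.0, 10.0, 15.0, 20.0):
    print(f"{bw:5.2f} MHz: DL ~{peak_rate(bw, DL_PEAK_AT_20MHZ):6.1f} Mbits/s, "
          f"UL ~{peak_rate(bw, UL_PEAK_AT_20MHZ):5.1f} Mbits/s")
```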

Like Wi-Fi and WiMAX, LTE has adopted multiple-input multiple-output (MIMO) technology to increase data rate or increase range. As such, basestation LTE systems may employ up to four transmit antennas and two receive antennas in configurations of 4x2, 2x2, 1x2, and 1x1. Handheld mobile devices may have one transmit antenna and up to two receive antennas in a 1x2 or 1x1 configuration. With regard to range, a basestation is intended to provide full performance at a distance up to 5 km, with degraded performance out to 30 km and the possibility of usable operation out to 100 km.

LTE uses different downlink (DL) and uplink (UL) technologies. For the DL (i.e., transmissions from basestation to mobile system), the standard calls for orthogonal frequency-division multiple access (OFDMA). For the UL (i.e., transmissions from mobile unit to basestation), it specifies single-carrier frequency-division multiple access (SC-FDMA).

OFDM, OFDMA, and SC-FDMA

For the LTE DL, OFDM breaks the channel’s bandwidth into separate subcarriers. Each subcarrier transmits its assigned data in parallel with the data transmitted by the other subcarriers, producing a parallel data stream within the channel. The channel can therefore achieve high aggregate data rates without increasing the symbol rate on any individual subcarrier, which in turn helps avoid escalating inter-symbol interference (ISI).

The subcarriers are modulated using QPSK, 16QAM, or 64QAM. Each OFDM symbol is the combination of the individual signals on every subcarrier in the channel at a given instant, and each symbol is preceded by a cyclic prefix whose purpose is to mitigate or eliminate ISI. The subcarriers are packed very tightly for high spectral efficiency, but because their frequencies are chosen so that each subcarrier’s spectrum has nulls at the center frequencies of its neighbors, inter-carrier interference (ICI) is virtually eliminated (Fig. 1).
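
For readers who want to see the mechanics, here is a minimal sketch of OFDM symbol construction: modulation symbols are assigned to subcarriers, an inverse FFT produces the time-domain symbol, and a cyclic prefix is prepended. The FFT size, occupied-subcarrier count, and prefix length are arbitrary illustrative values, not LTE parameters.

```python
# Minimal OFDM symbol construction sketch (sizes are illustrative, not LTE-specific).
import numpy as np

N_FFT = 256          # IFFT size
N_USED = 180         # occupied subcarriers (rest left empty as guard band)
CP_LEN = 18          # cyclic-prefix length in samples

rng = np.random.default_rng(1)

# Random QPSK symbols, one per occupied subcarrier
qpsk = (rng.choice([-1, 1], N_USED) + 1j * rng.choice([-1, 1], N_USED)) / np.sqrt(2)

# Map onto the FFT grid, centered around DC, leaving guard subcarriers at the edges
grid = np.zeros(N_FFT, dtype=complex)
idx = np.arange(-N_USED // 2, N_USED // 2) % N_FFT   # wrap negative bins
grid[idx] = qpsk

# IFFT gives the time-domain OFDM symbol; prepend the cyclic prefix
time_symbol = np.fft.ifft(grid) * np.sqrt(N_FFT)
symbol_with_cp = np.concatenate([time_symbol[-CP_LEN:], time_symbol])
print(len(symbol_with_cp))   # 256 + 18 samples
```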

In OFDMA, unlike in an 802.11a, b, g, or n network, channel users are given a specific number of the available subcarriers for a predetermined amount of time. In contrast with CSMA/CA, multiple users can share the same channel via predetermined subcarrier allocations (called physical resource blocks, or PRBs). The LTE basestation performs the PRB allocation based on frame structures specific to frequency-division duplexing (FDD) and time-division duplexing (TDD). Depending upon the channel bandwidth (1.25 through 20 MHz), the number of PRBs available varies from six to 100 (i.e., six, 12, 25, 50, 75, and 100).
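
Pairing the bandwidth list with the PRB counts given above yields a simple lookup; the exact pairing is our reading of the two lists, not something spelled out in the text.

```python
# Channel bandwidth (MHz) -> number of physical resource blocks (PRBs),
# pairing the two lists given in the text (the pairing itself is our assumption).
PRBS_PER_BANDWIDTH = {
    1.25: 6,
    2.5: 12,
    5.0: 25,
    10.0: 50,
    15.0: 75,
    20.0: 100,
}

def prb_count(bw_mhz):
    return PRBS_PER_BANDWIDTH[bw_mhz]

print(prb_count(20.0))   # 100 PRBs available for the basestation to allocate
```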

In addition to its positive attributes, OFDM has some disadvantages, particularly with regard to handheld, battery-powered devices. It is susceptible to carrier-frequency errors from local-oscillator (LO) offset and/or Doppler shifts while in motion. It also has a high peak-to-average power ratio (PAPR).

With regard to frequency errors, ideally the carrier signal and the receiver LO are at the same frequency. Both will undergo some drift, though, so active processing is required to keep them in sync. With Doppler shift, the mobile’s speed and path can cause continually changing offsets between the carrier and the LO. With OFDMA, then, signal frequencies must be diligently and continuously monitored to avoid prolonged frequency mismatches and the excessive ICI, and increased packet errors, that come with them.
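
To get a feel for the magnitudes involved, the classic Doppler relation f_d = (v/c) x f_c can be compared against the subcarrier spacing. The 2-GHz carrier and the 15-kHz subcarrier spacing below are typical LTE-era assumptions, not values taken from this article.

```python
# Back-of-the-envelope Doppler shift vs. subcarrier spacing.
# The 2-GHz carrier and 15-kHz subcarrier spacing are typical assumptions,
# not values taken from this article.
C = 3.0e8                 # speed of light, m/s
CARRIER_HZ = 2.0e9        # assumed carrier frequency
SUBCARRIER_SPACING_HZ = 15e3

def doppler_shift_hz(speed_kmh):
    v = speed_kmh / 3.6   # km/h -> m/s
    return (v / C) * CARRIER_HZ

for speed in (3, 120, 350):   # pedestrian, highway, high-speed rail
    fd = doppler_shift_hz(speed)
    print(f"{speed:3d} km/h: Doppler ~{fd:6.1f} Hz "
          f"({100 * fd / SUBCARRIER_SPACING_HZ:.2f}% of subcarrier spacing)")
```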

When it comes to PAPR, the instantaneous transmitted RF power can vary dramatically within a single OFDM symbol. Subcarrier voltages adding in phase at some points within a symbol produce instantaneous peak power that is considerably higher than the average power. Hence, one ends up with a high PAPR, which directly affects the dynamic-range requirements for analog-to-digital and digital-to-analog converters.
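
A quick numerical illustration of the effect: summing many independently modulated subcarriers produces occasional peaks well above the average power. The symbol size and constellation here are arbitrary.

```python
# PAPR of a randomly loaded OFDM symbol (sizes are arbitrary/illustrative).
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband waveform, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(2)
N = 1024
qam = (rng.choice([-3, -1, 1, 3], N) + 1j * rng.choice([-3, -1, 1, 3], N))
ofdm_symbol = np.fft.ifft(qam)
print(f"OFDM symbol PAPR: {papr_db(ofdm_symbol):.1f} dB")   # typically ~10 dB or more
```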


A high PAPR also lowers the efficiency of a transmitter’s power amplifier. OFDM is not a constant-envelope modulation scheme, in which information is conveyed by varying frequency or phase while amplitude remains constant. While the amplitude and phase of each subcarrier are constant within a symbol, over the duration of that symbol there may be many large peaks. That means the power amplifier has to accommodate those peaks without saturating (i.e., clipping), which in turn means using a larger amplifier and accepting the resulting decrease in efficiency. For this reason in particular, LTE uses a different method for the UL.

Power consumption is a key design consideration for users’ handheld devices. Thus, SC-FDMA is used instead of OFDMA for UL. The basic transceiver architecture is practically identical for OFDMA and SC-FDMA, and both methodologies provide about the same degree of multipath adaptation. But because the UL waveform is essentially single-carrier, the PAPR is lower.
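
The architectural difference is small: SC-FDMA inserts a DFT “spreading” stage ahead of the subcarrier mapping, and that is what brings the PAPR down. The sketch below compares the two under arbitrary, illustrative sizes (the papr_db helper is repeated so the snippet stands alone).

```python
# OFDMA vs. SC-FDMA (DFT-spread OFDM) PAPR comparison -- illustrative sizes only.
import numpy as np

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(3)
M, N = 120, 512                    # occupied subcarriers, IFFT size
qpsk = (rng.choice([-1, 1], M) + 1j * rng.choice([-1, 1], M)) / np.sqrt(2)

# OFDMA: map the data symbols directly onto M contiguous subcarriers
grid = np.zeros(N, dtype=complex)
grid[:M] = qpsk
ofdma = np.fft.ifft(grid)

# SC-FDMA: DFT-spread the data first, then map and take the IFFT
grid_sc = np.zeros(N, dtype=complex)
grid_sc[:M] = np.fft.fft(qpsk) / np.sqrt(M)
scfdma = np.fft.ifft(grid_sc)

print(f"OFDMA   PAPR: {papr_db(ofdma):.1f} dB")
print(f"SC-FDMA PAPR: {papr_db(scfdma):.1f} dB")   # noticeably lower
```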

A Common FDD Frame Structure

In both the DL and the UL, LTE uses the same frame structure (for FDD). Transmissions are segmented into frames, each of which is 10 ms in duration (Fig. 2). Each frame, in turn, is made up of 20 slots of 0.5 ms each, and subframes (1.0 ms in duration) contain two slots.
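
In code form, the frame arithmetic is trivial; the sketch below simply encodes the numbers given above (10-ms frames, 1.0-ms subframes, 0.5-ms slots).

```python
# FDD frame timing arithmetic from the numbers in the text.
FRAME_MS = 10.0
SUBFRAME_MS = 1.0
SLOT_MS = 0.5

SLOTS_PER_FRAME = int(FRAME_MS / SLOT_MS)          # 20
SLOTS_PER_SUBFRAME = int(SUBFRAME_MS / SLOT_MS)    # 2

def slot_start_time_ms(frame_index, slot_index):
    """Absolute start time of a given slot (slot_index 0..19 within the frame)."""
    return frame_index * FRAME_MS + slot_index * SLOT_MS

print(SLOTS_PER_FRAME, SLOTS_PER_SUBFRAME)               # 20 slots per frame, 2 per subframe
print(slot_start_time_ms(frame_index=3, slot_index=7))   # 33.5 ms
```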

The primary difference between WiMAX and LTE is LTE’s SC-FDMA UL. Both proposed standards accommodate channel bandwidths that vary from 1.25 to 20 MHz. Also, both have access methods that enable multiple users to share the same channel. But WiMAX uses OFDM for both the UL and the DL. Although WiMAX can be deployed for FDD, TDD, and half-duplex FDD, its most common deployment uses TDD.

In both cases, the basestation is responsible for access allocations. As was seen with LTE using FDD, mobile devices are given PRB allocations. With WiMAX, mobile devices are given time-dependent access slots.

LTE Analysis Test Approach

Beginning with Wi-Fi, then WiMAX, the industry has moved toward a platform approach to testing these increasingly complex wireless modalities. For example, LitePoint introduced its IQview platform in 2004 for testing Wi-Fi devices. IQview’s hardware comprised a vector signal generator, a vector signal analyzer, and the associated controls, while its comprehensive GUI software enabled users to perform a single capture with multiple analyses, including EVM, on Wi-Fi devices. The same approach was used in 2006 to support the development and production testing of WiMAX devices.

For LTE, Tektronix and LitePoint jointly developed a unique platform. The hardware is based on Tektronix’s real-time spectrum analysis hardware and application programming interface (API) complemented by a LitePoint GUI tailored to that hardware. This Tektronix-LitePoint solution provides the capture and analysis needed to examine and eliminate intermittent events.

Transients and Intermittent Events

First and foremost, designers need to verify signal compliance with the LTE standard, whether that means compliance with a spectral-emissions mask or EVM on a specific channel. When LTE signals fail compliance, designers need to understand why. They need tools to identify and ultimately isolate where the problem exists, whether it’s an algorithm problem or signal interference (Fig. 3). These examples have one thing in common: when these transients occur, spectral leakage occurs as well, resulting in frequency “splatter.”

Best Test Approach to LTE Analysis

Designers first need a tool to evaluate compliance for the UL and DL portions of an LTE signal. The real power of a real-time spectrum analyzer (RTSA) with the LitePoint software stems from its ability to capture and analyze intermittent events (Fig. 4). By combining digital phosphor technology (DPX), which reveals all the transients for troubleshooting, with the RTSA’s frequency mask trigger (FMT), designers can discover, trigger on, and capture those elusive events for analysis.

Figure 4 shows the same LTE signal in two different spectrum-analysis modes. The spectrum on the left is a typical vector signal analyzer (VSA) display providing spectral updates about 10 times per second. Note that the signal does not appear to show any anomalies. If the trace were updating live, the user might observe an occasional “bump” in the upper and lower adjacent-channel levels, indicating that something could be wrong with the signal, though it would be impossible to understand the nature of the problem.

The signal on the right uses the real-time DPX feature, which processes more than 48,000 spectrum updates per second. This dramatic increase in the spectral update rate, combined with DPX’s pixel memory buffer, allows the user not only to see the signal’s transient behavior but also to observe how frequently events occur, as indicated by the color grading. DPX enables designers to discover whether there are any anomalies in their waveform.

Once an anomaly is discovered, the key to any debug and diagnosis is to capture the events of interest. Tektronix has now incorporated versatile triggering into real-time spectrum analysis. Frequency mask triggering allows users to draw a free-form mask on the spectrum display and trigger the instrument when the spectrum of the signal violates this mask. The mask can be created with up to 500 points across the frequency span and up to 80 dB below the reference level.

Masks can be defined graphically or in tabular form. They can be saved independently and reused at other center frequencies. Masks can also be auto-generated using the Auto Draw button in the frequency-mask editor menu, which lets users take a known-good spectrum as a baseline for testing potentially noncompliant waveforms.
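
Conceptually, a frequency mask trigger compares each incoming spectrum against a limit line and fires when any point exceeds it. The sketch below illustrates that idea, including auto-generating a mask from a known-good spectrum; it is a conceptual model only, not the RTSA’s API or its internal implementation.

```python
# Conceptual frequency-mask check (not the RTSA API -- just the idea behind FMT).
import numpy as np

def auto_draw_mask(reference_spectrum_dbm, margin_db=6.0):
    """Build a mask from a known-good spectrum plus a safety margin."""
    return reference_spectrum_dbm + margin_db

def mask_violations(spectrum_dbm, mask_dbm):
    """Return the frequency-bin indices where the spectrum exceeds the mask."""
    return np.flatnonzero(spectrum_dbm > mask_dbm)

# Hypothetical data: a known-good spectrum and a new capture with a transient spur
freq_bins = 500
good = -80.0 + 30.0 * np.exp(-np.linspace(-4, 4, freq_bins) ** 2)   # in-band hump
mask = auto_draw_mask(good, margin_db=6.0)

capture = good.copy()
capture[420] += 20.0          # transient "splatter" in an upper adjacent bin
print(mask_violations(capture, mask))   # -> [420]: this capture would fire the trigger
```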

Figure 5 shows the steps that LTE design engineers can take to trigger on spectrum violations in the frequency domain. In addition to frequency mask triggering, the hardware offers power-level triggering, either across the entire acquisition bandwidth (a function of the span) or in a narrow bandpass centered on the center frequency (a poor man’s frequency mask trigger).

The hardware also features a pair of electrical trigger inputs for triggering on external stimuli. When the RTSA is triggered from any of these modes, it issues a trigger output pulse that can be used to cross-trigger other instruments. Synchronizing other instruments to the RF anomaly in this way greatly aids in debugging and troubleshooting the system.

Finally, LTE is a highly configurable signal format. Symbol rate, modulation type, frame, and burst configurations can change every frame. Engineers can waste a lot of time configuring the test equipment. LitePoint’s RSALTE software and Tektronix’s RTSAs are integrated to yield one-button measurements so users spend less time configuring test equipment and more time troubleshooting designs.

Circumventing Configuration Complexity

Despite the complexity of WiMAX and LTE, designers are running at full tilt in basestation and handheld mobile development mode. Powerful real-time spectrum analysis tools provide a real advantage, especially those like the Tektronix RSA6100A and RSA3000 series with unique trigger and spectrum display capabilities. The comprehensive, complementary RSALTE software tool adds to that advantage by providing one-button measurements (Fig. 6).

Using Tektronix’s RTSAs and LitePoint’s RSALTE software, designers can examine OFDM DL and SC-FDMA UL signals. A combination of Tektronix hardware features and RSALTE software controls enables users to quickly identify troublesome transients and get a quick handle on probable cause.

The RSALTE display shows power spectral density and mask, symbol constellation, and spectral flatness. For troubleshooting help, the display illustrates phase error versus time, frequency error versus time, complementary cumulative distribution function (CCDF), spectrogram, and EVM of individual subcarriers.
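
Of those troubleshooting views, the CCDF is the easiest to illustrate in a few lines: it reports the probability that the instantaneous power exceeds the average power by a given number of dB. A minimal sketch using a synthetic, OFDM-like signal (complex Gaussian noise is a reasonable stand-in) follows.

```python
# Minimal CCDF sketch: probability that instantaneous power exceeds average by x dB.
import numpy as np

def ccdf(x, db_points):
    power = np.abs(x) ** 2
    ratio_db = 10 * np.log10(power / power.mean())
    return [np.mean(ratio_db > db) for db in db_points]

# Synthetic OFDM-like signal (complex Gaussian is a reasonable stand-in)
rng = np.random.default_rng(4)
signal = (rng.standard_normal(100_000) + 1j * rng.standard_normal(100_000)) / np.sqrt(2)

for db, prob in zip((3, 6, 9), ccdf(signal, (3, 6, 9))):
    print(f"P(power > avg + {db} dB) = {prob:.4f}")
```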

Conclusion

Designers need instruments that can discover and capture transients for LTE design troubleshooting and analysis. The DPX capability available in RTSAs offers an intuitive, live, color view of signal transients changing over time in the frequency domain, and it can instantly display a fault when it occurs. FMT technology can then be set to trigger on the event in the frequency domain, capture a continuous-time record of changing RF events, and enable time-correlated analysis in all domains. Effective discovery, triggering, and capture, along with analysis that supports the latest (March 2008) LTE release, are the key considerations for LTE designers.


John Lukez is director of product management at LitePoint Corp. He is an RF and wireless measurement expert in a range of communication standards. He holds a degree in electrical and computer engineering from the Ohio State University. He can be reached at John.lukez@litepoint.com.
Li Cui is product marketing manager for Tektronix’s Real Time Spectrum Analyzer products. He has had over 10 years of experience in related RF product areas. Li holds a BSEE from Beijing University of Technologies in China. He can be reached at li.cui@tektronix.com.
