The road to successful Long-Term Evolution (LTE) deployment is fraught with pitfalls. For one thing, we have never experienced the simultaneous rollout of so many relatively new fundamental technologies. For another, hard experience has taught us that even less involved technology introductions carried hazards that were both unforeseen and costly.

With this in mind, the early deployers of LTE have developed plans meant to mitigate the expensive risks of new-technology deployment, even while pushing the envelope of possibilities. These operators and device manufacturers generally agree that there are four significant “pain points” or areas requiring special attention during the design and deployment processes.

Why does LTE require specialized concentration in these areas? First, LTE uses advanced radio technologies on the link between the network and subscriber devices. Orthogonal frequency-division multiple access (OFDMA) and single-carrier frequency-division multiple access (SC-FDMA) are conceptually proven ideas that have never been deployed on the scale projected for LTE, and they will be rolled out along with an all-new, all-IP (Internet Protocol) network. Multiple-input multiple-output (MIMO) is not new, but it is both new to cellular providers and critical to the success of LTE.

This is quite unlike previous wireless rollouts, which were usually incremental steps based on proven fundamentals. For example, we developed wideband code-division multiple access (WCDMA) while looking at CDMA2000 (CDMA-based third-generation technology) for examples of pitfalls and advantages. The pioneers of CDMA2000 had a baseline in cdmaOne, which was itself based on military spread spectrum. In contrast, the breadth and depth of changes deployed in LTE requires painstaking fundamental R&D and testing processes in many separate areas at once.


Second, the continual increase in demand for data throughput means that network operators are taking unprecedented global interest in the capabilities and characteristics of the devices being deployed on the networks. North American operators have always been very involved in device testing since they have traditionally controlled device distribution. This level of cooperation is often cited as one reason that these operators tend to produce very impressive profit growth.

In most of the world, where the traditional GSM model meant that any certified device could wind up on any network, operators are noticing that closer cooperation with device manufacturers can lead to higher revenues, lower operational costs, and lower support costs. Now some operators, as well as device makers, are adopting testing processes centered on the mobile device. The consistent profitability of operators who do this has attracted a lot of attention.

Pain Point 1: Radio-Link Issues (Besides MIMO)

Testing LTE mobile devices requires substantial accounting for a whole new scope of radio-based variables. While the general public talks about LTE in terms of high-level data applications, engineers know that the radio access layer is the foundation of everything else that happens from the subscriber’s point of view.

The big news at the radio link is MIMO, but that is a pain point unto itself. Other risky areas include band-specific issues. As just one example, some early LTE deployments will be rolled out in the 700-MHz bands that attracted so much attention when they were auctioned in the U.S. a couple of years ago.

OFDMA (which adds a few intricacies to the orthogonal frequency-division multiplexing, or OFDM, technology deployed with WiMAX services) and SC-FDMA will soon see widespread deployment in this new spectrum, which was so highly valued for its propagation qualities. All else being equal, a lower-frequency signal propagates farther, letting operators increase coverage at lower cost.
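The propagation advantage of lower frequencies can be seen directly in the free-space path-loss formula. The short Python sketch below is a textbook calculation, not tied to any particular deployment; it compares the loss at 700 MHz and 2.6 GHz over the same distance:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss (dB) for a distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare loss over 5 km at 700 MHz versus 2600 MHz.
loss_700 = fspl_db(5, 700)
loss_2600 = fspl_db(5, 2600)
print(f"700 MHz:  {loss_700:.1f} dB")
print(f"2600 MHz: {loss_2600:.1f} dB")
print(f"700-MHz advantage: {loss_2600 - loss_700:.1f} dB")
```

The roughly 11-dB advantage at 700 MHz holds at any distance, since the frequency term is independent of range.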

However, there’s no free lunch. In the early LTE deployments scheduled in the U.S., some spectrum sits very close to bands reserved for public safety use. “Splattering” signal into these bands places the operator at risk of some hefty penalties from the regulator.


And there’s more. One of the bands marked for LTE deployment sits squarely at a frequency that is one-half of a commonly used GPS signal frequency. Harmonic distortion could therefore interfere with the GPS receiver built into the mobile device, so it must be tested. At a more detailed level, the methods operators use to eliminate this interference must be tested as well (e.g., over-provisioning the physical uplink control channel, or PUCCH, to eliminate egress into the public-safety band).
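To illustrate the harmonic concern, the sketch below checks whether the second harmonic of a candidate uplink carrier lands near the GPS L1 center frequency of 1575.42 MHz. The uplink range and guard bandwidth here are illustrative assumptions, not regulatory limits:

```python
# GPS L1 center frequency; a carrier near half this value is at risk of
# radiating a second harmonic into the GPS receiver's passband.
GPS_L1_MHZ = 1575.42
GPS_GUARD_MHZ = 2.0  # protect +/- 2 MHz around L1 (illustrative assumption)

def second_harmonic_hits_gps(carrier_mhz: float) -> bool:
    return abs(2 * carrier_mhz - GPS_L1_MHZ) <= GPS_GUARD_MHZ

# Sweep candidate 700-MHz uplink carriers in 1-MHz steps (assumed range).
risky = [f for f in range(777, 788) if second_harmonic_hits_gps(f)]
print(risky)  # carriers whose second harmonic falls near GPS L1
```

Carriers at the top of this assumed range double into the immediate vicinity of L1, which is exactly why device-level harmonic testing is called for.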

As always, other subtleties may not spring to mind when testing new technologies. For example, it is considered de rigueur to test any receiver design in the presence of noise. However, the qualities of emulated noise used in testing must be specifically designed to represent real-world signals as seen by the new receiver design. This has led to an entirely new proposed emulated noise definition to be used in LTE testing.
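As a baseline for such receiver tests, the emulated noise must at least be calibrated to a precise SNR, even before its statistics are shaped to represent real-world signals. A minimal NumPy sketch of adding complex additive white Gaussian noise (AWGN) at a target SNR:

```python
import numpy as np

def add_awgn(signal: np.ndarray, snr_db: float, rng=None) -> np.ndarray:
    """Add complex white Gaussian noise at the requested SNR (in dB)."""
    rng = rng or np.random.default_rng(0)
    sig_power = np.mean(np.abs(signal) ** 2)
    noise_power = sig_power / (10 ** (snr_db / 10))
    noise = np.sqrt(noise_power / 2) * (
        rng.standard_normal(signal.shape) + 1j * rng.standard_normal(signal.shape))
    return signal + noise

# QPSK-like symbols with unit power, degraded to 10 dB SNR
rng = np.random.default_rng(1)
qpsk = (np.sign(rng.standard_normal(10000))
        + 1j * np.sign(rng.standard_normal(10000))) / np.sqrt(2)
noisy = add_awgn(qpsk, 10.0)
noise = noisy - qpsk
measured_snr = 10 * np.log10(np.mean(np.abs(qpsk) ** 2)
                             / np.mean(np.abs(noise) ** 2))
print(f"measured SNR: {measured_snr:.2f} dB")
```

The point of the article stands: plain AWGN like this is only the starting point, and the proposed LTE noise definitions shape the statistics further to match what the new receivers actually see.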

Pain Point 2: MIMO

MIMO technology is one key to unlocking the higher data rates expected from LTE. Resources in both the frequency and time domains can be reused by increasing the number of antennas on both the mobile device and on network equipment.

MIMO also adds a very new dimension to long-term LTE test plans. Suddenly the orientation and spacing of antennas within the device are significant factors in the device’s performance. This means that traditional conducted-mode testing, where test equipment is cabled to the antenna of the device under test, must be supplemented with newer methods.

These new methods start with over-the-air (OTA) testing. This is not “field testing,” which yields results based on field scenarios that can’t be reliably repeated. While several methods of OTA testing have been investigated, the one that holds the most promise is the use of multiple transmitting antennas mounted in an anechoic chamber (Fig. 1). In this method, the signal to each antenna is run through a wireless channel emulator. This combination allows the creation of a very realistic yet repeatable RF environment that can be used to test mobiles.

The industry is also investigating an exciting new concept called “Virtual MIMO OTA.” In Virtual MIMO OTA testing, a mobile device is characterized in a multi-antenna anechoic chamber as described above. In characterization, the device’s RX sensitivity is measured as it is exposed to a series of signals transmitted from various angles.


Once data is collected, wireless channel emulators can be used to replicate the same effects in a conducted test scenario where the RF paths are directly cabled to the device under test (DUT). This approach offers a lot of promise, since the device only needs to be mounted in the chamber once, rather than every time the test is run.
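The conducted-replay idea can be sketched in miniature: a channel matrix "captured" during chamber characterization is later applied by an emulator to the cabled RF paths. The 2x2 NumPy example below is a toy model (random channel, noiseless link, zero-forcing receiver), not a description of any vendor's system:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 2x2 channel matrix "captured" during chamber characterization
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)

# Two independent transmit streams over 1000 symbol periods
tx = (rng.standard_normal((2, 1000)) + 1j * rng.standard_normal((2, 1000))) / np.sqrt(2)

# Conducted replay: the channel emulator applies H to the cabled RF paths
rx = H @ tx

# A zero-forcing receiver inverts the channel to recover the streams
recovered = np.linalg.inv(H) @ rx
print(np.allclose(recovered, tx))
```

In a real test stand the emulator applies time-varying, noisy channel realizations derived from the chamber data, but the matrix-multiply structure is the same.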

Another critical area involving MIMO is the idea of dynamic testing. A MIMO receiver (such as a MIMO-capable mobile device) is very sensitive to the orientation of its antennas in relation to the transmitter’s antennas. While it can be relatively simple to test a receiver at one orientation and then another, there is another, easy-to-miss test case requirement. What happens to receiver performance as the relative orientations change?

The capacity of a MIMO connection depends not only on the capabilities of the individual RF paths, but also on the complex relationships between paths. In this context, “correlation” can be thought of as a measure of how well the system can distinguish between paths; it is expressed mathematically as a complex matrix. In practice, these correlation matrices vary quickly and significantly.
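The effect of path correlation on capacity can be made concrete with the standard MIMO capacity formula, C = log2 det(I + (SNR/Nt) * H * H^H). The sketch below compares a fully decorrelated 2x2 channel with a highly correlated one at 10 dB SNR (the matrices are illustrative, not measured data):

```python
import numpy as np

def mimo_capacity_bps_hz(H: np.ndarray, snr_linear: float) -> float:
    """Shannon capacity (b/s/Hz) of a MIMO link, equal power per TX antenna."""
    n_rx, n_tx = H.shape
    m = np.eye(n_rx) + (snr_linear / n_tx) * (H @ H.conj().T)
    return float(np.real(np.log2(np.linalg.det(m))))

snr = 10 ** (10 / 10)  # 10 dB, linear
H_ideal = np.eye(2)                  # perfectly distinguishable paths
H_corr = np.array([[1.0, 0.95],
                   [0.95, 1.0]])     # highly correlated paths

c_ideal = mimo_capacity_bps_hz(H_ideal, snr)
c_corr = mimo_capacity_bps_hz(H_corr, snr)
print(f"decorrelated: {c_ideal:.2f} b/s/Hz, correlated: {c_corr:.2f} b/s/Hz")
```

Even with identical per-path power, the correlated channel carries noticeably less data, which is why a test stand must reproduce realistic, time-varying correlation rather than static ideal paths.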

The only feasible way to meet this testing challenge is to incorporate wireless channel emulation that can effect these changes to the RF paths in real time. In an automated system, then, the way to meet this challenge is with a test stand that fully implements this capability and incorporates enough flexibility to meet requirements yet to come. More details in this area of testing are available from the 3rd Generation Partnership Project (3GPP) and from the European Cooperation in Science and Technology Action on Pervasive Mobile & Ambient Wireless Communications (COST2100).

All of this is aggregated with the fundamental complexities involved in wireless fading and spatial channel modeling. While test-equipment designs can shield the user from many of the intricacies involved, it can be useful to have a fundamental grasp of the principles; white papers on these topics are freely available from test-equipment vendors.

Pain Point 3: Multi-Band Multi-Mode Devices

One pressing issue is the fact that for the next several years at least, LTE will coexist with legacy networks. And not only will they coexist, but subscriber devices also will have to transparently shift technologies as they move in and out of LTE service areas. Subscribers expect seamless and transparent handovers while connected, and they expect to be free of unwarranted roaming charges.

While LTE offers an opportunity to realize the International Telecommunication Union’s 20-year-old vision of a globally accepted cellular standard, the group’s hopes for a global spectrum standard never came true. Depending on the regions targeted by the device, testing must cover multiple cellular technologies deployed in various bands, such as the 700-MHz band in North America and the 2.6-GHz band expected to be adopted in Europe.


Coexisting with legacy networks means that in the very near future we will see the commercial release of devices that can operate on WCDMA, high-speed packet access (HSPA), code-division multiple access (CDMA), Evolution-Data Optimized (EV-DO), and LTE networks. Many will also incorporate Wi-Fi and Bluetooth transceivers as well as satellite navigation receivers. The dual-mode or tri-mode devices of years past have evolved into multi-mode devices that require specialized testing.

Validating multi-mode conformance and performance falls into a broad category labeled radio resource management (RRM) testing. The first sub-category of RRM testing, multi-mode system selection (MMSS) testing, verifies that a device always picks the “best choice” signal when multiple signals from multiple technologies are available. This sounds simple, but verifying this in the lab requires the availability of highly controllable emulated networks that will appear at the device’s antenna as real networks radiating signals over the air.
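In outline, MMSS logic reduces to a prioritized selection over whatever candidate signals a scan turns up. The sketch below is purely illustrative: the priority table, the minimum-level threshold, and the tie-breaking rule are assumptions, not any operator's actual policy:

```python
# Hypothetical MMSS sketch: prefer the highest-priority technology that
# meets a minimum signal threshold, then the strongest signal within it.
TECH_PRIORITY = {"LTE": 3, "HSPA": 2, "EV-DO": 1}  # illustrative ordering
MIN_LEVEL_DBM = -110

def select_system(candidates):
    """candidates: list of (technology, level_dbm) tuples from a scan."""
    usable = [c for c in candidates if c[1] >= MIN_LEVEL_DBM]
    if not usable:
        return None
    # Rank by technology priority first, then by signal level.
    return max(usable, key=lambda c: (TECH_PRIORITY.get(c[0], 0), c[1]))

scan = [("HSPA", -75), ("LTE", -102), ("LTE", -95), ("EV-DO", -80)]
print(select_system(scan))
```

Note how the stronger HSPA signal loses to a weaker but higher-priority LTE signal; verifying this behavior on a real device requires the controllable emulated networks described above.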

Examples of standards-based RRM testing are the tests from Reference 1 called “E-UTRAN FDD-FDD intra frequency event triggered reporting with DRX=40ms under fading propagation conditions in synchronous cells” and “E-UTRAN FDD UE Timing Advance Adjustment Accuracy.”

Of course, there is much beyond simply testing for proper cell/technology/network selection. Devices are held responsible for making completely transparent in-service network transitions. The handover and blocking tests defined in Reference 1 are necessary starting points, but successful operators and device makers will expand these minimum test requirements to ensure operation under conditions that are likely to be encountered on live networks.

Neither cell-selection/re-selection nor handover testing is new, of course. But adding the complexities of LTE to traditional multi-technology testing represents a vast leap in the complexity of device testing. Even the addition of a single new technology creates multiple new possibilities for transition and interference cases. For example, a tri-mode device creates three inter-RAT (radio access technology) possibilities. Adding a fourth mode creates six inter-RAT possibilities, and adding a fifth bumps the number to 10.
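The combinatorial growth here is simply "n choose 2", and a few lines verify the counts quoted above:

```python
from itertools import combinations
from math import comb

# Illustrative mode list; any five radio technologies give the same counts.
modes = ["LTE", "WCDMA", "CDMA", "EV-DO", "Wi-Fi"]
for n in range(3, 6):
    pairs = list(combinations(modes[:n], 2))
    print(f"{n} modes -> {len(pairs)} inter-RAT pairs")
    assert len(pairs) == comb(n, 2)
```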

These vast increases in the complexities of testing apply not only to the DUT, but also to the test stands being used. For one thing, devices are now responsible for receiving a much wider variety of frequencies. For another, there is a much greater disparity among the radio-layer types of signals being received. The complexity of the added requirements requires some definite foresight and planning.

Pain Point 4: Data

LTE and its immediate predecessors have always been data-centric technologies, but LTE will be the conduit for mountains of data every second. This is only partially a function of LTE’s ability to deliver higher rates. It is mainly the result of new devices designed to invite data use. Projected growth rates for data use are staggering and, quite frankly, frightening to the operators who are expected to deliver it. Without the imminent deployment of more efficient data-carrying capabilities, today’s cellular networks would eventually choke.


To fully evaluate the data capability of a device, the device must first be quantified in terms of its rate capacities using different protocols, such as File Transfer Protocol (FTP) and User Datagram Protocol (UDP). This kind of testing must be run on both the uplink and downlink as well as in bi-directional mode. Second, since the radio environment is such a significant factor in device performance, similar testing must be run under various radio conditions, such as with a TX-diversity signal, under faded radio conditions, and with a spatially multiplexed radio link.

Finally, the same test stand can be used to quantify maximum data rates under swept-power conditions. Swept-power testing is a very useful procedure whereby signal power is slowly decreased while data rates are measured. This becomes an extremely handy metric. Operators can use these measurements to roughly calculate the differences in the cost of network support among various devices.
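The shape of a swept-power run can be simulated with a toy Shannon-style link model. The noise floor, bandwidth, and step size below are illustrative assumptions; a real test stand measures actual throughput via instrument control rather than computing it:

```python
import math

# Toy model: achievable rate as a function of received power.
NOISE_DBM = -100.0
BW_HZ = 10e6  # assume a 10-MHz channel

def rate_mbps(rx_power_dbm: float) -> float:
    snr = 10 ** ((rx_power_dbm - NOISE_DBM) / 10)
    return BW_HZ * math.log2(1 + snr) / 1e6

# Sweep downward from -60 dBm in 5-dB steps, as in a swept-power run.
sweep = [(p, round(rate_mbps(p), 1)) for p in range(-60, -101, -5)]
for power, rate in sweep:
    print(f"{power:5d} dBm -> {rate:6.1f} Mb/s")
```

Plotting rate against power in this way for competing devices is what exposes the per-device differences in network cost that the measurements described above are after.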

Over the past two years, this has become a highly visible measurement as technology analysts have discovered wide variations in data performance (and therefore in the strain on both network resources and customer-support services) among chipsets and commercially available devices. While these tests were performed using legacy technologies, the same effects are expected to be even more cost-significant on LTE networks.

Some of the more cost- and risk-conscious operators uncovered another data-oriented test case years ago: data retry testing. As subscribers become enamored of a wider range of applications written by a broader number of developers, it becomes harder to guard against rogue applications, which can demand inappropriate shares of network resources under certain conditions.

Suppose an application periodically checks into a network-based server. What happens if the server goes down? Will the application know not to launch into an infinite loop of connection requests? If the answer is no, and the application is a popular one, the hosting network can quickly be brought to its knees trying to service an unserviceable demand (Fig. 2).

The inherent danger can be mitigated by implementing data retry testing. This testing verifies that no matter what kind of network issue prevents connection (network resource or bandwidth exhaustion, simple connection issues), the device’s internal timers will properly control the frequency of connection attempts. References 2 and 3 provide one example of an excellent combination of requirements and risk mitigation.
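A common way to satisfy such requirements is an exponential-backoff retry timer. The sketch below is generic; the base interval, multiplier, and cap are illustrative values, not the operator-specified timers in References 2 and 3:

```python
# Generic exponential-backoff retry schedule (values are illustrative only;
# actual timer values are operator-specific).
def backoff_schedule(base_s: float = 2.0, factor: float = 2.0,
                     cap_s: float = 120.0, attempts: int = 8):
    """Return the wait (in seconds) before each successive retry attempt."""
    waits = []
    wait = base_s
    for _ in range(attempts):
        waits.append(min(wait, cap_s))
        wait *= factor
    return waits

print(backoff_schedule())
```

The growing intervals keep a popular application from hammering a downed server with connection requests, which is precisely the failure mode data retry testing is designed to catch.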


The promise of LTE is a great one: faster data delivered in a fiscally responsible way. The excitement generated by this promise comes with a set of technical hurdles, mostly related to the fundamental business of delivering large amounts of data over a wireless link. On the face of it, the scale of risk across the four main pain points can seem overwhelming in light of aggressive rollout schedules and the importance of getting deployments right.

There is some encouraging precedent, however, in ways to mitigate some fairly substantial and involved risks. Operators who have met these pain points head-on with historical insight, cooperation with handset vendors, and detailed plans to address specifics have traditionally enjoyed the fiscal fruits of these methods and have paved the way for others to follow.


References

1. 3GPP TS 36.521, “User Equipment (UE) conformance specification; Radio transmission and reception.”

2. Verizon Wireless Device Requirements: LTE Data Retry.

3. Verizon Wireless Compliance Test Plan: LTE Data Retry Test Plan.