Wireless Everywhere? Not Quite Yet...
Consumers now take wireless technology for granted, but standards, power, and other issues remain in the quest for universal cutting-edge functionality.
When Qualcomm sought to trademark the term “ubiquity” in the 1980s, the notion of always-available mobile voice service seemed far-fetched. Asked by an industry analyst about the market’s prospects, a Motorola executive responsible for the company’s new cellular communications product group reportedly predicted that “Motorola will not sell more than 1 million cellular phones.”
Today, of course, many of the more than 2 billion cellular subscribers worldwide are upset when they can’t get sufficient download speeds from the Internet to watch streaming video. Wireless communications—voice, data, and now video transport—has become an “anytime, anyplace” service. We no longer hope for wireless connectivity. We expect it.
Yet the evolution of wireless into a utility comparable to cable, satellite, and broadcast in access and service quality has only just begun. Today’s wireless operators focus on specific applications and capabilities, targeting customer types, providing national and international access, and using different combinations of transport technologies tailored to deliver individualized services.
Two fundamental technologies, digital communications and integrated signal processing, lie at the core of wireless communications’ evolution toward the “ubiquity” Qualcomm envisioned nearly 25 years ago, and they form the foundation for the future viability of high-quality anytime, anyplace wireless.
The ever-growing demand for wireless bandwidth, driven by the integration of Internet access, increased voice capacity, and quality of service, has rapidly pushed wireless transport technologies from analog FM into the complex domain of digital signaling. In addition, Internet access has shifted cellular technology from balanced uplink and downlink data rates to a downlink-heavy model.
One result of this shift is the current battle for broadband wireless supremacy between a new generation of data-optimized networks based on 802.16e (mobile WiMAX) and the next-generation cellular standard, Long Term Evolution (LTE).
Interestingly enough, both LTE and mobile WiMAX use orthogonal frequency-division multiple access (OFDMA) on the downlink, which optimizes spectral efficiency, enables high-capacity downloads from the Internet, and, as a result, creates a critical path for the evolution of broadband wireless services to the status of “utility.”
OFDMA lets system designers take advantage of state-of-the-art high-data-rate baseband technology developed for wireless local-area networks, or WLANs (today’s dominant 802.11g/n radios are built on the same underlying OFDM signaling). Improvements in signal processing power, physical footprint, and baseband power consumption (including its impact on device battery life) have enabled the levels of broadband performance that consumers have come to expect from their existing mobile devices.
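To make that shared OFDM foundation concrete, the sketch below builds a single OFDM symbol the way both an 802.11 baseband and an LTE/WiMAX downlink do at their core: QPSK data mapped onto orthogonal subcarriers via an inverse FFT, followed by a cyclic prefix. The parameters (64 subcarriers, a 16-sample prefix) are illustrative assumptions, not taken from any particular standard.

```python
# Minimal OFDM transmit sketch (illustrative parameters, not a
# standards-compliant 802.11 or LTE implementation).
import numpy as np

N_SC = 64      # subcarriers per OFDM symbol (assumed, 802.11a/g-like)
CP_LEN = 16    # cyclic-prefix length in samples (assumed)

rng = np.random.default_rng(0)

# One random QPSK symbol per subcarrier.
bits = rng.integers(0, 2, size=(N_SC, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# The inverse FFT places each QPSK symbol on its own orthogonal subcarrier.
time_symbol = np.fft.ifft(qpsk) * np.sqrt(N_SC)

# The cyclic prefix absorbs multipath delay spread so the subcarriers
# stay orthogonal at the receiver.
tx = np.concatenate([time_symbol[-CP_LEN:], time_symbol])

print(f"OFDM symbol with cyclic prefix: {tx.size} samples")
```

In OFDMA, the base station simply hands different subsets of those subcarriers to different users, which is what turns the modulation scheme into a multiple-access scheme.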
LTE and WiMAX differ in their uplink multiple-access approach: WiMAX continues to use OFDMA on the uplink to maximize data rates, while LTE adopts a single-carrier scheme, single-carrier frequency-division multiple access (SC-FDMA), that better suits cellular networks’ evolving mix of voice and data services. The crux of the LTE designers’ decision lies in the technical challenges that broadband wireless presents to the RF physical layer of the device uplink.
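Chief among those challenges is the high peak-to-average power ratio (PAPR) of an OFDMA waveform, which forces the handset’s power amplifier to back off from its most efficient operating point. The toy simulation below, using an assumed FFT size and a contiguous subcarrier allocation, shows how LTE’s DFT-spreading step (the essence of SC-FDMA) tames the transmit envelope relative to plain OFDMA.

```python
# PAPR comparison: plain OFDMA vs DFT-spread SC-FDMA uplink symbols.
# Toy parameters; real LTE uses different FFT sizes and pulse shaping.
import numpy as np

N_FFT = 512    # system FFT size (assumed)
M_USER = 128   # contiguous subcarriers granted to one uplink user (assumed)
N_SYM = 2000   # symbols to simulate

rng = np.random.default_rng(1)

def papr_db(x):
    """Peak-to-average power ratio per symbol, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max(axis=-1) / p.mean(axis=-1))

bits = rng.integers(0, 2, size=(N_SYM, M_USER, 2))
qpsk = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)

# OFDMA uplink: QPSK symbols go straight onto the user's subcarriers.
grid = np.zeros((N_SYM, N_FFT), dtype=complex)
grid[:, :M_USER] = qpsk
ofdma = np.fft.ifft(grid, axis=-1)

# SC-FDMA: an M-point DFT "spreads" the symbols first, restoring a
# single-carrier-like envelope that is kinder to the handset PA.
grid[:, :M_USER] = np.fft.fft(qpsk, axis=-1) / np.sqrt(M_USER)
scfdma = np.fft.ifft(grid, axis=-1)

print(f"median PAPR, OFDMA  : {np.median(papr_db(ofdma)):.1f} dB")
print(f"median PAPR, SC-FDMA: {np.median(papr_db(scfdma)):.1f} dB")
```

The several-dB reduction in PAPR translates directly into less back-off, and therefore more talk time, at the device’s power amplifier.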
Uplink and the RF Physical Layer
The most popular cellular phone of the mid-1990s was Motorola’s StarTAC. Its clamshell design marked a fundamental advance in phone portability. Unfortunately, it could also burn your ear on any call that lasted longer than 10 minutes. The excessive heat was a result of the low operating efficiency of the transmit RF power amplifier (PA). In fact, RF PA performance has been one of the longest-standing challenges in mobile device design.
The operating efficiency of the RF PA essentially defined mobile-phone battery life until the introduction of the gallium-arsenide heterojunction bipolar transistor (GaAs HBT) in the late 1990s. The GaAs HBT PA eventually pushed PA efficiency into the 60% range, meaning the device converts 60% of the energy it consumes into RF energy, with the other 40% lost as heat. This advance enabled the 8 to 10 hours of talk time now common in GSM/GPRS 2G mobile phones.
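A rough back-of-the-envelope calculation shows why PA efficiency dominates talk time. All of the numbers below (battery capacity, average transmit power, non-PA drain) are illustrative assumptions, not measured values:

```python
# Back-of-the-envelope talk-time estimate from PA efficiency.
# All numbers are illustrative assumptions, not measured values.
BATTERY_WH = 3.7            # 3.7-V, 1000-mAh battery, about 3.7 Wh (assumed)
TX_POWER_W = 0.25           # average radiated power, roughly 24 dBm (assumed)
OTHER_DRAIN_W = 0.15        # baseband, display, and RF overhead (assumed)

for efficiency in (0.30, 0.60):      # pre- and post-GaAs-HBT PA efficiency
    pa_draw_w = TX_POWER_W / efficiency          # DC power the PA consumes
    talk_hours = BATTERY_WH / (pa_draw_w + OTHER_DRAIN_W)
    print(f"PA efficiency {efficiency:.0%}: ~{talk_hours:.1f} h of talk time")
```

Under these assumptions, doubling PA efficiency nearly doubles talk time, because the PA dominates the handset’s power budget during a call.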
EDGE and WCDMA, on the other hand, introduced new technical challenges for mobile-phone PAs. Both standards put a premium on optimizing the tradeoff among adjacent-channel power (ACP), transmit power, and operating efficiency. As a result, today’s 3G phones typically offer about three hours of talk time for WCDMA and four hours for GSM/GPRS/EDGE, a limit that owes much to the increasingly complex RF power amplifiers these standards demand.
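That tradeoff can be sketched numerically. The toy model below drives a band-limited test signal through a Rapp-model PA (a common soft-saturation approximation for solid-state amplifiers, used here as an assumed stand-in, not a standards-compliant ACP measurement) at two back-off levels. Running the PA closer to saturation improves efficiency, but the nonlinearity spills power into the adjacent channel:

```python
# Spectral-regrowth sketch: a nonlinear PA raises adjacent-channel power
# (ACP), so higher drive (better efficiency) costs spectral purity.
# Toy waveform and PA model; all parameters are assumed.
import numpy as np

rng = np.random.default_rng(2)
N = 1 << 16     # number of baseband samples
OCC = 8         # signal occupies 1/OCC of the sampled band (assumed)

# Band-limited complex Gaussian noise as a stand-in for a modulated
# carrier: its out-of-band power starts at essentially zero.
K = N // (2 * OCC)
X = np.zeros(N, dtype=complex)
X[:K] = rng.standard_normal(K) + 1j * rng.standard_normal(K)
X[-K:] = rng.standard_normal(K) + 1j * rng.standard_normal(K)
x = np.fft.ifft(X)
x /= np.std(x)  # normalize to unit average power

def rapp_pa(v, a_sat=1.0, p=2.0):
    """Rapp AM/AM model: smooth saturation typical of solid-state PAs."""
    r = np.abs(v)
    return v / (1 + (r / a_sat) ** (2 * p)) ** (1 / (2 * p))

def acpr_db(v):
    """Upper first-adjacent-channel power relative to the main channel."""
    S = np.abs(np.fft.fft(v)) ** 2
    f = np.fft.fftfreq(v.size)
    bw = 0.5 / OCC                       # main-channel half-width
    main = S[np.abs(f) <= bw].sum()
    adj = S[(f > bw) & (f <= 3 * bw)].sum()
    return 10 * np.log10(adj / main)

for backoff_db in (9.0, 3.0):            # output back-off from saturation
    drive = 10 ** (-backoff_db / 20)
    y = rapp_pa(drive * x)
    print(f"{backoff_db:.0f} dB back-off: ACPR ~ {acpr_db(y):.1f} dB")
```

Backing the PA off keeps the spectrum clean but wastes battery; driving it hard does the opposite. Managing that tension is exactly what makes EDGE and WCDMA handset PA design so demanding.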