For end users, battery life is still one of the key metrics when selecting a handset. This is why, as more is asked of the battery in mobile handsets, manufacturers have been so keen to pursue any strategy that reduces current consumption. As a result, handset manufacturers have looked to minimise the peak handset transmit power wherever possible.
Ironically, in the data-centric 4G world, far from extending battery life, backing off handset transmit power as far as possible actually increases the drain on batteries. Worse, these "half-power handsets" also significantly reduce network coverage and data rates for all users.
A new approach is needed to overcome these significant performance limitations and consign half-power handsets to history. It may seem counter-intuitive, but the best way to extend handset battery life is surprisingly simple - turn the transmit power up to eleven.
Low power, mo' problems
In the voice-centric 2G and 3G era, the strategy of backing off transmit power made good sense and fitted neatly with network bandwidth allocation strategies. Network bandwidth allocation in those networks tended to focus on two key metrics: number of simultaneous users, and latency for voice calls.
Fig 1 A handset power consumption comparison between 2G, 3G and 4G transmissions.
Schedulers in the base station tended to maximise network 'size' by allocating the minimum amount of bandwidth to each user, increasing the number of simultaneous users, and limiting peak allocations. This low-bandwidth, low-power approach has one major benefit - minimising latency for each user, which was historically important for voice calls.
The transition to 4G has resulted in a significant change to the network, with dynamic bandwidth allocation on a per-timeslot basis, which is much better suited to 'bursty' data requirements. This allows the base station to allocate almost the entire channel to a single handset for a single timeslot, maximising the instantaneous data rate.
However, in LTE the transmit power from the handset is directly proportional to the number of Resource Blocks allocated in the frequency domain, with up to 20 dB (100x) difference between low-data-rate voice and high-rate bursts of data. This pushes up both the average and peak transmit power of the RF Power Amplifier (PA) in the handset. For the highest data rates, the handset is also able to use spectrally efficient 16QAM modulation, which pushes the peak power up by a further 1 dB (around 25%).
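The decibel arithmetic above can be sanity-checked with a short calculation. The 100x resource-block ratio and the extra 1 dB for 16QAM come from the text; the helper function itself is just illustrative:

```python
import math

def db(ratio):
    """Convert a linear power ratio to decibels."""
    return 10 * math.log10(ratio)

# Transmit power scales with the number of allocated Resource Blocks.
# Compare a 100-RB allocation (a full 20 MHz LTE channel) with a
# 1-RB, voice-like allocation:
rb_ratio = 100 / 1
print(f"100x RB allocation: +{db(rb_ratio):.0f} dB")  # +20 dB

# The extra 1 dB of peak power for 16QAM, expressed as a linear ratio:
extra_db = 1.0
print(f"+1 dB in linear terms: x{10 ** (extra_db / 10):.2f}")  # ~1.26, i.e. ~25% more
```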
As a result, many handset manufacturers and chipset vendors have released 4G products that fall well short of the 3GPP specifications for transmit power, claiming that it is not possible for them to transmit at full power.
FCC reports show how today's LTE handsets and terminals are often incapable of transmitting at full power, falling short by 2.5 dB or more even in the relatively easy 700-MHz bands. Manufacturers have taken advantage of specification loopholes that allow Maximum Power Reduction (MPR) of up to 3 dB for high-bandwidth LTE transmissions – just half the specified output power.
Table 1 An example FCC report showing how handset transmit power is being backed off by manufacturers.
This may make life easier for the RF front-end designer, but halving the peak transmit power requires handsets to transmit for twice as long for a given data payload - significantly degrading battery life. During those extra timeslots, the power-hungry LTE modem and apps processor have to stay awake, burning even more power.
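A toy energy model shows why this hurts battery life. The power figures below are assumptions for illustration only, not measured values; the key input from the text is that half the peak power means twice the airtime for the same payload, during which the modem and apps processor must stay awake:

```python
# Illustrative, assumed per-timeslot power figures (not measured values):
pa_full_mw = 1000.0   # PA power draw at full transmit power (assumption)
overhead_mw = 800.0   # modem + apps processor while awake (assumption)

def payload_energy(pa_mw, slots):
    """Relative energy to deliver a fixed payload over `slots` timeslots."""
    return (pa_mw + overhead_mw) * slots

full_power = payload_energy(pa_full_mw, slots=1)       # full power, 1 slot
half_power = payload_energy(pa_full_mw / 2, slots=2)   # 3 dB back-off, 2 slots

print(full_power, half_power)  # 1800.0 vs 2600.0 - roughly 44% more energy
```

The RF energy itself is unchanged (half the power for twice the time), but the fixed overhead of keeping the modem and apps processor awake doubles, so the half-power handset ends up consuming more total energy per payload.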
Half-power operation also hogs precious network resources, halving network throughput. It means that, more often than not, LTE network coverage is limited by the handset's transmit performance rather than the base station's. Nujira's own network modelling suggests that coverage can be reduced by as much as 32% as a result of half-power handsets.
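That 32% figure is broadly consistent with a simple path-loss calculation. Assuming a power-law path-loss model with an exponent of 3.5 (an assumed, typical urban value), a 3 dB reduction in handset transmit power shrinks cell range, and hence coverage area, by roughly a third:

```python
# Coverage shrinkage from a 3 dB handset power back-off, assuming a
# simple power-law path-loss model (exponent n = 3.5 is an assumption).
backoff_db = 3.0
n = 3.5

# Usable range scales as P^(1/n); coverage area scales as range squared.
range_ratio = 10 ** (-backoff_db / (10 * n))
area_ratio = range_ratio ** 2

print(f"Coverage area falls to {area_ratio:.0%}")  # ~67%, i.e. a ~33% reduction
```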