Posted by Cary Chin on August 9, 2012
This blog originally posted on the Low Power Engineering Community 5/10/12.
My goal this month was to determine the power efficiency of cellular communications on the new iPad—specifically to determine any difference in power efficiency between communications in 4G LTE vs. 3G.
We saw last month that the new 4G LTE data speed was extremely impressive. I’ve now seen speeds of up to 20 Mbps using 4G LTE for both downloads and uploads, many times faster than I’ve seen on 3G networks through any carrier, and even faster than my home internet connection. But there’s usually a price for that level of performance, and the question I’ve been trying to answer is: what is the cost of all that speed in terms of nWh/B (nanowatt-hours per byte)?
While this question seems fairly straightforward to answer, I ran into a number of interesting problems trying to take this month’s measurements. First of all, all the data I’ve taken over the last couple of years depends on running repeatable experiments and comparing multiple runs to determine the energy usage of controlled variables such as the display, wireless communications, sound, and other areas of interest. My first problem this month, as I started to look at 4G LTE communications, was that my test vehicle (the 2009 Star Trek movie) is no longer available for streaming via Netflix! Netflix has been having its own problems in the last year and has not renewed agreements with several of its movie content providers, and Star Trek was one of the casualties. Unfortunately for me, that was my standard test for gauging the energy consumption of movie streaming versus local playback. So my first task was to recreate my testing setup for movie streaming.
After looking at several possibilities, including other online content providers, cable options, and even home network sharing, I decided to pursue a path using the cloud. My new setup uses Dropbox for cloud storage, and I’ve uploaded several copies of Star Trek at varying resolutions (and file sizes). My locally played version of the movie is 1.96GB, which is too large to stream—in fact, that’s my entire monthly allocation of data! So I’ve created a 1GB version to use as my “high-resolution” version, and several smaller versions for comparison.
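As a sanity check that the 1GB version is even streamable in real time, the file size and the movie’s 2:06:46 runtime imply an average bitrate. The variable names below are mine; this is just back-of-the-envelope arithmetic on the figures in this post:

```python
# Back-of-the-envelope: average bitrate of the 1GB test file over the
# movie's 2:06:46 runtime. Figures come from this post's measurements;
# the calculation itself is only illustrative.
runtime_s = 2 * 3600 + 6 * 60 + 46      # 2:06:46 -> 7,606 seconds
size_bytes = 1e9                        # the 1GB "high-resolution" copy

avg_mbps = size_bytes * 8 / runtime_s / 1e6
print(f"average bitrate: {avg_mbps:.2f} Mbps")
```

At roughly 1 Mbps, even a mediocre cellular link can keep up with the stream; the real constraint is the monthly data cap, not the connection speed.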
I’ve just started collecting data, but my first data points are already interesting. Recall from last month that playing Star Trek (2:06:46) on my new iPad with the display and sound turned all the way down used about 2.1Wh of energy. Turning the display up to maximum brightness increased energy consumption to a whopping 11.5Wh—an additional 9.4Wh (or about 4.5W averaged over the runtime) just for the retina display backlighting! Adding sound increased energy consumption to 11.9Wh.
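The display-power figure above comes from subtracting the two runs. A quick sketch of that arithmetic, using the measurement values from this post (variable names are mine):

```python
# Energy breakdown from the playback measurements above.
MOVIE_HOURS = 2 + 6 / 60 + 46 / 3600   # 2:06:46 runtime in hours

baseline_wh = 2.1     # display and sound turned all the way down
max_bright_wh = 11.5  # display at maximum brightness
with_sound_wh = 11.9  # maximum brightness plus sound

backlight_wh = max_bright_wh - baseline_wh   # energy for backlighting
backlight_w = backlight_wh / MOVIE_HOURS     # average power draw
sound_wh = with_sound_wh - max_bright_wh     # incremental cost of audio

print(f"backlight: {backlight_wh:.1f}Wh (~{backlight_w:.1f}W avg), "
      f"sound: {sound_wh:.1f}Wh")
```

The average backlight draw works out to about 4.4–4.5W depending on rounding, consistent with the figure quoted above.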
Now with my new streaming setup, streaming the 1GB version of the movie (which looked GREAT, by the way) over 4G LTE brought energy consumption up to around 17.4Wh, adding another 5.5Wh for the wireless transfer of 1GB of data. And that’s just my first data point; we saw previously in experiments on 3G networks that low signal quality could increase the energy consumption of a data transfer by 2x-3x. At 5.5Wh of energy for 1GB of data, we now have our first data point on data transfer energy efficiency in 4G LTE: for video streaming, efficiency comes in at 5.5 nWh/B.
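The nWh/B figure is just the incremental streaming energy divided by the bytes transferred. Sketched below with the numbers from this post (variable names are mine):

```python
# Energy efficiency of the 4G LTE transfer, per the measurements above.
playback_wh = 11.9   # local playback: max brightness plus sound
stream_wh = 17.4     # same playback, streamed over 4G LTE

transfer_wh = stream_wh - playback_wh           # energy cost of the radio
bytes_moved = 1e9                               # the 1GB test file

nwh_per_byte = transfer_wh / bytes_moved * 1e9  # convert Wh/B to nWh/B
print(f"{nwh_per_byte:.1f} nWh/B")
```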
While all of the data isn’t in yet, we can see that fast communication certainly requires significant energy. But maybe even more importantly, it’s clear that the latest generation of cellular data communications is another game-changer. Data transfer speed is clearly no longer my biggest problem for wireless communication. The real issue is going to be cost. Current cellular data plans average between $10/GB and $20/GB, so streaming my 1GB version of Star Trek over a cellular network may look great, but the data transfer costs more than a ticket to the theatre. Well, at least the popcorn is cheaper. The bottlenecks are moving (or maybe just rotating) again!
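For completeness, the cost comparison, using the $10–$20/GB plan prices cited above (variable names are mine):

```python
# Dollar cost of streaming the 1GB movie once, at the plan rates above.
gb_streamed = 1.0
rate_low, rate_high = 10.0, 20.0   # $/GB, typical cellular data plans

cost_low = gb_streamed * rate_low
cost_high = gb_streamed * rate_high
print(f"one streamed viewing: ${cost_low:.0f}-${cost_high:.0f}")
```

At $10–$20 in data charges for a single viewing, the transfer alone really is in movie-ticket territory.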