Posted by Cary Chin on August 9, 2012
This blog originally posted on the Low Power Engineering Community 6/14/12.
In my ongoing quest to understand power efficiency at the product level of our amazing gadgets, I’ve been setting up experiments to compare 4G LTE and 3G data transmission. It seems to make intuitive sense that transferring data faster (my Speed Test runs show Verizon 4G LTE speeds up to 20Mbps, vs. a maximum of around 700Kbps on 3G in and around Palo Alto) should consume more energy, and there’s certainly plenty of “Internet intelligence” out there claiming that “4G is a battery hog!!” But as we engineers know, exclamations without any data don’t really mean anything.
First of all, a slight correction to last month’s post. I had gathered data showing a difference of 5.5Wh between watching a local copy of Star Trek and streaming a 1GB version through Dropbox, resulting in my first “energy efficiency of data transfer” number of 5.5nWh/B (nano-Watt-hours per Byte). It turns out I wasn’t careful enough about maintaining the proper number of significant figures: the actual streamed movie file was 1.1GB, so the correct result is 5.0nWh/B, a small but significant difference.
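For readers who want to check the arithmetic, here is the corrected calculation as a quick sketch (values are the measurements quoted above; I’m taking 1GB as 10^9 bytes):

```python
# Recomputing the "energy efficiency of data transfer" figure.
# Values from the experiment: 5.5 Wh delta between local playback
# and streaming; corrected file size of 1.1 GB (10^9 bytes per GB assumed).

delta_energy_wh = 5.5      # measured energy difference, Wh
file_size_bytes = 1.1e9    # streamed file size, bytes

nwh_per_byte = delta_energy_wh / file_size_bytes * 1e9  # Wh/B -> nWh/B
print(f"{nwh_per_byte:.1f} nWh/B")  # prints "5.0 nWh/B"
```

Rounding the file size to 1GB is what produced the original 5.5nWh/B figure.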
The other observation I made last month was that streaming an approximately 1GB movie repeatedly was going to be a problem on several fronts. First, my account on Dropbox was frozen due to “excessive activity” after a few tests. And the size of the file required a sustained data transfer rate of around 1.2Mbps, which ruled out any comparative testing on the Verizon 3G network (at least in my neighborhood). But the biggest problem was that current wireless data plans charge anywhere between $10 and $20 per GB of data, so each test data point was costing me about $30 (I’m still repeating all tests for verification). So I’ve created a much smaller version of the movie for streaming tests. This one scales things down to 240×112 pixel resolution, with a total file size of around 138MB. It looks surprisingly good in the iPhone/iPod form factor, but it’s definitely “chunky” (though watchable) on an iPad or computer. Quality is roughly comparable to what I used to see streaming from Netflix, especially over a slow network or in areas of poor coverage.
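As a rough check of those bitrates, assuming an approximately two-hour runtime for the movie (my assumption; the exact duration isn’t stated) and 10^9/10^6 bytes per GB/MB:

```python
# Back-of-the-envelope sustained-bitrate check, assuming a roughly
# 2-hour movie runtime (an assumption -- the exact length isn't given).

runtime_s = 2 * 3600  # assumed movie length, seconds

def sustained_mbps(size_bytes: float, seconds: float) -> float:
    """Average bitrate (Mbps) needed to stream size_bytes in seconds."""
    return size_bytes * 8 / seconds / 1e6

full = sustained_mbps(1.1e9, runtime_s)   # ~1.2 Mbps: exceeds 3G's ~700 Kbps
small = sustained_mbps(138e6, runtime_s)  # ~0.15 Mbps: comfortably within 3G
print(f"1.1 GB version: {full:.2f} Mbps; 138 MB version: {small:.2f} Mbps")
```

Under this assumption the 138MB version needs only about 150Kbps sustained, which is what makes the 3G comparison runs possible at all.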
Using this setup, I’ve gathered several interesting data points. From our previous experimental data, the display backlighting itself required about 9.4Wh of energy to watch the movie, and sound required another 0.4Wh, for a total of 9.8Wh. Streaming the entire movie in 4G LTE mode with full brightness and sound used 14.9Wh of energy, for a delta of 5.1Wh. This energy covers more than just the direct data transfer, of course, because the local playback runs could be done in “airplane mode” while the streamed runs obviously could not. Still, that 5.1Wh is surprisingly close to our previous result of 5.5Wh for streaming the 1.1GB version of the movie, suggesting that the amount of data transferred matters less than the elapsed time of the transfer. Running the streaming (138MB) test in 3G mode was even more surprising. It used the “same” (within my measurement limits) amount of energy! 5.1Wh was consumed in either 4G LTE or 3G mode.
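Here is the bookkeeping behind that delta, along with the per-byte figure it implies for the small file (all values are the measurements quoted above; 138MB taken as 138×10^6 bytes):

```python
# Energy bookkeeping for the streaming runs, using the measured values above.

backlight_wh = 9.4   # display backlight, full brightness, whole movie
sound_wh = 0.4       # audio
local_total_wh = backlight_wh + sound_wh   # 9.8 Wh local-playback baseline

stream_total_wh = 14.9                     # 4G LTE run, full brightness/sound
streaming_delta_wh = stream_total_wh - local_total_wh  # 5.1 Wh overhead

# Per byte, the small file's streaming overhead is far higher than the
# 5.0 nWh/B measured for the 1.1 GB file -- consistent with elapsed time,
# not byte count, dominating the cost.
small_file_nwh_per_byte = streaming_delta_wh / 138e6 * 1e9

print(f"streaming overhead: {streaming_delta_wh:.1f} Wh")
print(f"per-byte overhead (138 MB file): {small_file_nwh_per_byte:.0f} nWh/B")
```

That the 138MB file works out to roughly 37nWh/B, more than seven times the 5.0nWh/B of the large file for the same total energy, is exactly the elapsed-time effect described above.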
To gather data from a different angle, I ran the tests with sound muted and the display brightness at minimum, and the results got even more interesting. Again, the 4G LTE and 3G tests required the same amount of energy, in this case 4.25Wh to stream the 138MB movie. But when I ran the “control case,” just playing back the 138MB version of Star Trek locally in airplane mode, it required the SAME amount of energy again! Now I’m feeling more like I’m watching “Twilight Zone” than “Star Trek”! This is a little hard to explain, but I’m setting up more experiments to dig deeper.
At this point, I can’t say that I’ve gotten even one more data point to compare against or refine my initial figure of 5.0nWh/B from last month, but I sure have a lot more interesting questions to pursue. I’ll keep you all posted on any new developments, but a couple of conclusions can be drawn from the data so far.
The power consumption of the Retina Display is clearly the “long pole” on the new iPad: if you want longer battery life, simply turn down the brightness. The caveat is that I haven’t run the numbers to confirm that radio communication isn’t an equal factor, as we saw on the iPhone 4/4S. Still, it’s a reasonable hypothesis at this point that the power efficiency of communication using 4G LTE isn’t much worse than 3G (at least not enough to be obviously observable at the product level), so the relative energy consumption of the new display on the iPad will outweigh all other factors.