Posted by Cary Chin on August 9, 2012
This blog originally posted on the Low Power Engineering Community 7/12/12.
After some frustrating inconsistencies in measuring the energy efficiency of the new iPad in 3G and 4G LTE modes last month, I decided to start over and take a complete set of data. Although it takes a long time, sometimes it just makes sense to start again. The data I took this month isn’t completely consistent with my past measurements, but whatever is causing the inconsistencies between measurement sets (my theories range from rogue apps and spyware to battery degradation), the measurements I take over a short span of a week or so remain consistent to within the accuracy of my methods.
Fundamentally, I’m still using the built-in battery meter, which displays remaining energy at a resolution of 1% of the battery capacity. That means the absolute resolution gets worse as battery capacity increases, and the new iPad’s capacity is dramatically larger than its predecessors’. The other conclusion I’ve come to is that the new iPad isn’t the best vehicle for comparing the energy consumption of the 3G and 4G radios: the energy drain from communications is small relative to the retina display’s draw and the much higher battery capacity, so it’s hard to resolve. In fact, the iPhone 5 will be a much better platform for the comparison, because the energy used by radio communications will be a relatively larger percentage of the battery capacity, increasing our resolution for measurement. Still, the preliminary data on the iPad (below) is interesting.
First, the baseline—this was measured running a compressed version of the Star Trek movie (2:06:46, or almost exactly 2.1 hours), with brightness and sound turned all the way down, and airplane mode on. This is the most energy-efficient configuration of the iPad, and resulted in energy usage of 3.8Wh for the movie, or average power of 1.8 watts. The table shows the energy required in other test conditions.
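To make the bookkeeping behind these numbers explicit, here’s a small sketch of how a 1%-resolution battery-meter reading converts into energy and average power. The 42.5Wh capacity is an assumption (the new iPad’s nominal pack rating), not something measured in these tests:

```python
# Sketch: converting battery-meter percentage drops into energy and power.
# CAPACITY_WH is an assumed nominal figure for the new iPad's pack;
# the built-in meter only reports whole percentage points.

CAPACITY_WH = 42.5  # assumed nominal battery capacity, in watt-hours

def energy_used_wh(percent_drop: float) -> float:
    """Energy consumed for a given drop in the battery-meter percentage."""
    return CAPACITY_WH * percent_drop / 100.0

def avg_power_w(energy_wh: float, hours: float) -> float:
    """Average power over the run: energy divided by elapsed time."""
    return energy_wh / hours

# Baseline run: 3.8Wh over the 2.1-hour movie
print(round(avg_power_w(3.8, 2.1), 1))   # -> 1.8 (watts)

# Each 1% tick of the meter is worth this much energy
print(round(energy_used_wh(1.0), 3))     # -> 0.425 (Wh)
```

That last line also shows why the resolution issue bites: each tick of the meter hides nearly half a watt-hour.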
The Max Brightness case is a measure of energy consumption of the retina display, and just as before, the display really sucks up the energy. I measured 15.3Wh required for the 2.1-hour movie, or average power of 7.3W. That’s 5.5 watts of power difference from lowest to highest setting on the display! As I noted before, if you use your (5W) iPhone charger to charge your new iPad and continue to use it with the display at maximum brightness, you actually will be discharging your battery.
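The charger claim is just power bookkeeping; a quick sketch, using the measured 7.3W average draw and assuming the stock 5W iPhone adapter delivers its full rating:

```python
# Net charging rate = power in from the charger minus power drawn by the device.
# A negative result means the battery discharges even while plugged in.

CHARGER_W = 5.0         # assumed: stock iPhone adapter at its full 5W rating
MAX_BRIGHTNESS_W = 7.3  # measured average draw, retina display at max brightness

net_w = CHARGER_W - MAX_BRIGHTNESS_W
print(round(net_w, 1))  # -> -2.3 : the battery drains ~2.3W while "charging"
```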
The 4G LTE streamed case showed very little difference in energy consumption. The current theory is that the data transmission time is negligible in this case because the data is small, and transmission speed is very fast. For this application, fast transmission can buffer the data very quickly, so the time that the radio is active is very small compared to the two hours to play the movie. This is the “cheetah mode” case: very short spurts of high energy. It can be very efficient if the work done can be buffered and used over a relatively much longer period of time. Video streaming with a very fast connection is just such a case. Still, ZERO energy (to the limits of accuracy of this test) seems a little too good, so more investigation is warranted.
The 3G radio case shows something very different. Energy balloons to 8.0Wh, for an average power of 3.8W, or 2W just for the 3G radio transmission. Is it possible that the 4G radio consumed a negligible amount of energy for the whole movie, while the 3G radio consumed over 4Wh? Well, the data doesn’t lie. I re-ran the test a couple of times with similar results. What could be going on here? One of the things I noticed about my recent tests is that the radio signal seemed relatively weak. In quite a few cases the streamed version of the movie (frustratingly) failed to complete, and I was even unable to complete some SpeedTest runs. Under these conditions, we’ve seen energy efficiency drop considerably, and that’s my guess as to what’s happening here. In fact, this theory seems pretty consistent with our current data—in the 4G case, fast spurts of data allow a large buffer to be filled quickly while the signal is reasonable. In 3G mode, we’re depending on a constant connection, delivering data slowly and consistently over a long period of time. This might be the worst-case test for a weak radio signal.
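The radio-only figures come from subtracting the airplane-mode baseline from the streaming run; a sketch of that subtraction, using the numbers from the runs above:

```python
# Isolating the 3G radio's share by subtracting the airplane-mode baseline.
BASELINE_W = 1.8   # measured: airplane mode, minimum brightness
THREE_G_W = 3.8    # measured: same movie streamed over 3G
MOVIE_HOURS = 2.1  # runtime of the test movie

radio_w = THREE_G_W - BASELINE_W
radio_wh = radio_w * MOVIE_HOURS
print(round(radio_w, 1))   # -> 2.0 : watts attributable to the 3G radio
print(round(radio_wh, 1))  # -> 4.2 : Wh over the full movie
```

This is the same subtraction that gives (effectively) zero for the 4G LTE case, which is why that result seems almost too good.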
Of course, all of this is still just a theory. As usual, we’re not quite done with these measurements! But at least one conclusion is true: We’ve certainly found a counterexample to the general Internet claim that “4G is a power hog.” In fact, in this case, it’s just the opposite!