Posted by Cary Chin on August 9, 2012
Last month we saw some interesting data in our investigation of energy efficiency of 4G LTE transmission versus 3G transmission on the new iPad.
The data above suggest that in 4G LTE mode, the new iPad used roughly the same amount of energy (3.8Wh) for viewing our test movie (Star Trek, 2:06:46, 138MB) locally as for streaming it from the Internet. In contrast, using the 3G radio, streaming the movie consumed 8.0Wh of energy, a difference of 4.2Wh. The working theory was that the speed of transmission in 4G mode allowed data to be buffered very quickly, so most of the time was spent simply viewing the movie from the local buffer with no data transmission, and with the 4G LTE radio presumably in a low-power sleep state. This month, we’ll test that theory.
In order to determine whether our theory makes any sense, we need to understand the difference in transmission speed between the 4G LTE and 3G data transmissions. I took the following data using the Speed Test app, in both 4G LTE and 3G mode, using my Verizon iPad.
First, the data for each mode (columns 2 and 3) show a spread of about 5x from the slowest data point to the fastest (4.12:21.83 for 4G, and 0.2:1.1 for 3G). This is consistent with data we’ve seen previously; transmission speed is a strong function of signal strength, which varies considerably with terrain, distance to the nearest cell tower, and obstructions in between. In fact, I was amazed in Hawaii a few weeks ago that I was able to connect to the Internet on my phone while on a sailboat nearly four miles out to sea! With little interference and a direct line of sight, data transmission is obviously very different than in Palo Alto, where known dropouts have existed for years along major roads with little hope of improvement.
The average transmission speed in 4G mode over the 10 data points is 10.62Mbps, and 560Kbps in 3G. Those two differ by a factor of about 20x, and averaging the ratios of data points results in an average ratio of nearly 35x! That’s a pretty serious difference in transmission speed.
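A quick sanity check on those two figures (the averages are taken from the measurements above; note that the ratio of the averages and the average of the per-run ratios are genuinely different statistics, so 20x and 35x can both be right):

```python
# Average speeds from the ten Speed Test runs above
avg_4g_mbps = 10.62   # 4G LTE average
avg_3g_mbps = 0.56    # 3G average (560 Kbps)

# Ratio of the averages: the "about 20x" figure in the text
print(f"{avg_4g_mbps / avg_3g_mbps:.1f}x")  # ~19x

# The ~35x figure is a different statistic: the average of the ten
# per-run ratios (4G speed / 3G speed). A run with a very slow 3G
# measurement produces a huge individual ratio, which pulls the
# average of ratios above the ratio of averages.
```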
To get an idea of the “duty cycle” of transmission as a result of these speed measurements, we’ll look at the two extreme measurements. In the case of transmission at 21.83Mbps, for our movie length of 2:06:46, we can stream our entire 138MB movie in about 50 seconds, compared to a little over 90 minutes at the slowest speed of 200Kbps. So the 4G radio can be operating for as little as 50 seconds to stream the entire 2-hour movie, and the 3G radio could be transmitting for as long as 90 minutes. We haven’t proven exactly what was going on in our tests from last month, but the “cheetah mode” theory is certainly looking good.
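The duty-cycle arithmetic above can be sketched in a few lines (the movie size and the two extreme speeds are from the measurements above; this is a rough estimate that ignores protocol overhead and assumes the radio transfers at the measured rate):

```python
movie_megabits = 138 * 8      # 138 MB file -> 1104 megabits

fastest_4g_mbps = 21.83       # fastest 4G LTE measurement
slowest_3g_mbps = 0.2         # slowest 3G measurement (200 Kbps)

t_4g = movie_megabits / fastest_4g_mbps   # seconds at the 4G best case
t_3g = movie_megabits / slowest_3g_mbps   # seconds at the 3G worst case

print(f"4G best case: {t_4g:.0f} s")       # ~51 s
print(f"3G worst case: {t_3g / 60:.0f} min")  # ~92 min
```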
At this point, it’s a reasonable possibility that the difference between the 3G streaming energy (4.2Wh beyond local viewing) and the essentially unmeasurable difference in 4G could be explained almost entirely by the difference in duty cycle for transmission. With a factor of between 20x and 35x (and possibly higher) difference in duty cycle (or “ON time”), the theorized transmission energy in 4G would be about 0.1Wh to 0.2Wh, which is well within the margin of error of our measurements. Recall that our measurements on the new iPad are only accurate to +/- 1% of the battery capacity of 42.5Wh, or +/- 0.42Wh.
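Scaling the measured 3G transmission energy down by those duty-cycle ratios gives the estimate directly (a back-of-the-envelope that assumes radio energy scales linearly with ON time):

```python
energy_3g_wh = 4.2            # extra energy measured for 3G streaming (Wh)

for ratio in (20, 35):        # duty-cycle ratios from the speed measurements
    est_4g_wh = energy_3g_wh / ratio
    print(f"{ratio}x -> {est_4g_wh:.2f} Wh")   # 0.21 Wh and 0.12 Wh

# Measurement resolution: 1% of the 42.5 Wh battery capacity
resolution_wh = 0.01 * 42.5   # 0.425 Wh -- both estimates sit below
                              # the noise floor of our measurements
```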
So it looks like our measurements COULD make sense after all; to confirm, we’ll need a platform that reports battery capacity at finer resolution, in addition to both 3G and 4G capabilities. Luckily, the iPhone 5 is just around the corner! In the meantime, a few tests of (very) large file transfers might be very interesting.