Posted by Cary Chin on February 11, 2011
Thanks to my old buddy Dave Castro, I had the opportunity to spend a few hours with the latest iOS device to break sales records – the Verizon iPhone 4. I was excited to try out this device – we found from previous experiments on the AT&T iPhone 4 that its power efficiency varied greatly depending on signal strength, so much so that network activity eclipsed even the display as the number one energy consumer while streaming Star Trek! The assumption is that with more consistent coverage here in the San Francisco Bay Area, and around Palo Alto in particular, the Verizon iPhone might finally deliver on the promise of a more dependable network and more consistent network performance. Most early tests have shown generally better coverage on Verizon than AT&T, although the same tests have confirmed generally faster network performance on AT&T when you can get a signal. So here’s the moment of truth – will we be able to measure the difference in power efficiency of these two devices relative to signal strength? Does the Verizon iPhone 4 exhibit similar runaway energy usage in areas of low signal? Here we go.
First test – I’m going straight for the jugular: stream Star Trek to the Verizon iPhone 4 in my office, where I can just barely get a signal on my AT&T iPhone 4. This is the case where I wasn’t even able to get through the movie on a full battery charge. On AT&T, with just about 1 bar of signal strength, the measurements extrapolated to 5.3 Wh of energy to play the roughly 2 hours and 6 minutes of Star Trek. On Verizon, I got 3+ bars of signal strength on average, and streamed the movie with no problem using 3.15 Wh of energy. It’s looking like a landslide victory!
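For the curious, here’s a minimal sketch of how that extrapolated figure can be computed. This is my own simple linear scaling of measured battery drain up to the movie’s full running time, not a description of any particular measurement tool; it uses Apple’s rated 5.25 Wh capacity for the iPhone 4 battery, and the numbers in the example call are purely illustrative.

```python
# Hedged sketch: scale a partial-playback battery-drain measurement up to
# the full ~2h06m (126 min) movie. Assumes battery discharge is linear,
# which is a simplification of real battery behavior.
BATTERY_WH = 5.25  # Apple's rated iPhone 4 battery capacity

def extrapolate_energy_wh(percent_drained, minutes_played, movie_minutes=126.0):
    """Estimate energy for the whole movie, scaled from a partial run."""
    measured_wh = BATTERY_WH * percent_drained / 100.0
    return measured_wh * movie_minutes / minutes_played

# Illustrative numbers only: a phone that burns its entire charge in
# about 124.8 minutes extrapolates to roughly 5.3 Wh for the full movie,
# more than one battery's worth, which is why such a run can't finish.
full_movie_wh = extrapolate_energy_wh(100, 124.8)
print(round(full_movie_wh, 2), full_movie_wh > BATTERY_WH)
```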
Driving home, I monitored the signal strength on the Verizon iPhone – it varied between 2 and 5 bars, even in my well-travelled and well-known “dead spots” where I’ve dropped many calls and data connections. Another moral victory for Verizon. At this point, I’m about ready to chuck my iPhone back at the AT&T rep, and jump ship. But a little more testing is in order…
Next test is simply to rerun the Star Trek streaming experiment at my house – I chose to do it downstairs, where coverage is typically a little worse. And here’s where it gets strange. Both phones showed about 2 bars of coverage – the Verizon iPhone varied between 2 and 3 bars, and the AT&T iPhone fluctuated more, between 1 and 3 bars. No big surprise yet, but the bottom was about to drop out.
Streaming Star Trek under these conditions, the first big observation was that the picture on the Verizon iPhone was clearly pixelated – it was having trouble getting the streamed data fast enough, particularly in the fast-moving action sequences. The AT&T iPhone showed some similar artifacts, but only occasionally. This continued, and got somewhat worse throughout the movie – the Verizon iPhone stopped playing a dozen or so times during the movie, waiting for the data to catch up. The final results: the Verizon iPhone consumed 4.3 Wh of energy, and the AT&T iPhone consumed 3.15 Wh! No, that’s not a typo – the winning number, 3.15 Wh, or 60% of the 5.25 Wh battery capacity, was exactly the same as where we started this round of the experiments, except on the other phone! That’s some poetic justice.
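The battery-fraction arithmetic is easy to check. A quick calculation in Python, using the 5.25 Wh capacity and the two measured energy figures quoted above:

```python
# Sanity check of the battery-fraction arithmetic. The 5.25 Wh capacity
# and both energy figures are the values quoted in the text.
BATTERY_WH = 5.25   # iPhone 4 battery capacity
VERIZON_WH = 4.3    # Verizon iPhone 4, low-coverage streaming run
ATT_WH = 3.15       # AT&T iPhone 4, same conditions

print(round(ATT_WH / BATTERY_WH * 100))      # 60 -> 60% of a full charge
print(round(VERIZON_WH / BATTERY_WH * 100))  # 82 -> 82% of a full charge
```

So under roughly equal, poor coverage, the Verizon run used over a fifth more of the battery than the AT&T run.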
It’s now past midnight, and I still have a few more tests to run – it seems our showdown wasn’t nearly as conclusive as I expected. One possible explanation relates back to network coverage vs. network speed. Better coverage is a good thing, but there’s an interesting inversion in equally low-coverage areas – higher network speed might help make up for a poor signal in data-intensive applications by allowing the application to buffer ahead whenever the signal is good, riding out the stretches where it isn’t. This, of course, is only a late-night random theory, but an interesting one to pursue. Things just get curiouser and curiouser. If you have any interesting ideas, send them along, and we’ll continue the experiments together.