Absolute Power
  • About

    Cary Chin is Director of Technical Marketing at Synopsys. His background at Synopsys is in R&D, where he has managed the Power Compiler, PrimePower, PrimeTime PX, and DC-FPGA products.

    Cary is a member of the Solutions Marketing Group, and focuses on the Synopsys Eclypse Low Power Solution.

4G LTE vs. 3G Power Mystery Explained

Posted by Cary Chin on August 9th, 2012

 

Last month we saw some interesting data in our investigation of energy efficiency of 4G LTE transmission versus 3G transmission on the new iPad.

The data above suggests that in 4G LTE mode, the new iPad used roughly the same amount of energy (3.8Wh) for viewing our test movie (Star Trek, 2:06:46, 138MB) locally vs. streaming it from the Internet. In contrast, using the 3G radio, streaming the movie consumed 8.0Wh of energy, a difference of 4.2Wh. The working theory was that the speed of transmission in 4G mode allowed data to be buffered very quickly, so most of the time is spent just viewing the movie from the local buffer with no data transmission, and with the 4G LTE radio presumably in a low power sleep state. This month, we’ll test that theory.

In order to determine whether our theory makes any sense, we need to understand the difference in transmission speed between the 4G LTE and 3G data transmissions. I took the following data using the Speed Test app, in both 4G LTE and 3G mode, using my Verizon iPad.

First, the data for each mode (columns 2 and 3) shows a spread of about 5x from the slowest data point to the fastest data point (4.12 to 21.83Mbps for 4G, and 0.2 to 1.1Mbps for 3G). This is consistent with data we’ve seen previously; transmission speed is a strong function of signal strength, which varies considerably based on terrain, distance, and obstructions to the nearest cell tower. In fact, I was amazed in Hawaii a few weeks ago that I was able to connect to the Internet on my phone while on a sailboat nearly four miles out to sea! With little interference and a direct line of sight, data transmission is obviously very different than in Palo Alto, where known dropouts have existed for years along major roads with little hope of improvement.

The average transmission speed in 4G mode over the 10 data points is 10.62Mbps, and 560Kbps in 3G. Those two differ by a factor of about 20x, and averaging the ratios of data points results in an average ratio of nearly 35x! That’s a pretty serious difference in transmission speed.

To get an idea of the “duty cycle” of transmission as a result of these speed measurements, we’ll look at the two extreme measurements. In the case of transmission at 21.83Mbps, for our movie length of 2:06:46, we can stream our entire 138MB movie in about 50 seconds, compared to a little over 90 minutes at the slowest speed of 200Kbps. So the 4G radio can be operating for as little as 50 seconds to stream the entire 2-hour movie, and the 3G radio could be transmitting for as long as 90 minutes. We haven’t proven exactly what was going on in our tests from last month, but the “cheetah mode” theory is certainly looking good.
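These transfer-time figures are simple arithmetic on the file size; a quick sketch in Python, treating 138MB as 138 x 8 megabits:

```python
# Radio "ON time" needed to transfer the 138MB test movie at the
# fastest 4G LTE reading and the slowest 3G reading from the table.

MOVIE_MB = 138               # file size in megabytes
MOVIE_MBITS = MOVIE_MB * 8   # ~1104 megabits to transfer

def stream_seconds(speed_mbps):
    """Seconds of active transmission to move the whole file."""
    return MOVIE_MBITS / speed_mbps

fast_4g = stream_seconds(21.83)   # fastest 4G LTE measurement
slow_3g = stream_seconds(0.2)     # slowest 3G measurement (200 Kbps)

print(f"4G LTE @ 21.83 Mbps: {fast_4g:.0f} s")    # ~51 s
print(f"3G @ 200 Kbps: {slow_3g / 60:.0f} min")   # ~92 min
```

So the radio is active for under a minute in the best 4G case, versus most of the movie's runtime in the worst 3G case.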

At this point, it’s a reasonable possibility that the 3G radio energy consumption of 4.2Wh for the movie vs. zero in 4G could be nearly all explained by the difference in duty cycle for transmission. With a factor of between 20x and 35x (and possibly higher) difference in duty cycle (or “ON time”), the theorized energy in 4G would be about 0.1Wh to 0.2Wh, which is well within the margin of error of our measurements. Recall that our measurements on the new iPad are only accurate to +/- 1% of the battery capacity of 42.5Wh, or +/- 0.42Wh.
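Scaling the measured 3G overhead by the duty-cycle ratio makes the point concrete; a minimal sketch using the numbers above:

```python
# If radio energy scales with ON time, the 3G radio's extra 4.2Wh
# should shrink by the 4G speed-up factor (20x-35x from the speed data),
# landing below the battery meter's resolution.

EXTRA_3G_WH = 4.2              # measured 3G streaming overhead
RESOLUTION_WH = 42.5 * 0.01    # +/-1% of the 42.5Wh battery

for speedup in (20, 35):
    predicted_4g_wh = EXTRA_3G_WH / speedup
    hidden = predicted_4g_wh < RESOLUTION_WH
    print(f"{speedup}x: predicted 4G radio energy = {predicted_4g_wh:.2f}Wh "
          f"(below measurement floor: {hidden})")
```

Either ratio predicts a 4G radio energy that the 1% battery meter simply cannot see.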

So it looks like our measurements COULD make sense after all; in order to confirm, we’ll need a platform with better resolution of battery capacity, in addition to 3G and 4G capabilities. Luckily, the iPhone 5 is just around the corner! In the meantime, a few tests of (very) large file transfers might be very interesting.


Posted in Uncategorized | Comments Off

iPad: 4G Power Hog?

Posted by Cary Chin on August 9th, 2012

This blog was originally posted on the Low Power Engineering Community 7/12/12. http://chipdesignmag.com/lpd/absolute-power

 

After some frustrating inconsistencies in measuring energy efficiency of the new iPad in 3G and 4G LTE modes last month, I decided to start over and take a complete set of data. Although it takes a long time, sometimes it just makes sense to start again. The data I took this month isn’t completely consistent with my past measurements, but whatever is causing the inconsistencies between measurement sets (my theories range from rogue apps and spyware to battery degradation), the measurements I take over a short span of a week or so at least remain consistent to within the accuracy of my measurements.

Fundamentally, I’m still using the built-in battery meter, which displays remaining energy to a resolution of 1% of the battery capacity. But in absolute terms that accuracy gets worse as battery capacity increases, particularly with the new iPad. The other conclusion I’ve come to is that the new iPad isn’t the best vehicle for comparing energy consumption of the 3G and 4G radios, because the energy drain from communications is small relative to the retina display draw and the new, dramatically higher battery capacity. In fact, the iPhone 5 will be a much better platform for doing the comparison, because the energy usage due to radio communications will be a relatively larger percentage of the battery capacity, increasing our resolution for measurement. Still, the preliminary data on the iPad (below) is interesting.

First, the baseline—this was measured running a compressed version of the Star Trek movie (2:06:46, or almost exactly 2.1 hours), with brightness and sound turned all the way down, and airplane mode on. This is the most energy-efficient configuration of the iPad, and resulted in energy usage of 3.8Wh for the movie, or average power of 1.8 watts. The table shows the energy required in other test conditions.

The Max Brightness case is a measure of energy consumption of the retina display, and just as before, the display really sucks up the energy. I measured 15.3Wh required for the 2.1-hour movie, or average power of 7.3W. That’s 5.5 watts of power difference from lowest to highest setting on the display! As I noted before, if you use your (5W) iPhone charger to charge your new iPad and continue to use it with the display at maximum brightness, you actually will be discharging your battery.
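These average-power figures are just energy divided by runtime; a quick check in Python, treating 2:06:46 as the 2.1 hours quoted above:

```python
# Average power for the movie test = energy used (Wh) / runtime (h).

RUNTIME_H = 2.1   # 2:06:46, "almost exactly 2.1 hours"

min_brightness_w = 3.8 / RUNTIME_H     # airplane mode, brightness/sound down
max_brightness_w = 15.3 / RUNTIME_H    # same movie at max brightness
display_delta_w = max_brightness_w - min_brightness_w

print(f"baseline: {min_brightness_w:.1f}W")        # ~1.8W
print(f"max brightness: {max_brightness_w:.1f}W")  # ~7.3W
print(f"display swing: {display_delta_w:.1f}W")    # ~5.5W
```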

The 4G LTE streamed case showed very little difference in energy consumption. The current theory is that the data transmission time is negligible in this case because the file is small and the transmission speed is very fast. For this application, fast transmission can buffer the data very quickly, so the time that the radio is active is very small compared to the two hours to play the movie. This is the “cheetah mode” case: very short spurts of high energy. It can be very efficient if the work done can be buffered and used over a relatively much longer period of time. Video streaming with a very fast connection is just such a case. Still, ZERO energy (to the limits of accuracy of this test) seems a little too good, so more investigation is warranted.

The 3G radio case shows something very different. Energy balloons to 8.0Wh, for an average power of 3.8W, or 2W just for the 3G radio transmission. Is it possible that the 4G radio consumed a negligible amount of energy for the whole movie, while the 3G radio consumed over 4Wh? Well, the data doesn’t lie. I re-ran the test a couple of times with similar results. What could be going on here? One of the things I noticed about my recent tests is that the radio signal seemed relatively weak. In quite a few cases the streamed version of the movie (frustratingly) failed to complete, and I was even unable to complete some SpeedTest runs. Under these conditions, we’ve seen energy efficiency drop considerably, and that’s my guess as to what’s happening here. In fact, this theory seems pretty consistent with our current data—in the 4G case, fast spurts of data allow a large buffer to be filled quickly while the signal is reasonable. In 3G mode, we’re depending on a constant connection, delivering data slowly and consistently over a long period of time. This might be the worst-case test for a weak radio signal.

Of course, all of this is still just a theory. As usual, we’re not quite done with these measurements! But at least one conclusion is true: We’ve certainly found a counterexample to the general Internet claim that “4G is a power hog.” In fact, in this case, it’s just the opposite!


iPad Power Mysteries

Posted by Cary Chin on August 9th, 2012

This blog was originally posted on the Low Power Engineering Community 6/14/12. http://chipdesignmag.com/lpd/absolute-power

 

In my ongoing quest to understand power efficiency at the product level of our amazing gadgets, I’ve been trying to set up experiments to compare 4G LTE and 3G data transmission. It seems to make intuitive sense that transferring data faster (my Speed Test runs show Verizon 4G LTE speed up to 20Mbps, vs. a maximum of around 700Kbps in 3G in and around Palo Alto) should consume more energy, and there’s certainly plenty of “Internet intelligence” out there claiming that “4G is a battery hog!!” But as we engineers know, these exclamations without any data don’t really mean anything.

First of all, a slight correction to my posting last month. I had gathered data showing a difference of 5.5Wh between watching a local copy of Star Trek and streaming a 1GB version through Dropbox, resulting in my first “energy efficiency of data transfer” number of 5.5nWh/B (nano-Watt-hours per Byte). It turns out I wasn’t that careful about maintaining the proper number of significant figures. The actual streamed movie file was 1.1GB, so the correct result was actually 5.0nWh/B; a small but significant difference.
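The correction is just the same division with the right file size; a short sketch:

```python
# Energy-per-byte efficiency, and how the 1.0GB -> 1.1GB file-size
# correction moves the result.

def efficiency_nwh_per_byte(energy_wh, size_gb):
    """Convert Wh per GB into nanoWatt-hours per byte."""
    size_bytes = size_gb * 1e9
    return energy_wh / size_bytes * 1e9   # Wh/B -> nWh/B

rough = efficiency_nwh_per_byte(5.5, 1.0)       # assumed 1GB file
corrected = efficiency_nwh_per_byte(5.5, 1.1)   # actual 1.1GB file

print(f"1.0GB assumption: {rough:.1f} nWh/B")      # 5.5 nWh/B
print(f"1.1GB actual:     {corrected:.1f} nWh/B")  # 5.0 nWh/B
```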

The other observation I made last month was that streaming an approximately 1GB movie repeatedly was going to be a problem on several fronts. First, my account on Dropbox was frozen due to “excessive activity” after a few tests. And the size of the file required a sustained data transfer rate of around 1.2Mbps, which ruled out any comparative testing on the Verizon 3G network (at least in my neighborhood.) But the biggest problem was that current wireless data plans charge anywhere between $10 and $20 per GB of data, so each test data point was costing me about $30 (I’m still repeating all tests for verification). So I’ve created a much smaller version of the movie for streaming tests. This one scales things down to 240×112 pixel resolution, with a total file size of around 138MB. It looks surprisingly good in the iPhone/iPod form factor, but it’s definitely “chunky” (but watchable) on any iPad or computer. Quality is roughly comparable to what I used to see streaming from Netflix, especially with a slow network or in areas of poor coverage.

Using this setup, I’ve gathered several interesting data points. From our previous experimental data, the display backlighting itself required about 9.4Wh of energy to watch the movie, and sound required another 0.4Wh, for a total of 9.8Wh. Streaming the entire movie in 4G LTE mode with full brightness and sound used 14.9Wh of energy, for a delta of 5.1Wh of energy. This energy is used for more than just the direct data transfer, of course, because the local runs could be run in “airplane mode” while the streamed runs obviously could not. Still, that 5.1Wh of energy is surprisingly close to our previous result of 5.5Wh for streaming the 1.1GB version of the movie, suggesting that the amount of data transferred is less important than the elapsed time for the transfer. Running the streaming (138MB) test in 3G mode was even more surprising. It used the “same” (within my measurement limits) amount of energy! 5.1Wh was consumed in either 4G LTE or 3G mode.
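The 5.1Wh delta falls straight out of the energy bookkeeping; a minimal sketch, keeping in mind (as noted above) that this remainder lumps baseline playback together with the radio, since the streamed runs can't use airplane mode:

```python
# Energy budget for the full-brightness 4G LTE streaming run,
# using the measured Wh figures from the text.

TOTAL_STREAMED_WH = 14.9   # streaming, full brightness and sound
DISPLAY_WH = 9.4           # backlighting (from earlier experiments)
SOUND_WH = 0.4             # sound

remainder_wh = TOTAL_STREAMED_WH - (DISPLAY_WH + SOUND_WH)
print(f"energy beyond display+sound: {remainder_wh:.1f}Wh")  # 5.1Wh
```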

To gather data from a different angle, I ran the tests with sound muted and the display brightness at minimum, and got even more interesting data. Again, both the 4G LTE and 3G tests require the same amount of energy, in this case 4.25Wh to stream the 138MB movie. But when I ran the “control case,” just playing back the 138MB version of Star Trek locally in airplane mode, it required the SAME amount of energy again! Now I’m feeling more like I’m watching “Twilight Zone” than “Star Trek”! This is a little hard to explain, but I’m setting up some more experiments to dig a little deeper.

At this point, I can’t say that I’ve even gotten one more data point to compare or refine my initial data point of 5.0nWh/B from last month, but I sure have a lot more interesting questions to pursue. I’ll keep you all posted on any new developments, but a couple of conclusions can be drawn based on the data so far.

The power consumption of the Retina Display is clearly the “long pole” on the new iPad. If you want longer battery life, simply turn down the brightness. The caveat is that I haven’t run any numbers to confirm that radio communication isn’t an equally large factor, as we saw on the iPhone 4/4S. But it’s a pretty reasonable hypothesis at this point that the power efficiency of communication using 4G LTE isn’t much worse than using 3G (at least not enough to be obviously observable at the product level), so the relative energy consumption of the new display on the iPad will outweigh all other factors.


4G LTE vs. 3G on the new iPad

Posted by Cary Chin on August 9th, 2012

This blog was originally posted on the Low Power Engineering Community 5/10/12. http://chipdesignmag.com/lpd/absolute-power

 

My goal this month was to determine the power efficiency of cellular communications on the new iPad—specifically to determine any difference in power efficiency between communications in 4G LTE vs. 3G.

We saw last month that the new 4G LTE data speed was extremely impressive. I’ve now seen speeds of up to 20 Mbps using 4G LTE for both downloads and uploads, many times faster than I’ve seen on 3G networks through any carrier, and even faster than my home internet connection. But there’s usually a price for that level of performance, and the question I’ve been trying to answer is, what is the cost for all that speed in terms of nWh/B (nanoWatt hours per byte)?

While this question seems fairly straightforward to answer, I ran into a number of interesting problems trying to take this month’s measurements. First of all, all the data I’ve taken over the last couple of years depends on running repeatable experiments, and comparing multiple runs to determine the energy usage of controlled variables such as the display, wireless communications, sound, and other areas of interest. My first problem this month as I started to look at 4G LTE communications was that my test vehicle (the 2009 Star Trek movie) is no longer available for streaming via Netflix! Netflix has been having its own problems in the last year and has not renewed agreements with several of its movie content providers, and Star Trek was one of the casualties. Unfortunately for me, that was my standard movie test for gauging energy consumption of movie streaming versus local playing. So my first task was to recreate my testing setup for movie streaming.

After looking at several possibilities including other online content providers, cable options and even home network sharing, I decided to pursue a path using the cloud. My new setup is using Dropbox for cloud storage, and I’ve uploaded several copies of Star Trek with varying resolutions (and file sizes). My locally played version of the movie is 1.96GB, which is too large to stream. In fact, that’s my entire monthly allocation of data! So I’ve created a 1GB version to use as my “high-resolution” version, and several other smaller versions for comparison.

I’ve just started collecting data, but my first data points are already interesting. Recall last month that playing Star Trek (2:06:46) on my new iPad with the display and sound turned all the way down used about 2.1Wh of energy. Turning up the display to maximum brightness increased the energy consumption to a whopping 11.5Wh, an increase of around 9.4Wh (or 4.5W) just for the retina display backlighting! Adding sound increased energy consumption to 11.9Wh.

Now with my new streaming setup, streaming my 1GB version of the movie (which looked GREAT, by the way) over 4G LTE, energy consumption is up to around 17.4Wh, adding another 5.5Wh of energy for the wireless transfer of the 1GB of data. And that’s just my first data point. We saw previously in experiments on 3G networks that low signal quality could increase energy consumption of the data transfer by 2x-3x. So at 5.5Wh of energy for 1GB of data, we now have our first data point on data transfer energy efficiency in 4G LTE: for video streaming, efficiency comes in at 5.5nWh/B.

While all of the data isn’t in yet, we can see that fast communication does certainly require significant energy. But maybe even more importantly, it’s clear that the latest generation of cellular data communications is another game-changer. Data transfer speed is clearly no longer my biggest problem for wireless communication. The real issue is going to be cost. Current cellular data plans average between $10/GB and $20/GB, so streaming my 1GB version of Star Trek over a cellular network may look great, but the data transfer costs more than buying a ticket to the theatre—well, at least the popcorn is cheaper. The bottlenecks are moving (or maybe just rotating) again!


Secrets of the iPad 3

Posted by Cary Chin on August 9th, 2012

This blog was originally posted on the Low Power Engineering Community 4/5/12. http://chipdesignmag.com/lpd/absolute-power

I had pretty much decided not to purchase the new iPad (aka “iPad 3”) because I don’t use my first two iPads as much as I could. They are just something extra to carry around, and they don’t quite replace a laptop when I need one.

Of the rumored new features on the new iPad (Retina display, A5X processor and quad-core GPU, 5MP iSight camera, and 4G LTE), the only one that seemed significant for my use was the 4G communications, and that wasn’t quite enough to push me into those opening day lines. However, I’ve changed my mind, and am now playing with my new iPad. And the feature that pushed me over the edge only showed up at the last minute on the new feature rumor list—a significantly bigger battery!

Battery capacity and charging mysteries explained
Battery capacity should really be the headline for the new iPad. Battery capacity jumps 70% to a whopping 42.5Wh! That’s higher capacity than the 11” MacBook Air. The overall device is only marginally thicker and heavier than the iPad 2, but a 70% increase in battery capacity? The rumors were that Apple had come up with some magical new battery chemistry. Could it be that the rest of the world had fallen that far behind? Well, a closer analysis shows again that there is no magic. The new battery capacity of the iPad is very much in line with the size (and weight) increase of the new battery pack.

The weight of the battery pack has increased 54%, and the total volume of the pack has gone up by 77%, so the 70% increase in capacity is in the right ballpark. The amazing thing is how the overall product is put together with the same length and width dimensions and only a 0.6mm increase in depth, plus about 50 grams in net weight. With a net product-level increase of around 7% in size and weight, battery capacity has increased by 70%. The “magicians” in Cupertino have once again shocked the world with the same technology that everyone else has access to, but they designed it better.

The other chatter on the Internet concerns battery charging. There are many complaints about slow charging of the new iPad. This is no surprise, because while the battery capacity has increased by 70%, the charge rate hasn’t changed. Charging a 42.5Wh battery with a 10W charger will take a theoretical 4.25 hours, assuming 100% efficient charging. In reality, expect five- to six-hour charge times. Using a 5W (iPhone) charger or a computer USB port will take proportionally longer, since charging time is inversely proportional to the power supplied by the port. And all of this assumes the iPad is off (or at least not in use) during the charging. The answer to your charging problems: Use the included 10W adapter, and charge overnight.
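The ideal charge times follow directly from capacity over charger power; a quick sketch (real times will be longer, since charging isn't 100% efficient):

```python
# Theoretical minimum charge time = battery capacity / charger power.

CAPACITY_WH = 42.5   # new iPad battery

for charger_w in (10, 5):   # included 10W adapter vs. 5W iPhone charger
    hours = CAPACITY_WH / charger_w
    print(f"{charger_w}W charger: {hours:.2f} h minimum")
```

That's 4.25 hours on the 10W adapter and a theoretical 8.5 hours on the iPhone charger, before accounting for losses or use during charging.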

Retina display—good, bad, but not ugly
It’s good that the battery capacity has increased dramatically, because while the retina display looks great it’s a definite power hog. I ran my baseline “Star Trek” movie test on the new iPad, and basic power efficiency looked pretty good. Playing the entire movie in the “Max Battery” mode (airplane mode on, display and sound at minimum) required just 2.13Wh of energy. That’s even better than the iPad 2 at 2.25Wh. The basic hardware is getting more powerful, but energy efficiency is increasing even faster. Isn’t technology wonderful?

Once the brightness was turned up the display looked fabulous, especially displaying high-resolution pictures and computer-generated graphics. My movie test didn’t look much better on the new iPad vs. my iPad 2, though. It’s clear you’ll need higher definition (at least 1080p) video to see much improvement. And the news gets worse.

Given the baseline energy measurement, we can get a pretty good estimate on how much that beautiful display is going to cost in energy. Turning up the brightness to maximum on “Star Trek” increased the energy consumption to 11.48Wh! That means the additional energy needed just to run the display between minimum brightness and maximum brightness is 9.35Wh for the Star Trek movie (2:06:46), and power consumption just for the display is around 4.45 watts. Imagine using a 5W charger to try to charge this device. Turning on the display at max brightness immediately makes it impossible to charge the battery. All of the energy is used up just to keep that display nice and bright. The charging time issues will continue to swirl in the user community for some time, and have a real impact on how the device can be used.
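The charging claim checks out with the measured numbers; a minimal sketch comparing the device's average draw at max brightness against a 5W charger:

```python
# Can a 5W iPhone charger keep up with playback at max brightness?
# Uses the measured 11.48Wh for the 2:06:46 movie.

RUNTIME_H = 2.11                # 2:06:46 as hours
draw_w = 11.48 / RUNTIME_H      # whole-device average power
net_w = 5.0 - draw_w            # charger input minus device load

print(f"device draw: {draw_w:.1f}W")      # ~5.4W
print(f"net into battery: {net_w:.1f}W")  # negative: battery drains
```

The device draws more than the charger can supply, so the battery discharges even while plugged in.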

I believe this is a potential design flaw in the new iPad. What’s the point of a big, beautiful Retina display, if practicality forces you to turn down the brightness to the minimum usable level? And we’re just starting to see the other impact of viewing high-resolution video—gobs of data. Getting the most out of the Retina display will put many additional strains on the product, from device storage limitations to the cost of data plans.

From my standpoint, the Retina display, while beautiful, isn’t worth the high energy (and data) cost. I can’t see the individual pixels on the new iPad, but to tell you the truth, I can’t see them on the iPad 2 either, without high-powered reading glasses. Is it possible that Apple has lost sight of the fundamental design principle that “form follows function”? Knowing what I know now, and had there been the option, I would have ordered a custom new iPad with the old display, and been very happy with the 70% larger battery.

The killer app
While these issues will keep the blogosphere busy, there’s another application of my new iPad that is a clear winner. I got the Verizon 4G LTE iPad, and the communications speed is very impressive. I’m seeing network speeds up to 13 Mbps on LTE, compared to 700 Kbps in 3G. These bracket the network speeds of up to 4 Mbps I get on my iPhone 4S on the AT&T network (you know, the 4G network formerly known as 3G). With much better coverage in my area, Verizon 4G LTE is looking like a winner, and I really can’t wait for the iPhone 5!

On top of that, the Verizon personal hotspot feature is included with the service plan on the iPad, so for the same $20 I used to spend just to enable the personal hotspot on my iPhone, I get expanded coverage across both AT&T (iPhone) and Verizon (iPad) networks—plus a faster network connection to share. And with the iPad’s new larger battery, the killer app for me is using my new iPad as a WiFi hotspot. I was traveling last week, and two of us used the iPad as a WiFi hotspot for more than 12 hours. That’s more than a full day’s work, transferring a total of 297 MB of data in email and Web browsing. At the end of the 12 hours, the battery capacity was still at 57%! That’s incredible! And the irony is that this best and most impressive usage of the new iPad was run with the fancy new Retina display OFF.
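The implied average power of that hotspot session is worth a quick look; a sketch in Python, assuming the battery started from a full charge:

```python
# Implied average power of the 12-hour hotspot session,
# assuming the battery started at 100% and ended at 57%.

CAPACITY_WH = 42.5
used_fraction = 1.0 - 0.57           # 43% of the battery consumed
energy_wh = CAPACITY_WH * used_fraction
avg_power_w = energy_wh / 12         # spread over 12 hours

print(f"energy used: {energy_wh:.1f}Wh")       # ~18.3Wh
print(f"average power: {avg_power_w:.1f}W")    # ~1.5W
```

Around 1.5W average for a shared hotspot, less than the display at max brightness costs all by itself.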

Next time, I’ll look at power efficiency of the 4G LTE network vs. 3G. All that speed has got to cost something, right??


iOS 5 Power Problems

Posted by Cary Chin on August 9th, 2012

This blog was originally posted on the Low Power Engineering Community 3/8/12. http://chipdesignmag.com/lpd/absolute-power

Since the release of iOS 5 along with the iPhone 4S back in October, we’ve finally seen the conclusion of “antennagate” on the iPhone 4, only to be quickly replaced by “batterygate” as the new biggest complaint of iOS 5 users.

I’ve personally experienced some of the problems reported throughout the user community. iPhones and iPads seem to “hang” while trying to connect or communicate, and the result is battery drain of more than 1% per minute (on iPhone 4S), similar to the power used when streaming a video over a weak 3G connection. I’ve seen this problem occur while connected via wifi as well, with similar results—a warm device and a drained battery within an hour or so.

The common denominator seems to be iOS 5.0, which was patched less than a month after its initial release to version 5.0.1. The top billing on that update read, “Fixes bugs affecting battery life.” But my problems haven’t gone away, suggesting that there were quite a few of these “bugs” floating around in the original 5.0 release. I’ve gotten used to powering down my phone occasionally, which for me has been the only consistent way of resetting it deeply enough to correct the problem. I’ve tried to isolate and reproduce the problem, but haven’t had any luck.

Today will be the rumored announcement of the iPad 3, and if you believe the rumor mill, the long awaited iOS 5.1 update, which again is said to fix the “batterygate” problem. As we’ve discussed frequently, the complexity of today’s smartphones is hard to comprehend, especially because the entire hardware platform and software stack continue to evolve and expand rapidly. Identifying these kinds of problems is extremely difficult, but in our new reality of social networking, information from millions of users can be pooled together to help isolate problems as a community. This kind of “social debugging” is fundamentally changing the way we think about software development. And in the grand scheme of things, life is just one big beta test anyway, right?


Solar smartphones

Posted by Cary Chin on August 9th, 2012

 

This blog was originally posted on the Low Power Engineering Community 2/9/12. http://chipdesignmag.com/lpd/absolute-power

 

One of the more intriguing smartphone features that hasn’t yet reached the final product is the possibility of a solar cell embedded in the screen that would produce enough energy to power the device indefinitely (at least in the sun).

First rumored via an Apple patent application several years ago, recently prototyped by the French company Wysips, and even approached by reclaiming “wasted” light (http://www.phonescoop.com/articles/article.php?a=9703) on OLED displays, the days when your smartphone needs to be plugged in to be charged may finally be coming to an end. It’s certainly not an unfamiliar scenario. Calculators are just smartphones without the phone and a bunch of other stuff, and they’ve had this mode of operation for more than 30 years.

Current technology is targeting about 30 minutes of talk time in 1 hour of charging, which seems pretty reasonable. Of course, today’s smartphone functions go way beyond just talking on the phone. “Talk time” is dominated by the energy cost of cellular transmission, but the display is typically off and other data functions are inactive. That one hour of charging might translate into 10 minutes of streaming a video, or 5 minutes of interactive online gaming. On today’s iPhone 4S, most usage reports suggest around 9 hours of talk time on the 5300 mWh battery, working out to around 600 mW of power required to operate the phone function. That would put the target at around 300mW for a hypothetical iPhone 4S solar cell, or with the 2”x3” screen size, about 50 mW per square inch.
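The chain of estimates above is easy to reproduce; a sketch in Python, using the round figures from the text:

```python
# Talk-time power of an iPhone 4S, and the implied target for a
# screen-embedded solar cell delivering 30 min of talk per hour of sun.

BATTERY_MWH = 5300
TALK_HOURS = 9
talk_power_mw = BATTERY_MWH / TALK_HOURS   # ~590 mW to run the phone

# 30 min of talk per 1 h of charging -> cell supplies half the talk power.
solar_target_mw = talk_power_mw / 2        # ~295 mW
screen_sq_in = 2 * 3                       # roughly a 2"x3" screen
density_mw_per_sq_in = solar_target_mw / screen_sq_in

print(f"talk power: {talk_power_mw:.0f} mW")
print(f"solar target: {solar_target_mw:.0f} mW "
      f"({density_mw_per_sq_in:.0f} mW per square inch)")
```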

On a larger scale, I’ve recently installed solar panels on my house, composed of 65”x39” Yingli Solar panels of 235W each, which works out to about 93 mW per square inch. I’m a little surprised that the efficiency of the large (inexpensive) panels seems to be nearly twice as high as the target for a compact solar smartphone charger (probably because the panels on my roof are more than two inches thick, or five times as thick as the whole phone), but at least the target doesn’t seem out of reach. In another generation or two, you’ll be able to talk on the phone all day without draining the battery appreciably, assuming you’re not also using the on-the-fly foreign language translation option or grammar correction module at the same time.
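The panel's power density is worth checking against the roughly 50 mW per square inch phone target; a quick sketch:

```python
# Power density of a 235W rooftop panel vs. the ~50 mW/sq in target
# estimated for a screen-embedded smartphone solar cell.

PANEL_W = 235
PANEL_SQ_IN = 65 * 39            # 65"x39" = 2535 square inches

panel_density = PANEL_W * 1000 / PANEL_SQ_IN   # mW per square inch
phone_target = 50                              # mW per square inch

ratio = panel_density / phone_target
print(f"panel: {panel_density:.0f} mW/sq in, ~{ratio:.1f}x the phone target")
```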

So while it seems like we’re still a little ways away from having smartphone-class devices that will power themselves indefinitely, it’s not too early for the entrepreneur’s tip of the month: CLEAR smartphone cases…!


iPhone Antennagate Solved

Posted by Cary Chin on August 9th, 2012

 

This blog was originally posted on the Low Power Engineering Community 1/12/12. http://chipdesignmag.com/lpd/absolute-power

 

Last time we looked at the relative signal strengths on the iPhone 4S vs. the iPhone 4 using field test mode to gather data in a variety of locations around Silicon Valley. The data below shows signal strengths for the 4S and 4, with and without using the “death grip” on the phone.

We saw slightly better reception on the iPhone 4S than on the iPhone 4, but the really dramatic difference was the delta in signal strength with the “death grip” applied. The iPhone 4S showed a signal strength attenuation of around 6dB, which is significant, but the iPhone 4 showed a huge drop of 17dB, more than enough to cause dropped data and voice connections.
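Decibels are logarithmic, which is why the 17dB drop is so much worse than it sounds; a quick sketch converting the grip-induced deltas to power ratios (a difference between two dBm readings is a plain ratio in dB):

```python
# Convert "death grip" signal attenuation from dB to a power ratio.

def db_drop_to_ratio(drop_db):
    """A drop of drop_db decibels multiplies received power by this factor."""
    return 10 ** (-drop_db / 10)

for phone, drop_db in (("iPhone 4S", 6), ("iPhone 4", 17)):
    ratio = db_drop_to_ratio(drop_db)
    print(f"{phone}: -{drop_db} dB -> {ratio:.2f}x power "
          f"(~1/{1 / ratio:.0f} of the signal)")
```

So the 4S keeps about a quarter of its received power under the grip, while the iPhone 4 keeps only about a fiftieth.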

The next question is, what’s the relative performance of the two smart phones in our video streaming test? We’ve run quite a few tests already, so here’s a summary from the archives:

We have a couple of comparable data points. The data suggests that not only does the iPhone 4S fix the death grip problem (from the graph above) as a result of the new dual-antenna design, but it also seems to have markedly better power efficiency in communications. That’s likely due to the new and improved modem chip. Comparing the baseline numbers with the “2 to 3 bar” data, the 4S requires only an additional 500 mWh of energy to stream the movie vs. watching it locally, compared to 1500 mWh on the iPhone 4. That’s a pretty significant improvement. Plus, we don’t see the runaway energy consumption problem at very low signal strengths as was the case in our original iPhone 4 tests. It’s a little harder to draw a firm conclusion here, since the test conditions don’t match up too well. I’m very interested in looking at iPhone 4S performance on the Star Trek test at the edge of its signal reception capability.

It’s time to go locate another good testing spot, and watch the movie a few more times. I’ll report back next time!

 



Death Grip

Posted by Cary Chin on August 9th, 2012

 

This post was originally published on the Low Power Engineering Community on 12/1/11: http://chipdesignmag.com/lpd/absolute-power


One of the areas I’ve been most interested in testing out on the new iPhone 4S is the effectiveness of the new dual antenna design, which is claimed to solve the now-infamous iPhone 4 “death grip” problem. Since we’ve already seen in previous experiments that battery life on these devices is strongly tied to communications and signal strength, any improvement in reception would have a direct impact on battery life whenever the radio is being used extensively. And with the addition of Siri, expanded notifications, and iCloud in iOS 5, it’s clear that the percentage of time our coveted smart phones will be spending connected (or connecting) to data networks is going to be increasing.

To get a picture of the effectiveness of the new antenna setup, I took data using the built-in Field Test mode (dial the string ‘*3001#12345#*’, without the beginning and ending apostrophes, in the Phone application and tap “call”), which conveniently displays signal strength in dBm (power ratio in dB referenced to 1 mW). I compared the signal strength on the iPhone 4S with the iPhone 4, and also recorded signal strength for each device while using the “death grip” that Steve Jobs warned us about. It’s not as glamorous as the “Vulcan death grip” that Mr. Spock administered to Captain Kirk in “The Enterprise Incident,” but everyone knows there’s no such thing as a “Vulcan death grip” anyway!
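Since all the readings below are in dBm, it’s worth remembering that the scale is logarithmic: every 10 dBm step is a factor of 10 in absolute power. A quick sketch of the conversion (my own helper, not part of Field Test mode):

```python
def dbm_to_mw(dbm: float) -> float:
    """Convert an absolute power level in dBm to milliwatts."""
    return 10 ** (dbm / 10)

print(dbm_to_mw(0))     # 0 dBm is exactly 1 mW, by definition
print(dbm_to_mw(-100))  # a weak -100 dBm signal is a mere 0.1 picowatt
```

That tiny absolute power at the bottom of the scale is why a few dB of attenuation from a hand on the antenna can make or break a connection.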

The data chart below shows average signal strength data in 9 different locations, from “terrible signal” (1 bar or less) to “strong signal” (5 bars). Note, as we’ve mentioned before, signal strength for these kinds of transmissions is notoriously flaky: readings in each location varied by 4 to 10 dB over a period of around two minutes. The “terrible signal” location was in my office in Mountain View. And just for grins, I’ll bet you can’t guess where, within a five-mile radius of Mountain View, you can reliably get a very “strong signal.” If you guessed a street address of “One Infinite Loop,” you’d be correct!

The top line (green) is the data for the new iPhone 4S. Its signal reception is generally a couple of dB better than that of the original iPhone 4 (orange). This is a measurable difference, but not practically significant, especially given the large variations in the measurements. However, there is a big difference when comparing the signal strength of the two devices while applying the “death grip” (light green and light orange). The iPhone 4S shows a drop of less than 6 dB on average using the “death grip”, but the original iPhone 4 shows a whopping drop of over 17 dB, and that’s conservative because the Field Test readings were “pegged” at -116 dBm. This would definitely result in dropped calls in any area without a very strong signal to begin with.
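To see why a 17 dB drop is so damaging, convert it to a power ratio (a sketch using the standard decibel formula; the function name is mine):

```python
def power_fraction(db_drop: float) -> float:
    """Fraction of received power remaining after a drop of db_drop dB."""
    return 10 ** (-db_drop / 10)

print(round(power_fraction(6), 3))   # 0.251 -> the 4S keeps about 25% of the power
print(round(power_fraction(17), 3))  # 0.02  -> the iPhone 4 keeps only about 2%
```

Holding the iPhone 4 the wrong way throws away roughly 98% of the received signal power, versus about 75% for the 4S.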

So, the iPhone 4S does indeed solve the “death grip” problem. Its dynamically switching dual-antenna design (one antenna on the bottom and another on top) is an effective solution that we’re sure to see in other smart phones as well. Just for the fun of it, I also applied a “double death grip” to the iPhone 4S, and saw about an additional 6 dB of signal strength loss as a result. But don’t try this one at home (or at least not in public); it’s legitimately in Apple’s category of “you’re not supposed to hold the phone that way!”

Next time; we’ll try to use the data we’ve gathered to predict and measure the improvement in energy efficiency of the iPhone 4S over the iPhone 4 as a result of the new antenna design. Stay tuned.



iPhone 4S Power Efficiency Improvements

Posted by Cary Chin on January 27th, 2012

This post was originally published on the Low Power Engineering Community on 11/3/11: http://chipdesignmag.com/lpd/absolute-power

I’ve had my iPhone 4S for a few weeks now, and have gotten to know it pretty well. Mine is the 64GB model, mostly because I want to play around with the new 1080p video recording capabilities, and don’t want to worry about running out of storage space all the time.

I’ve read many reports online about runaway battery consumption with the new 4S, but I haven’t noticed any huge change vs. my iPhone 4 in battery behavior. I still charge it every night and most of the time when I’m at my desk in my office.

From a hardware perspective, the iPhone 4S pretty much delivers as expected. The dual-core A5 processor zips along with plenty of headroom, smoothing out many of the rough spots in everyday usage that have crept in since the iOS 4.3 update. The camera upgrade boosts both still image and video (1080p) performance into the realm of most modern point-and-shoot cameras. The lack of any optical zoom is still a big limitation, although the wide availability of camera and photo enhancement apps pretty much makes up for it. Battery capacity has increased slightly, but not enough to make any practical difference.

From the power efficiency standpoint, by far the most interesting new hardware feature is the new dual-antenna design, which eliminates the infamous “death grip” effect of the iPhone 4 and improves cellular reception in general. Combined with the new communications chip that boosts data rates (for GSM networks), the new radio setup is worth looking at—especially since we’ve seen that the radio can contribute even more to the energy equation than the display.

The runaway star on the software side with the iPhone 4S and iOS 5.0 is Siri. Reminiscent of a cross between HAL and the famous Star Trek “Computer,” Siri listens, seems to think, and generally does a better-than-expected job of carrying out your wishes. While still clearly early in the development cycle, Siri feels to me like the beginning of a paradigm shift, where we may actually become just as productive without a keyboard as with one. I’ve dictated quite a few e-mail and text messages with Siri (sometimes while driving!), and accuracy is either very good or very bad. To me, that’s an indication of evolving and improving AI on the recognition side. And as is usual for Apple, the real genius of the Siri interface is its simplest part: you hold your phone up to your head to start talking to Siri, so to everyone else, it just looks like you’re answering a phone call! I wish I’d thought of that…

Running the 4S through my usual battery (pun intended) of power tests with the Star Trek movie produced the following results:

The results were surprising in several respects. First, the 4S was extremely efficient in “Max Battery” mode: it played through the entire movie consuming just 0.6 Wh of energy. That works out to more than 18 hours of continuous movie playing, although you can’t see much at the lowest brightness setting. It’s also almost a 40% improvement in energy efficiency vs. the iPhone 4! The new lower-power A5 chip is likely at the heart of this result; video decoding is an almost-routine task these days, and can probably be handled easily on a single core.
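The 18-hour figure checks out against the 4S battery capacity. Assuming the widely published 1432 mAh cell at a nominal 3.7 V, or about 5.3 Wh (my assumption, not a measured value), the arithmetic looks like this:

```python
movie_hours = (2 * 60 + 6 + 46 / 60) / 60  # Star Trek runtime, 2:06:46
movie_energy_wh = 0.6                      # measured draw in "Max Battery" mode
battery_wh = 1432 / 1000 * 3.7             # ~5.3 Wh; published cell spec (assumed)

playback_w = movie_energy_wh / movie_hours  # average power during playback
print(f"Continuous playback: {battery_wh / playback_w:.1f} hours")
```

At roughly 0.28 W average draw, a full charge lasts a bit over 18.5 hours of playback.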

Turning the display to full brightness (the “Max Brightness” test) shows the expected result: the energy cost of running the display at full brightness is about 0.6 Wh for the two-hour movie. The display on the 4S isn’t notably different from the one on the iPhone 4, so no surprise here.

Turning up the sound to maximum (“Max Movie” mode) reveals one other interesting change. On the iPhone 4, there was virtually no change in energy consumption between runs with the sound muted and with the sound at maximum. On the iPhone 4S there is a definitely measurable difference, both in energy consumption and in perceived loudness. In fact, with the sound at maximum, the tiny speakers in the 4S were loud enough to be annoying while I was trying to do other work; I had to resort to my “manual” muting method (a piece of tape over the speaker) to conduct my tests.

Well, I’m about out of space and time for this post. Next time I’ll describe the results of testing the new iPhone 4S dual-antenna and modem chip setup. These results are very interesting.

In late breaking news, Apple just announced a software update, iOS 5.0.1, which among other things “fixes bugs affecting battery life.” No big surprise, as the complexity of today’s smartphones rivals any other computing platform—and dwarfs the others when it comes to power management. In particular, the interaction between hardware and software to minimize energy consumption is very difficult to model and predict, but the ramifications on battery life are immediate and sometimes ugly.

And by the way, if you are interested in learning more about low power hardware design, my colleague Josefina Hobbs is hosting a new series of short videos covering everything from introductory concepts to selected advanced low power design topics. Check them out here.


Posted in Uncategorized | 2 Comments »