Quantify The Energy Consumption In Your Mobile Devices

March 30, 2011

Smart phones are among today’s hottest Internet devices. Their combination of processing power and wireless communications capability is redefining not only how we think of computing, but also how we think about energy consumption. After all, having a device in your pocket that can stream video or make a video call does you no good if the battery is dead.

Designers, then, need to take an external “product level” view of the smart-phone form factor to determine energy use and gain insight into further improvements in power efficiency. By setting up the infrastructure for analyzing power consumption at the end product level, you can immediately see some surprising results in terms of energy usage.

For example, consider one smart phone powered by a 1-GHz processor with 32 Gbytes of storage that squeezes 960 by 640 pixels into a 3.5-in. display. Its 26-g battery offers 5.25 Wh (watt-hours). To establish any reasonable metric, though, we need a representative test, reliable measurements, and repeatability.

Since we wanted to gather general energy usage information on a broad range of our device’s subsystems, we used video playback as our test and chose a full-length movie of around two hours for our test vehicle. Too short a clip would have limited the accuracy of our data since the exact energy usage would be harder to measure, and other small energy “leaks” could have clouded the data. A much longer movie might have fallen outside of our ability to gather data on a single charge without extrapolation. It also would have extended our experiment significantly.

To Boldly Go

Our test vehicle of choice, the latest Star Trek movie (two hours, six minutes, 46 seconds), is just about the right length—and mighty entertaining as well!

Measuring the battery usage of our device was simple. We started with the device's built-in battery monitor, which displays remaining capacity with 1% resolution. That's reasonable for our purposes, but it raises obvious concerns about the accuracy and linearity of the displayed capacity, as well as the repeatability of the measurements.
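To put that 1% resolution in perspective, a quick back-of-the-envelope sketch (in Python, using the 5.25-Wh capacity and 2:06:46 runtime quoted in this article) shows what a single meter step represents:

```python
# Back-of-the-envelope: what does one 1% step of the battery meter represent?
BATTERY_WH = 5.25                      # rated battery capacity from the spec above
MOVIE_HOURS = 2 + 6 / 60 + 46 / 3600   # 2:06:46 test movie runtime

step_mwh = BATTERY_WH * 1000 * 0.01    # energy per 1% meter step: ~52.5 mWh
step_mw = step_mwh / MOVIE_HOURS       # power resolution over one viewing: ~25 mW

print(f"1% of battery = {step_mwh:.1f} mWh")
print(f"Power resolution over the movie = {step_mw:.0f} mW")
```

In other words, any subsystem averaging less than roughly 25 mW over a full viewing can hide entirely within a single 1% step, a limit that matters later when we look at speaker power.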

To address the accuracy, linearity, and repeatability concerns, we started by running the test movie repeatedly on a fully charged device and recording the starting and ending battery percentages until the battery-warning message was displayed. To establish a solid baseline, all tests were run with as many external factors turned off as possible: display at minimum brightness, volume muted, and "airplane mode" enabled to power down all radios (Table 1).

All of the measurements were taken at least twice, with nearly identical results. Linearity across the range of battery capacity was very good, and only the initial measurement taken at full battery charge varied by more than 1% of battery usage across all of the tests.

In addition, the battery drained nearly linearly through the entire two hours and seven minutes of the movie except, again, during the first viewing after a full charge. In this case, the battery meter remained at 100% longer than expected, but resumed linear behavior after dropping to 99%. Linearity was maintained all the way down to the 20% battery level indicator warning, and even beyond.

Based on this data, we adopted a simple measurement strategy. First, we took all measurements between 99% and 20% of the displayed battery capacity. Second, we repeated measurements at least twice for each data point.

This established the baseline energy consumption numbers for our test device. It consumed 22% of its battery, or about 1.1 Wh, to view our test movie. This assumed, of course, that the battery was performing at factory specs and hadn’t degraded appreciably in the six months or so since the device was new. In addition, the baseline measurement represented the energy required to run the device and play the movie at minimum brightness, with no sound, and all other communications disabled—not exactly a test of everyday usage.
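The arithmetic behind that baseline is worth spelling out once, since every later figure follows the same pattern (a quick sketch using the capacity and runtime quoted above):

```python
BATTERY_WH = 5.25
MOVIE_HOURS = 2 + 6 / 60 + 46 / 3600           # 2:06:46

pct_used = 22                                  # meter drop over one full viewing
energy_wh = BATTERY_WH * pct_used / 100        # ~1.16 Wh, "about 1.1 Wh" above
avg_power_mw = energy_wh * 1000 / MOVIE_HOURS  # ~550 mW average baseline draw

print(f"Baseline: {energy_wh:.2f} Wh, ~{avg_power_mw:.0f} mW average")
```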

Results

With the metrics and methodology set up, and a reasonably accurate baseline determined, we could finally start measuring the energy consumption of some of the subsystems of interest.

The device’s high-resolution display is very good, but even so, watching a two-hour movie on a 3.5-in. screen is a pretty serious compromise. Still, watching the full movie gave us a good vehicle for measuring energy consumption, and juicing up the display was a very interesting test.

Rerunning our experiment with the display set at maximum brightness on top of our baseline configuration, we saw energy consumption jump to 1.7 Wh. That translates to around 300 mW of additional power to run the display at full brightness versus minimum brightness.
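That 300-mW figure is simply the energy difference between the two runs divided by the runtime, and the same differential calculation underlies every test that follows (a sketch using the numbers above):

```python
MOVIE_HOURS = 2 + 6 / 60 + 46 / 3600   # 2:06:46

def delta_power_mw(total_wh: float, baseline_wh: float) -> float:
    """Average extra power implied by the energy difference between two runs."""
    return (total_wh - baseline_wh) * 1000 / MOVIE_HOURS

# Display at maximum vs. minimum brightness: ~284 mW, i.e. roughly 300 mW
print(f"{delta_power_mw(1.7, 1.1):.0f} mW")
```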

We expected sound output to be a much smaller factor than the display. Rerunning our Star Trek test with speaker output at maximum as well, we found the additional energy consumed during the movie was even smaller than expected: it fell below the resolution of our setup!

This was a little surprising. Certainly, the impact of the movie was enhanced greatly by the addition of sound. But based on the data, the energy cost of the added sound was near zero (or at least smaller than the 1% of battery capacity level of accuracy in our current setup). Additional experiments showed that the energy cost of sound through a pair of Shure E5c earphones was also negligible.

Communications Are Key

Next we moved into the main realm of mobile Internet devices: wireless communications. Building on our sound test setup, the next test involved viewing the movie with sound routed through a pair of Bluetooth headphones.

We used a pair of Altec Lansing Backbeat 903 stereo headphones and were impressed with the movie experience, as well as the energy dissipation. Listening via Bluetooth added only about 100 mWh of additional energy consumption, which translates into total Bluetooth power of around 50 mW.

Finally, we were ready to take on the elephant in the room. Instead of viewing Star Trek stored locally on our phone, we streamed the movie from the Internet to get an idea of the energy cost of broadband communications.

We started by streaming via Netflix over a Wi-Fi connection. The energy cost was around 600 mWh, or 300 mW. The movie was still quite watchable, though not nearly as eye-popping as the locally stored version. Just for the record, the streamed version of the movie was around 350 Mbytes, compared to 1.96 Gbytes for the local version.
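For reference, those file sizes imply very different average bitrates (rough arithmetic from the runtime and sizes quoted above), which goes a long way toward explaining the visual difference:

```python
MOVIE_SECONDS = 2 * 3600 + 6 * 60 + 46   # 2:06:46

for label, mbytes in (("streamed version", 350), ("local version", 1960)):
    kbps = mbytes * 8 * 1000 / MOVIE_SECONDS   # megabytes -> kilobits per second
    print(f"{label}: ~{kbps:.0f} kbps")        # ~370 kbps vs. ~2.1 Mbps
```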

Our last test is the one we’ve all been waiting for—what’s the energy cost of watching Star Trek while you’re really mobile? For this test, we streamed the movie via Netflix, using the 3G data connection to the Internet. And here’s where things got really interesting.

With “five bars” of signal strength as indicated on the device (signal strength is a notoriously flaky measure), we saw total energy consumption of 2.8 Wh to stream and view our two-hour movie. That’s just 1.1 Wh over our baseline of 1.7 Wh (with display and sound at maximum).

So in the best (measured) case, streaming two hours of video (about 350 Mbytes) cost just 1100 mWh, or 550 mW. Running the same experiment with “three bars” of signal strength consumed a total of 3.4 Wh. At that point, the cost of the 3G data connection equaled the total energy consumed by the rest of the device, including the display and all processing.

In our “worst case” test, we streamed the movie over a 3G connection with only “one bar” of service. In this case, total energy consumption was 5.3 Wh, which is greater than the 5.25-Wh battery capacity of the device. That’s right. This data point had to be extrapolated. With very low signal strength, our fully charged smart phone would be fully drained before finishing Star Trek!

One additional note—signal strength itself is difficult to measure. As displayed on the device, it is a time average, and whether correlating with signal “bars” or the numeric dBm displayed in field test mode, the numbers and bars can change drastically based on local positioning, device orientation, or even the infamous “death grip.” Our experimental data for these “signal strength” tests were much less reliable and repeatable than in the earlier baseline tests, but the trend was clear and consistent.

Summary Of Data

We can now see a pretty clear picture of energy consumption in our test device. The display, as expected, is one of the main energy consumers. However, it is surprising that sound was an insignificant contributor to the equation. Communications turned out to be the biggest energy consumer of all, but the really big surprise here was the magnitude of the energy usage increase with decreasing signal strength. In the worst case, more than two-thirds of the total energy consumed is used just for communication (Table 2).
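Recomputing the per-subsystem differentials from the energy figures quoted in the text yields a picture like the one in Table 2. Note that the Bluetooth and Wi-Fi totals below are back-computed by adding the quoted deltas to the 1.7-Wh display-and-sound baseline, and the power figures land slightly below the text's generously rounded values:

```python
MOVIE_HOURS = 2 + 6 / 60 + 46 / 3600   # 2:06:46

# (configuration, total energy in Wh, baseline it was measured against in Wh)
tests = (
    ("Display, min to max brightness",  1.7, 1.1),
    ("Bluetooth headphones",            1.8, 1.7),   # quoted delta: ~100 mWh
    ("Wi-Fi streaming",                 2.3, 1.7),   # quoted delta: ~600 mWh
    ("3G streaming, five bars",         2.8, 1.7),
    ("3G streaming, three bars",        3.4, 1.7),
    ("3G streaming, one bar (extrap.)", 5.3, 1.7),
)

for name, total_wh, base_wh in tests:
    delta_wh = total_wh - base_wh
    print(f"{name}: {delta_wh * 1000:.0f} mWh, "
          f"~{delta_wh * 1000 / MOVIE_HOURS:.0f} mW, "
          f"{delta_wh / total_wh:.0%} of that run's total")
```

The last row reproduces the worst-case claim: 3.6 Wh of the 5.3-Wh total, or about 68%, goes to communication alone.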

So what have we learned in our attempt to take a high-level look at energy consumption and power efficiency?

First, by carefully setting up the infrastructure and making differential measurements, it’s possible to get very specific and accurate results about the energy consumption of key subsystems of these mobile Internet devices. Even the impact of the notoriously difficult-to-measure “signal strength” bars can be clearly seen in the data.

In addition, even this initial data suggests a couple of product-level tweaks to maximize the overall product’s impact, especially for watching movies. On the hardware side, improving sound quality might be a good investment. The current speakers are minuscule and produce adequate, but certainly not engaging, sound. Investigating new technologies to generate immersive sound with physically small drivers would be worthwhile, and there’s plenty of room in the power budget for improvement here.

But perhaps the biggest power “loophole” centers on wireless communications and power efficiency, which could even be addressed completely at the software level. A software solution that intelligently manages wireless data transmission depending on signal strength, location, and history could be much more efficient, especially in the worst case of “low signal strength.”

By limiting retries and including local coverage data in the computation, there’s a big opportunity for improved device performance in low-coverage areas. It simply doesn’t make sense that the only answer to your phone battery being completely drained as you drive through an area of low signal coverage is to turn it off.
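As a purely illustrative sketch of such a policy (the threshold, names, and structure here are invented for illustration, not drawn from any real radio stack), a transmit scheduler might simply defer non-urgent traffic while the averaged signal is weak:

```python
from dataclasses import dataclass

@dataclass
class LinkState:
    avg_rssi_dbm: float   # rolling-average signal strength, smoothing out
                          # orientation changes and the infamous "death grip"

def should_transmit(link: LinkState, urgent: bool,
                    weak_threshold_dbm: float = -100.0) -> bool:
    """Defer non-urgent traffic while coverage is poor.

    Weak signal forces higher transmit power and more retries, so batching
    deferred data until coverage improves cuts the energy cost per byte.
    The threshold here is illustrative only.
    """
    if urgent:
        return True                                # user-visible traffic always goes
    return link.avg_rssi_dbm > weak_threshold_dbm  # otherwise wait for coverage

# A background sync attempted in a "one bar" area gets held for later:
print(should_transmit(LinkState(avg_rssi_dbm=-105.0), urgent=False))  # False
```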

