Watt's Up, Doc?

In response to my review of the new 60-inch Sharp Elite PRO-60X5FD LED-backlit LCD TV, a question arose in the online comments as to the power it consumes relative to the 60-inch Pioneer Elite PRO-141FD Kuro plasma I compared it to. The answer surprised me.

To test this, I used a Watts Up power meter (shown in the photo above) and six stationary test patterns on a special Samsung test disc (not commercially available). These patterns consisted of alternating full-black and 100%-white bars, two different forms of color bars, and full-field black, gray, and white. The results here are the averages obtained from all these patterns. This should be reasonably representative of normal use, whatever that may be.
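For the curious, here is a minimal Python sketch of that averaging step, with placeholder readings standing in for the actual Watts Up numbers (which aren't published here pattern by pattern):

    # Hypothetical watt-meter readings, one per stationary pattern:
    # black/white bars, two types of color bars, full black, gray, and white.
    readings = [60, 55, 56, 29, 52, 72]  # placeholders, not measured values
    average_watts = sum(readings) / len(readings)
    print(round(average_watts))  # the single per-set figure reported below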

I also performed the same measurements on two previously reviewed sets that are still at our main video studio (a different location from where the Sharp and Pioneer are, which is why neither was included in the Sharp review's side-by-side comparisons): the Panasonic TC-P55VT30 plasma and the Sony XBR-55HX929 LED-backlit LCD. At 55 inches, both of these sets are slightly smaller than the Sharp and Pioneer.

The Sharp was first measured in its main review configuration—local dimming on and Intelligent Variable Contrast off. The Sony also had its local dimming engaged. All of the sets were measured as calibrated to provide their best performance, with peak white brightness levels between 25 and 30 foot-lamberts. All the results here are to the nearest watt.

LED-backlit LCDs:

  • Sharp Elite PRO-60X5FD: 54 watts
  • Sony XBR-55HX929: 57 watts

Plasmas:

  • Pioneer Elite PRO-141FD: 280 watts
  • Panasonic TC-P55VT30: 237 watts

With a full black-field pattern (which is included in the above averages), the readings ranged from a high of 59W on the Pioneer to a low of 29W with the Sharp. The Sharp's average power draw increased with local dimming turned off, but only to 94W, while its black field reading, at 31W, barely budged.

The power consumption of the LCD sets tested here may be fairly typical of LCDs that employ local dimming. But that won't necessarily be the case with all LCDs, particularly older or less expensive sets that still use fixed-brightness fluorescent backlighting. In any event, it was an eye-opening exercise in just how much more power plasmas draw compared with LED-backlit LCDs.

COMMENTS
maj0crk

Thanks Tom. I believe I'm safe in saying plasmas draw more energy because they HAVE to, not because of any conscious decision made by a manufacturer.
That said, I had decided to buy the Panasonic VT. The fact it draws 4X the wattage the Sony XBR does gives me pause. Do I go green & pay the extra for the XBR or stick with the Panny? That's a decision only I can make, but I thank you nonetheless for bringing this fact to my attention.

MatthewWeflen

If cost is the only concern, it's actually pretty easy to figure out. With a small degree of local variation, power in the US generally costs about a dollar per watt, if you were to use that watt 24 hours a day, 365 days a year. So, for instance, the Panny VT listed above would cost $237 to run all year, while the HX929 would cost $57.

But of course, very few people watch TV 24/7. So you need to divide those numbers by your anticipated viewing. Let's say it's 6 hours a day (because they are beautiful TVs, of course :-). That means you'd pay about $60 annually to run the Panny and $15 to run the Sony. So over, say, a ten-year set lifespan, you'd save $450.
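For anyone who wants to plug in their own numbers, here's a minimal Python sketch of that rule-of-thumb math, using the 6-hours-a-day viewing and ten-year lifespan assumed above:

    # Rule of thumb: roughly $1 per watt per year of continuous (24/7) use,
    # scaled by actual daily viewing hours.
    def annual_cost_rule_of_thumb(watts, hours_per_day, dollars_per_watt_year=1.0):
        return watts * dollars_per_watt_year * hours_per_day / 24.0

    panny = annual_cost_rule_of_thumb(237, 6)  # about $59 per year
    sony = annual_cost_rule_of_thumb(57, 6)    # about $14 per year
    print(round((panny - sony) * 10))          # roughly $450 over ten years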

maj0crk

Thanks for doing the math for me, Matthew. My local Blockbuster has both the 55" Panny & Sony on sale for $1999 & $2695 respectively. Doing the math, I'm still better off w/the Panny cost-wise (factoring in energy costs). Johnny's remarks concerning the amount of heat coming off any plasma are most certainly true. Bet my viewing room would be warm in winter, decreasing the need to heat the place.

sl1ac

Power is not sold in terms of dollars per watt, and if the Panny VT uses 237 watts the cost is NOT $237 for the year. You will still have the same 237:57 ratio between a plasma and LCD, but your calculation on cost is wrong. Power is typically sold per kilowatt-hour (kWh). So if the TV uses 237 watts on average and the TV was on 24/7 for 365 days, then that is 2,076,120 watt-hours (2,076.12 kWh). I'm in the Midwest, so power on average costs $0.09 per kWh (different summer and winter rates). Therefore the cost of electricity is $186.85 if the TV were on all day for a year. For an average of 6 hours of use per day, the Panny VT would cost you $46.71 for the year and the Sony would be $11.23 for the year.
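The same calculation as a minimal Python sketch, using the $0.09-per-kWh rate quoted above (substitute your own local rate):

    # Annual cost = (watts / 1000) kW x hours on per year x rate per kWh.
    def annual_cost(watts, hours_per_day, rate_per_kwh):
        kwh_per_year = watts / 1000.0 * hours_per_day * 365
        return kwh_per_year * rate_per_kwh

    print(round(annual_cost(237, 24, 0.09), 2))  # 186.85, plasma on 24/7
    print(round(annual_cost(237, 6, 0.09), 2))   # 46.71, plasma at 6 hours/day
    print(round(annual_cost(57, 6, 0.09), 2))    # 11.23, LCD at 6 hours/day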

johnnyd

To Matthew: Plasma TVs are like powerful space heaters, so you must also consider the amount of heat that home air conditioning must compensate for. Even in winter, a theater room can get very warm while using plasma TVs, so you need to use electricity to compensate with fans. Edge-lit LED TVs run very cool in comparison. Also, if everyone cuts their home use of electricity, then new and very expensive power plants need not be built, which helps to keep the cost per kilowatt-hour from rising.

MatthewWeflen

Indeed, I noticed that. I used to have a lamp-based LCoS RPTV, which definitely heated things up during the summer. Now that we have an LED-based television, heat output is basically negligible.

ByronServies

Just after I bought my 52-inch Sharp Aquos LCD, I hooked it up to a similar watt meter and discovered that the different settings for video mode (dynamic, movie, etc.) could change the amount of power used by the TV by hundreds of watts.

In full-on Best Buy mode, it chows down considerably more than reported here. I'll see if I can either find my notes or re-run my test.

johnnyd

I'm guessing HT used a properly calibrated set for their measurements. Those settings would closely resemble the settings most people would settle on for daily viewing. The store mode is known as TORCH mode and I can't imagine anyone would ever use it.

MatthewWeflen

I am aware that power is sold by the kWh. The "$1 per watt-year" figure was just a rule of thumb I picked up in my internet reading. Of course there may be regional differences.

By your calculations, the savings would be $350 instead of $450 over a ten-year span. So clearly $1 per "watt-year" isn't all that far off. Nine cents per kWh is a regional price in the Midwest (where I also live). Power in CA costs more, for instance.
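One way to see why the rule of thumb lands close: a watt running 24/7 for a year is 8.76 kWh, so "$1 per watt-year" implicitly assumes a rate of about $0.114 per kWh. Here's a quick sketch comparing that implied rate with the $0.09 figure, using the wattage gap and viewing hours from the comments above:

    # 1 W x 24 h x 365 days = 8.76 kWh, so "$1 per watt-year" implies ~$0.114/kWh.
    implied_rate = 1.0 / 8.76                        # dollars per kWh
    watt_gap = 237 - 57                              # plasma vs. LCD average draw
    kwh_gap_per_year = watt_gap / 1000.0 * 6 * 365   # at 6 hours per day
    print(round(kwh_gap_per_year * implied_rate * 10))  # ~450 over ten years
    print(round(kwh_gap_per_year * 0.09 * 10))          # ~355 over ten years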

dgelman

Yes, plasma displays draw more than LED-backlit displays. OLED, I assume, will draw less in the future. However, the display is just a single component in a home theater.

There is a serious difference in power utilization between Class A amplifiers and Class D units. I'm sure it's more than fivefold as well.

I'm pretty sure that ribbon-based speakers draw a considerable amount of power compared with their horn counterparts.

There are HD DVRs that can draw more than plasmas, and they usually run 24/7.

I'm pretty sure that all makers are interested in improving the performance of their products. This now includes power utilization. It's a good thing.

Hopefully the performance levels of LED-backlit LCDs will match plasmas one day and the choice will be clear-cut. This will most likely keep engineers employed.

sl1ac

I figured if someone wanted to know the cost of running electronics, it would be far more informative to actually calculate the cost rather than give out a rule of thumb from "internet reading." Keep in mind that someone not familiar with power rates could deduce from your first explanation that the cost to run the television would be $1 per watt of use, which is not the case.
