Ultra HD a Cappella

The 2015 International CES is over, but the melody lingers on. The big news in video, of course, is that Ultra HD is coming to us like a great singer who is pushed out on stage knowing the tune but not the lyrics. The result might be a stirring vocalization of “Over the Rainbow,” but the only words the singer can think of are the lyrics to “Does Your Chewing Gum Lose Its Flavor on the Bedpost Overnight.”

In short, most (but not all) of the sets launched at the show still feature only one of the important features of Ultra HD: 4K resolution. As video guru Joe Kane has put it, today’s 4K sets are simply 2K sets with four times as many pixels. That may not be doing them full justice, but the other UHD goodies, many of them even more important than 4K, include a wider color gamut (DCI, otherwise known as P3, or even Rec.2020 instead of the current Rec.709 HD standard), a deeper color bit depth (10 bits per color instead of the current 8), and high dynamic range (HDR). There are also advocates for reducing the color compression, or chroma subsampling, from the current 4:2:0 to 4:2:2 or even uncompressed 4:4:4, though retention of 4:2:0 for UHD now appears to be the dominant position.
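For the curious, the bit-depth and subsampling figures above reduce to simple arithmetic. Here’s a minimal sketch in Python, purely illustrative; the sample fractions assume the standard reading of the 4:4:4, 4:2:2, and 4:2:0 notation:

```python
# Back-of-the-envelope arithmetic for the Ultra HD color specs above.

def levels_per_channel(bits):
    """Distinct levels one color channel can encode at a given bit depth."""
    return 2 ** bits

def total_colors(bits):
    """Distinct colors across three channels (R, G, B or Y, Cb, Cr)."""
    return levels_per_channel(bits) ** 3

print(levels_per_channel(8), total_colors(8))    # 256 levels, ~16.8 million colors
print(levels_per_channel(10), total_colors(10))  # 1024 levels, ~1.07 billion colors

# Chroma subsampling: fraction of total samples kept, relative to 4:4:4.
# 4:2:2 halves the chroma resolution horizontally; 4:2:0 halves it in
# both directions. Luma (the first component) is always kept at full rate.
fraction_kept = {
    "4:4:4": (1 + 1 + 1) / 3,        # 1.0  (uncompressed color)
    "4:2:2": (1 + 0.5 + 0.5) / 3,    # ~0.67
    "4:2:0": (1 + 0.25 + 0.25) / 3,  # 0.5  (the current standard)
}
print(fraction_kept["4:2:0"])        # 0.5 -- half the samples of 4:4:4
```

In other words, the jump from 8 to 10 bits per color multiplies the palette by 64, while today’s 4:2:0 throws away half of the color samples a 4:4:4 signal would carry.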

Most of today’s sets crow about their wide color gamuts, but that usually means only that you can select a wider than standard gamut in the menu. If you do that with virtually all of today’s 1080p program material, which was produced for the Rec.709 HD color gamut, the colors you get will certainly pop, but they won’t be accurate.

But when we finally get a reliable source of quality Ultra HD material on Blu-ray, and compatible players, the only way to get the full benefits these advancements offer will be an Ultra HD set that can accept this wider gamut and deeper bit depth. Most importantly, the set must not only accept such a source at its inputs but also pass it all the way to the screen, modified only by the set’s color controls (including calibration). At no point along the way should the wider gamut and higher bit depth be reduced inside the set (perhaps to avoid having to redesign some of the internal circuitry, such as those color processing controls) and then upconverted again before reaching the pixels. That round trip would quash any benefits derived from the source’s wider color gamut and higher bit depth.

The bottom line here is that any 4K set you buy today will provide 4K resolution from 4K sources and upconvert Full HD 2K material to a 4K pixel count. (Upconversion to 4K—technically, we’re using 4K here as a stand-in for 3840 x 2160—does not produce true 4K resolution, but rather simply provides a denser pixel structure that fills the 4K screen.) Such a set is unlikely, however, to make full use of the Ultra HD source material we expect to see in the next year or two (though the SUHD sets we saw from Samsung at CES do appear to come closest to this goal).
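The pixel arithmetic behind that upconversion is worth seeing spelled out; a quick check in Python (illustrative only):

```python
# "4K" here, as in the article, means the consumer UHD raster, not DCI 4K.
full_hd = 1920 * 1080   # 2,073,600 pixels
uhd_4k = 3840 * 2160    # 8,294,400 pixels

# Exactly four UHD pixels for every Full HD pixel: in the simplest
# upconversion, each HD pixel maps onto a 2x2 block of identical UHD
# pixels, which fills the screen but adds no new detail.
print(uhd_4k // full_hd)  # 4
```

That factor of four is where the “2K sets with four times as many pixels” quip comes from.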

The same thing applies in spades to high dynamic range. This may be the elephant in the Ultra HD room. While the resolution and color improvements of Ultra HD are real, it’s HDR that will generate the biggest oohs and ahhs in the house. HDR, put simply, makes the image pop by brightening the portions of the image that should be brighter (reflections off shiny objects, for example) while leaving the rest of the picture alone rather than brightening everything. At present, HDR can only be done well by an LCD set—whether LED- or quantum-dot-backlit—with full-array local dimming. (OLED may have the chops to do it as well. But so far it’s exclusively a flat-screen technology; projectors need not apply, since they can’t currently replicate the local dimming required.)

Claims have also been made that HDR improves black levels as well as punching up the highlights. But in my opinion that’s crediting HDR for an improvement that actually stems from the full-array local dimming (or OLED) needed to make HDR work in the first place.

There are several HDR formats jockeying for position (Dolby Vision is, however, the grandfather of HDR). Without a standard, it’s possible that not all Ultra HD sets with HDR will provide HDR from all HDR-encoded sources. HDR requires that the source material be properly processed (or graded, to use the technical term). And if that processing is not the same as the processing for which the set has been designed, the metadata in the source might trigger the set to revert to a non-HDR state. This could be an issue for consumer adoption of HDR if, say, Disney adopts Dolby Vision for its HDR Ultra HD Blu-ray releases. If TCL adopts Dolby Vision while Sony and Samsung adopt some other HDR format, will the Disney discs then produce HDR only on a TCL set and not on a Sony or a Samsung? Some type of format conversion might be needed to avoid this bottleneck.

In addition, some of today’s Ultra HD 4K sets can play back 4K material at up to 60 frames per second, while others are limited to 30. The latter is not currently a particularly serious issue, as most movies are shot at 24fps. But if we move to higher frame rates in the future, or if sports broadcasting transitions to 60fps (unlikely any time soon, given the bandwidths involved), this might be a concern.
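To see why bandwidth is the sticking point for 60fps broadcasts, here’s a rough uncompressed-rate sketch in Python. It assumes 10-bit 4:2:0 video (an average of 1.5 samples per pixel) and ignores the heavy compression any real broadcast would apply, so treat the numbers as ballpark only:

```python
def raw_gigabits_per_second(width, height, bits_per_sample, fps,
                            samples_per_pixel=1.5):  # 1.5 = 4:2:0 average
    """Uncompressed video data rate in gigabits per second."""
    return width * height * samples_per_pixel * bits_per_sample * fps / 1e9

# UHD at 30fps vs. 60fps, before any compression:
print(round(raw_gigabits_per_second(3840, 2160, 10, 30), 2))  # 3.73 Gbps
print(round(raw_gigabits_per_second(3840, 2160, 10, 60), 2))  # 7.46 Gbps
```

Doubling the frame rate doubles the raw data, which is why 4K60 sports remains a tall order for broadcasters even with modern compression.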

The bottom line here is that, in general, a 4K set you can buy today might well be a fine set that will satisfy you for years. But it’s also likely that it won’t do everything that Ultra HD promises down the road. A five-year-old computer today is a dinosaur. While folks once kept their TVs for 10 or even 20 years, those days are long past if you want the best that home video can offer.

rilid:

Great article! I'm a long-time subscriber to S&V/Home Theater and have come to value this type of reporting. I planned to purchase my first Ultra HDTV this year but will now wait until someone offers these features (plus passive 3D) at a reasonable price. Hope others will do the same and send a message that putting half-developed technology on the market (3D again?) won't fly.