Stranger HDR Things

High Dynamic Range, or HDR, is perhaps the most exciting of the trio of improvements that Ultra HD brings to the table, the others being a wider color gamut and higher resolution. The images from a flat screen set pop off the screen in a way that the dimensional but often too-dim 3D never could. And you don’t need special glasses to see it.

A flat screen set capable of peak brightness levels of over 1000 nits (just under 300 foot-lamberts) can make the most of an HDR source. HDR program material is mastered for a peak output of either 1000 nits or 4000 nits, with most of that luminance reserved for bright highlights.
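
For those who like to check the arithmetic, the conversion behind that parenthetical is simple: one foot-lambert is about 3.426 nits, so 1000 nits comes to roughly 292 foot-lamberts and 4000 nits to about 1167. Here’s a quick sketch in Python (the function name is just for illustration):

    # The nit / foot-lambert relationship used above.
    # 1 foot-lambert = (1/pi) candela per square foot, about 3.4263 nits.
    NITS_PER_FOOTLAMBERT = 3.4263

    def nits_to_footlamberts(nits: float) -> float:
        """Convert luminance in nits (cd/m^2) to foot-lamberts."""
        return nits / NITS_PER_FOOTLAMBERT

    for peak in (1000, 4000):
        print(f"{peak} nits is about {nits_to_footlamberts(peak):.0f} fL")
    # 1000 nits -> about 292 fL (the "just under 300 foot-lamberts" above)
    # 4000 nits -> about 1167 fL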

But not all displays can hit those levels. We know of no consumer products capable of anything beyond 2000 nits, and many can’t reach even 1000; OLEDs, for example, generally top out at under 700, and cheaper 4K LCD/LED designs can’t even get that far.


If an HDR-capable set with, say, 500 nits of available peak luminance sees a scene that peaks at 1000 nits, what does it do with it? Absent any other processing it would simply clip all of the information above 500 nits. But to preserve at least the sense of that information, the set “tone maps” it. How is this done? That’s where the fun begins, because the actual tone mapping process is up to the TV maker. There’s no tone mapping standard. We can tell you that tone mapping is almost universal, but not how it’s done (which in any case might require a Mensa membership to fully comprehend). If all sets were limited to a fixed peak brightness of, say, 1000 nits, and all sources limited to the same 1000 nits, tone mapping would never be needed. But with the wide range of product prices and performance on the market, that was never possible. So tone mapping is with us, and likely always will be, until the peak brightness of our displays matches the peak brightness that can be mastered onto the source.
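
To make the idea concrete, here is a minimal, purely hypothetical sketch in Python of one way such a curve could behave: the signal passes through untouched up to a “knee,” and everything above the knee is squeezed into whatever headroom the display has left. The function name, the knee point, and the linear roll-off are all invented for illustration; as noted, every TV maker does this differently.

    # A hypothetical tone curve, NOT any manufacturer's actual algorithm.
    def tone_map(scene_nits: float, display_peak: float,
                 source_peak: float = 1000.0, knee: float = 0.75) -> float:
        """Pass the signal through below the knee, compress it above.

        Levels up to knee * display_peak are left untouched; everything
        from there up to source_peak is squeezed linearly into the
        remaining headroom, so the brightest source level just reaches
        display_peak instead of being clipped.
        """
        knee_nits = knee * display_peak
        if scene_nits <= knee_nits:
            return scene_nits
        span_in = source_peak - knee_nits     # source range to compress
        span_out = display_peak - knee_nits   # headroom left on the display
        return knee_nits + (scene_nits - knee_nits) * span_out / span_in

    # A 500-nit set handling a scene mastered to peak at 1000 nits:
    for level in (300, 500, 800, 1000):
        print(f"{level} nits in -> {tone_map(level, display_peak=500):.0f} nits out")
    # 300 -> 300, 500 -> 400, 800 -> 460, 1000 -> 500

Real sets use smoother curves, and many analyze the content scene by scene, but the basic trade-off is the same: leave most of the picture alone and squeeze the highlights into whatever headroom remains.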

But the fun doesn’t stop there. HDR in projectors adds another whole layer of complexity to the mix. In fact, HDR was developed with consumer televisions in mind, not projectors. A home theater projector is unlikely to offer a peak brightness output of more than 150 nits, though the Epson 4000 I recently reviewed topped out at a “searing” 174 nits in Bright Cinema mode. But even that’s a small fraction of a good flat screen set’s peak output. That means that the tone mapping on a projector needs to begin at a far lower source level and be far more aggressive. This makes a projector far trickier to set up and optimize in HDR than a flat screen set.
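
Running the same hypothetical tone_map sketch from above makes the point: with a 150-nit projector the roll-off has to start far lower, and far more of the source range has to be squeezed into far less headroom, than on even a modest 500-nit flat panel. The numbers are illustrative only.

    # Same hypothetical curve: a 150-nit projector vs. a 500-nit flat panel.
    for level in (100, 150, 500, 1000):
        tv = tone_map(level, display_peak=500)
        pj = tone_map(level, display_peak=150)
        print(f"{level:>4} nits in -> TV {tv:6.1f}, projector {pj:6.1f}")
    # The projector starts compressing at about 112 nits, and the top
    # ~888 nits of the source must fit into roughly 38 nits of headroom.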

Ideally we’d have special “projection only” versions of HDR releases, mastered at a peak level of 150-200 nits. But that’s not happening. And there are other variables as well with projectors, including the size and gain of the screen. The only way to standardize for that would be to specify not only the capabilities of the projector but also everything else in the system. That’s unlikely to happen either. The only place you’ll find that kind of standardization is in one of the Dolby Cinema theaters, with about 100 of them scattered in multiplexes around the country. Dolby Cinema is a closed system; each piece of the puzzle, from content mastering to the screen, is fixed and known. No tone mapping needed there. But in our crazy world of ad hoc home theater, there are few such standards. HDR from a projector, for now, is the Forrest Gump’s box of chocolates of the home theater world. You never know exactly what you’re going to get…but it usually tastes good.
