Home on the High Dynamic Range

High dynamic range is still the hot tamale in today's video world. While it's been around for a few years now, nothing new in video has yet made HDR yesterday's news. Any flat-panel TV, apart from the lowest of the low-budget models, can do some form of HDR. But since peak brightness is a key component of effective HDR, a budget set can be seriously restricted in how well it does the job.

The most serious HDR limitations occur with a projector. It's no secret that projectors aren't nearly as bright as any decent flat-panel TV, so it takes a lot of creativity to design one that can accept and display high dynamic range images. Any display with limited brightness (cue your favorite projector's theme song here) must deal with HDR source material that in many cases contains peak white levels much brighter than what the projector can properly display.

One of two things then happens: the projector either clips off the bright elements it can't show (often with ugly visible results), or it uses a process called tone mapping to fold down the information contained in that excess brightness into a form the display can handle. Though "fold down" isn't a technically precise description of what's happening, it suggests how tone mapping makes it possible to retain some of the HDR benefits in the original content. Put less elegantly, you're trying to put a square peg into a round hole. For the square peg to fit, something's got to give, but hopefully at least some semblance of a peg remains.
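
To make the square-peg analogy a little more concrete, here is a minimal Python sketch contrasting a hard clip with a simple highlight roll-off. The numbers, knee point, and curve shape are purely illustrative assumptions, not any manufacturer's actual tone-mapping algorithm.

```python
# Illustrative only: hard clipping vs. a simple "fold down" (roll-off)
# of highlights that exceed a display's peak brightness. Values are in
# nits; the curve shape is an assumption, not a real product's algorithm.

def hard_clip(pixel_nits, display_peak):
    """Throw away everything above the display's peak."""
    return min(pixel_nits, display_peak)

def soft_rolloff(pixel_nits, display_peak, knee=0.75):
    """Pass lower levels through untouched, then compress everything
    above the knee so highlight detail up to the source peak survives
    in squeezed form rather than being discarded."""
    knee_nits = knee * display_peak
    if pixel_nits <= knee_nits:
        return pixel_nits
    headroom = display_peak - knee_nits
    excess = pixel_nits - knee_nits
    # Asymptotically approaches display_peak as the source gets brighter.
    return knee_nits + headroom * excess / (excess + headroom)

for nits in (100, 500, 1000, 4000):
    print(nits, hard_clip(nits, 150), round(soft_rolloff(nits, 150), 1))
```

With a hypothetical 150-nit display, the hard clip flattens every highlight above 150 nits to the same value, while the roll-off keeps the 500-, 1,000-, and 4,000-nit highlights distinguishable, just compressed into a narrow band near the top.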

Tone Mapping and Metadata
The video display uses "metadata" included in the source to perform tone mapping. Two key elements are typically included in that metadata: MaxCLL (Maximum Content Light Level) and MaxFALL (Maximum Frame Average Light Level). These values are fixed throughout the entire program and cue the display as to what it has to do when tone mapping. If a display can produce 1,000 nits of brightness and the source never goes above 1,000 nits, no tone mapping is needed. But virtually all HDR-capable displays need to tone map at least part of most HDR source material.
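
As a rough sketch of the decision that static metadata drives, assuming a hypothetical 1,000-nit display like the one in the example above (the logic is my own simplification, not any specific product's firmware):

```python
# Hypothetical sketch: deciding from static HDR10 metadata whether tone
# mapping is needed at all. MaxCLL is the brightest pixel anywhere in the
# program; the 1,000-nit display peak matches the example in the text.

DISPLAY_PEAK_NITS = 1000

def needs_tone_mapping(max_cll, display_peak=DISPLAY_PEAK_NITS):
    # If nothing in the program ever exceeds the display's own peak,
    # the content can be shown as-is.
    return max_cll > display_peak

print(needs_tone_mapping(max_cll=800))   # False: within this display's range
print(needs_tone_mapping(max_cll=4000))  # True: highlights must be remapped
```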

Tone mapping will be either static (the same across the full running time of the program) or dynamic (varying constantly with the program). Dynamic tone mapping is the most sophisticated form, and it's triggered by Dolby Vision and HDR10+ program material, which includes dynamic metadata that compatible sets can read and respond to. (If a TV or projector isn't compatible with Dolby Vision or HDR10+, it simply strips the signal down to its HDR10 base layer and handles the program the same way it would standard HDR10.)
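
The handling described above might be sketched roughly like this; the format names are real, but the flow is a hypothetical simplification of the behavior described in the text:

```python
# Hypothetical decision flow for a display receiving HDR program material.
# The format names are real; the function and its flow are a simplification,
# not any product's actual firmware logic.

def tone_mapping_mode(source_format, display_supports):
    if source_format in ("Dolby Vision", "HDR10+"):
        if source_format in display_supports:
            # Dynamic metadata is read and acted on scene by scene.
            return "dynamic (driven by the source's metadata)"
        # Incompatible display: fall back to the HDR10 base layer.
        source_format = "HDR10"
    # Plain HDR10 carries only static metadata.
    return "static (unless the display generates its own dynamic data)"

print(tone_mapping_mode("HDR10+", {"HDR10", "Dolby Vision"}))        # falls back to static
print(tone_mapping_mode("Dolby Vision", {"HDR10", "Dolby Vision"}))  # dynamic
```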

Ordinary HDR10 (the most common form of HDR) can be tone mapped dynamically only if the display includes a special form of video processing, one that takes the static metadata in the source and, in effect, converts it to a dynamic form that lets the TV or projector tone map on a frame-by-frame or scene-by-scene basis.
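
In effect, that processing measures each frame or scene itself rather than relying on the program-wide static values. Here is a hedged sketch of the idea; the frame data and thresholds are illustrative, numpy is assumed, and no actual projector necessarily works this way:

```python
# Illustrative per-frame analysis: measure each frame itself instead of
# trusting the program-wide static metadata. Frame data is a numpy array
# of pixel luminance values in nits; all numbers are made up.
import numpy as np

def frame_stats(frame_nits):
    """This frame's own 'MaxCLL-like' and 'MaxFALL-like' values."""
    return float(frame_nits.max()), float(frame_nits.mean())

def compression_needed(frame_nits, display_peak):
    """0.0 means the frame fits the display as-is; values toward 1.0
    mean the highlights need progressively heavier compression."""
    frame_max, _ = frame_stats(frame_nits)
    if frame_max <= display_peak:
        return 0.0
    return 1.0 - display_peak / frame_max

dark_scene = np.array([0.5, 2.0, 40.0, 90.0])
bright_scene = np.array([10.0, 300.0, 1200.0, 4000.0])
print(compression_needed(dark_scene, display_peak=150))    # 0.0: shown as-is
print(compression_needed(bright_scene, display_peak=150))  # ~0.96: heavy remap
```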

As noted above, no projector we know of can even approach the high peak light level of a competent flat-panel TV. But there's a way to get around this without extensive tone mapping. The most specialized case I know of is Dolby Cinema. In this theatrical format, where the screen size and projection distance are known quantities, the video source is processed to ensure it never exceeds the capabilities of the projector. Peak brightness off a Dolby Cinema screen is unlikely to exceed 100 nits, or about 30 foot-Lamberts (the way brightness is commonly expressed in the commercial film production and theater business).
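
For reference, the nits-to-foot-Lambert conversion behind that parenthetical is a fixed factor of roughly 3.426 nits per foot-Lambert:

```python
# Nits (candelas per square meter) to foot-Lamberts: 1 fL is about
# 3.426 cd/m^2, so 100 nits works out to roughly 29 fL, i.e. the
# "about 30 foot-Lamberts" cited above.
NITS_PER_FOOTLAMBERT = 3.426

def nits_to_footlamberts(nits):
    return nits / NITS_PER_FOOTLAMBERT

print(round(nits_to_footlamberts(100), 1))   # ~29.2 fL
print(round(nits_to_footlamberts(1000), 1))  # ~291.9 fL, flat-panel territory
```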

So it's no surprise that HDR is more limited on a home video projector than on one of today's super-thin TVs. The saving grace here is that our vision responds logarithmically to light levels, so a projector's much lower peak brightness looks closer to a flat panel's than the raw numbers would suggest. Still, the need for tone mapping can make calibrating a home projector for HDR quite a challenge. In my experience so far, using color meters alone is a start, but I've found that educated, subjective tweaking can further improve the visible results. In other words, the sophisticated data processing used to perform dynamic tone mapping isn't always the answer, but instead might be trading off one advantage for another.
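
To put a rough number on that saving grace, here's a simple Weber-Fechner-style illustration. The nit values are assumptions chosen for the example, and real perceptual models (such as the PQ curve used for HDR) are considerably more involved:

```python
# Illustrative only: under a simple logarithmic (Weber-Fechner style)
# view of brightness perception, equal luminance *ratios* look like
# equal perceptual *steps*. The nit values below are assumptions.
import math

def log_step(low_nits, high_nits):
    """Perceptual 'distance' between two levels on a base-10 log scale."""
    return math.log10(high_nits / low_nits)

# A 1,500-nit flat panel vs. a 150-nit projector is 10x apart in linear
# terms, but only one log-unit apart...
print(log_step(150, 1500))   # 1.0
# ...the same perceptual step as the jump from 15 nits to 150 nits.
print(log_step(15, 150))     # 1.0
```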

I recently reviewed a true 4K projector with dynamic tone mapping and the images it produced looked less crisp than those from an older 2K projector that uses pixel-shifting to get to perceived 4K resolution. The difference was small, but once seen couldn't be unseen. Why might this be? The 4K projector must process all of the 4K data together to perform tone mapping and other chores. The pixel-shifter, on the other hand, displays half of the pixels first, then after slightly shifting the image (by less than a pixel) displays the other half. I'm taking a wild leap here, but might that be the reason for the marginally crisper image from the pixel-shifting projector? Granted there are other variables involved (including the steady advances in processing power), but that's my theory — for now.
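
For what it's worth, the pixel-shifting idea described above can be sketched as splitting a frame into two interleaved sub-frames that are flashed in sequence with a sub-pixel optical shift between them. Real e-shift hardware differs in the details; the code below is only an illustration of the split, with a small numpy array standing in for video.

```python
# Illustrative only: the pixel-shift idea as described above, splitting a
# frame into two interleaved sub-frames shown in quick succession with a
# sub-pixel optical shift between them. Real e-shift designs differ.
import numpy as np

def split_for_pixel_shift(frame):
    """Divide the pixels into two checkerboard-interleaved sub-frames."""
    rows, cols = np.indices(frame.shape)
    first_pass = (rows + cols) % 2 == 0   # half the pixels, shown first
    second_pass = ~first_pass             # the other half, shown after the shift
    return frame * first_pass, frame * second_pass

frame = np.arange(16, dtype=float).reshape(4, 4)
sub_a, sub_b = split_for_pixel_shift(frame)
print(np.array_equal(sub_a + sub_b, frame))  # True: nothing lost, only split in time
```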

COMMENTS
3ddavey13

I have a question regarding your review of the JVC DLA-RS1100 4K projector. You mention using a screen with a gain of 1.3. Does this mean the measured peak brightness would be about 30% dimmer on a 96" screen with a gain of 1.0? What I'm trying to find out is whether there's any way to determine, without having to experiment, the highest screen gain you can use with a given projector that still retains an unmeasurable black level, so you get the brightest picture possible without compromising contrast. I would like to replace the 75" 4K TV in my theater room (light controlled) with a native 4K projector that also does 3D, and this model fits the bill. My room is 12.5' wide, so a 96" screen is about as big as I can go (I have tower front speakers which I refuse to part with). Hopefully my 6.5' ceiling height won't be an issue. One last question: is there any chance of a future upgrade to Dolby Vision from JVC?