Video Measurements

The details on all things video.

I'm sure many of you read over the measurement boxes in our video reviews, take what you need from them, and move on. But what does it all mean, really? Why do we do it the way we do? For those of you new to the magazine or video displays in general, what does any of it mean? These are excellent questions.

Color Temperature
When someone says something is white (and they aren't implying honky), what does that really mean? Newton figured out that white light is all the colors of light combined. On a video display, white is a mixture of red, green, and blue light provided by the display's phosphors, color-wheel segments, or filters. Add these together in the right amounts, and what you see on your screen is white. To ensure that what the directors and postproduction houses see as white also looks white in your house, all monitors are supposed to be calibrated to D6500. This is an agreed-upon standard that specifies the relative amounts of red, green, and blue that make up white. It's also the color that a black body (a theoretical object that absorbs all the radiation that hits it and glows based purely on its temperature) radiates when heated to 6,500 kelvins (a balmy 11,240.33 degrees Fahrenheit). Just because something measures 6,500 kelvins, though, doesn't mean it is D6500, but we'll get to that later. If your monitor isn't set for D6500, what was supposed to be white may appear bluish-white or reddish-white on your screen.
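For the math-curious, here's a quick sketch in Python of that temperature conversion, along with the standard CIE 1931 chromaticity coordinates that define the D65 (D6500) white point. The printed numbers are just the arithmetic from the paragraph above, not measurements of any display.

    # A quick sketch of the kelvin-to-Fahrenheit conversion, plus the standard
    # CIE 1931 chromaticity coordinates of the D65 (D6500) white point.

    def kelvin_to_fahrenheit(kelvins):
        """Convert a temperature in kelvins to degrees Fahrenheit."""
        return (kelvins - 273.15) * 9.0 / 5.0 + 32.0

    # The D65 white point on the CIE 1931 chromaticity diagram.
    D65_XY = (0.3127, 0.3290)

    print(f"6,500 K = {kelvin_to_fahrenheit(6500):,.2f} degrees F")  # prints 11,240.33
    print(f"D65 white point: x = {D65_XY[0]}, y = {D65_XY[1]}")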

If the image is too blue, it is referred to as being cool, as everything appears to have a bluish tint. The "cool" designator is confusing, as the cooler the image looks, the higher its actual color temperature. (Blue light has more energy than red, so a hotter source, one higher in kelvins, radiates more blue.) If an image is said to be too warm, it looks too red, which corresponds to a lower actual color temperature. Ideally, the dark portions of the image and the bright portions will have the same color temperature. If they do, the display is said to have good gray-scale tracking. (Gray scale refers to the different steps of gray when transitioning from black to white in an image.) If gray-scale tracking is severely off, clouds may appear bluish, while the black car driving beneath them looks reddish. A difference of less than 500 K across the gray scale is probably not visible to most people.
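To illustrate what we mean by gray-scale tracking, here's a rough sketch that checks whether a set of color-temperature readings stays inside that 500 K window. The IRE steps and temperatures below are invented for the example, not taken from any real display.

    # Sketch of a gray-scale tracking check. These readings are hypothetical.
    measurements = {      # IRE level -> measured color temperature in kelvins
        20: 6450,
        40: 6550,
        60: 6600,
        80: 6700,
        100: 6800,
    }

    spread = max(measurements.values()) - min(measurements.values())
    print(f"Gray-scale spread: {spread} K")
    if spread < 500:
        print("Within 500 K -- most viewers won't notice the variation.")
    else:
        print("More than 500 K -- dark and bright areas will have visibly different tints.")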

To an eye not accustomed to an accurate image, D6500 may appear too red. This is because TVs usually come from the factory set extremely blue (10,000 K or more). A bright, blue image draws your eye on the store shelf, and what the consumer sees sells. The red push that some color decoders have was designed to give Caucasian skin tones more color when the color temperature is set high (otherwise, people would look sickly). When you set the color temperature correctly, this red push makes white people look like they have a sunburn. We measure color temperature and color points with a Photo Research PR-650 SpectraScan colorimeter.

Color Points
Every display has slightly different color points, or slightly different reds, greens, and blues. As with color temperature, displays are supposed to hit standardized red, green, and blue points so they can match what the director and post house intended. If these are off, colors may not seem realistic: green may look hypergreen or greenish-blue, and so on. Wildly oversaturated or dramatically off color points may have more wow factor, but a display with accurate color points is much more pleasing to the eye, even to those not accustomed to an accurate image.
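As a rough illustration, here's a sketch that compares a display's measured color points against a reference set. We've used the Rec. 709/sRGB primaries as the reference purely as an assumption (the standard isn't named above, and standard-def video uses the slightly different SMPTE-C primaries), and the "measured" values are invented for the example.

    # Sketch: compare measured color points against a reference set of primaries.
    REC709 = {  # CIE 1931 x, y chromaticity coordinates (assumed reference)
        "red":   (0.640, 0.330),
        "green": (0.300, 0.600),
        "blue":  (0.150, 0.060),
    }

    measured = {  # hypothetical readings
        "red":   (0.655, 0.335),
        "green": (0.280, 0.650),  # an oversaturated, slightly bluish green
        "blue":  (0.148, 0.062),
    }

    for color, (rx, ry) in REC709.items():
        mx, my = measured[color]
        print(f"{color:5s}: measured ({mx:.3f}, {my:.3f}) vs. reference ({rx:.3f}, {ry:.3f}) "
              f"-> off by ({mx - rx:+.3f}, {my - ry:+.3f})")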

Color points can be off for a number of reasons, but the two most common are light output and market demand. Certain market segments may actually favor off color points (since every other display in the segment is off as well). On the light-output side, an accurate green, for example, can lead to a much dimmer picture. Light output sells, so manufacturers see dimmer colors as a liability. To single out DLP as an example, one manufacturer told me that, when it experimented with an accurate green, it lost 20 percent of its total light output. Every technology can have accurate color points, but, for these and other reasons, manufacturers may simply choose not to.

Contrast Ratio
This is a mostly useless statistic, but manufacturers were (and still are) getting so outrageous with their contrast-ratio claims that we had to start calling them on it. To find the full-on/full-off contrast ratio, you would work in a dark room and only after calibration: measure the display's black level with a 0-IRE signal off a DVD, measure its light output with a 100-IRE signal off the same DVD, then divide the light output by the black level. To find the ANSI contrast ratio, you'd use eight black boxes and eight white boxes arranged on the screen in a checkerboard pattern and measure each box. Average the black level of the black boxes and the white level of the white boxes, then divide the averages just as in the full-on/full-off measurement. Most displays have a much lower ANSI than full-on/full-off contrast ratio. The lens element, the power supply, and reflections in the cabinet all conspire to cause the ANSI contrast ratio to be lower.
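Here's a sketch of both calculations, using invented readings (any light unit works, as long as black and white use the same one). Notice how much lower the ANSI figure comes out than the full-on/full-off figure.

    # Sketch of the two contrast-ratio calculations described above.

    def full_on_full_off(white, black):
        """Full-screen 100-IRE reading divided by full-screen 0-IRE reading."""
        return white / black

    def ansi_contrast(white_boxes, black_boxes):
        """Average of the eight white boxes divided by the average of the eight black boxes."""
        return (sum(white_boxes) / len(white_boxes)) / (sum(black_boxes) / len(black_boxes))

    # Hypothetical readings, not from any real display.
    print(f"Full-on/full-off: {full_on_full_off(35.0, 0.025):,.0f}:1")  # 1,400:1

    whites = [33.8, 34.1, 33.5, 34.4, 33.9, 34.0, 33.7, 34.2]  # the eight white boxes
    blacks = [0.09, 0.11, 0.10, 0.12, 0.10, 0.09, 0.11, 0.10]  # the eight black boxes
    print(f"ANSI: {ansi_contrast(whites, blacks):,.0f}:1")  # roughly 331:1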

The more useful numbers are the actual black level and light output. We measure these with a Minolta LS-100 and report them in foot-lamberts (ft-L).

In theory, a display with a higher contrast ratio looks punchier. A high contrast ratio is a good thing, but by itself the number tells you little. A display with a 4,000:1 contrast ratio but a black level of 0.099 ft-L may not look as good in a dark room as a display with a 3,000:1 contrast ratio and a black level of 0.010 ft-L.
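If you want to see the arithmetic behind that comparison, here's a tiny sketch; the two displays are the hypothetical ones from the example above.

    # The arithmetic behind the example above: white level = contrast ratio x black level.
    display_a = {"contrast": 4000, "black_ft_l": 0.099}
    display_b = {"contrast": 3000, "black_ft_l": 0.010}

    for name, d in (("A", display_a), ("B", display_b)):
        white = d["contrast"] * d["black_ft_l"]
        print(f"Display {name}: black {d['black_ft_l']:.3f} ft-L, white {white:.0f} ft-L")
    # Display A is far brighter (396 vs. 30 ft-L), but in a dark room its grayer
    # blacks are what you'll notice.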

If you have any further questions on this—or any of our measurements—send us an e-mail at htletters@primedia.com.
