Shopping Made Simple: HDTV

Display Formats Decoded

The HDTV video formats are identified by a number followed by a letter, such as 720p or 1080i. The number indicates how many scan lines, or pixel rows, are in each complete video image, or frame. The letter indicates whether the frames are progressive (p) or interlaced (i). A 720p frame is 720 pixels high by 1,280 pixels wide, whereas a 1080i frame is 1,080 by 1,920 pixels. (Pixels are the tiny "picture elements" that make up a video image.)

1080i frames, however, are interlaced, meaning each consists of two half-frame fields, one containing the odd-numbered pixel rows and the other the even-numbered rows. The 720p format displays 60 complete frames per second, while the 1080i format displays 60 fields of 540 x 1,920 pixels per second, which combine into 30 complete 1,080 x 1,920-pixel frames. Traditional analog TV, by comparison, uses the 480i format (480 x 640 pixels per frame, split into two 240 x 640 fields).
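For the curious, here's a quick Python sketch (purely illustrative) that spells out the frame-and-field arithmetic for the three formats above:

    # Frame and field arithmetic for the formats described above.
    formats = {
        "720p":  (720, 1280, False),   # rows, columns, interlaced?
        "1080i": (1080, 1920, True),
        "480i":  (480, 640, True),
    }
    for name, (rows, cols, interlaced) in formats.items():
        if interlaced:
            # Each field carries half the rows; two fields make one frame.
            print(f"{name}: 60 fields/s of {rows // 2} x {cols} "
                  f"-> 30 frames/s of {rows} x {cols}")
        else:
            print(f"{name}: 60 frames/s of {rows} x {cols}")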

Because the number is bigger, you'd naturally assume 1080i is better than 720p, but it isn't so simple. The two formats actually deliver about the same amount of information per second to the screen, just arranged differently. For still images, 1080i will render more detail. But when there's lots of motion, as in sports, 720p excels.
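To put rough numbers on that claim, here's a quick check (the figures follow directly from the formats above):

    # Raw throughput: pixels delivered to the screen each second.
    p720  = 1280 * 720 * 60   # 60 full frames per second
    i1080 = 1920 * 540 * 60   # 60 half-frame fields per second
    print(f"720p:  {p720:,} pixels/s")    # 55,296,000
    print(f"1080i: {i1080:,} pixels/s")   # 62,208,000

About 55 million versus 62 million pixels per second: the same ballpark, just distributed differently across space and time.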

Most sets display only a couple of formats, such as 480p and 1080i, and a few just one. So what happens to the others? Either the display or the device feeding it (such as an external HDTV tuner) converts any unsupported format to one the display can handle. Sets limited to 480p and 1080i, for example, convert 480i to 480p (a process known as deinterlacing) and usually convert 720p to 1080i (scaling). But not all deinterlacers and scalers are created equal, and the quality of that processing can have a big impact on the final picture.
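To give a feel for what a deinterlacer does, here's a minimal Python sketch of the crudest approach, line doubling (often called "bob"); the function name and data layout are just for illustration, and real sets use far more sophisticated motion-adaptive processing:

    # "Bob" deinterlacing: rebuild a full frame from one field by
    # doubling each line. Simple, but it discards vertical detail;
    # better deinterlacers blend or motion-compensate between fields.
    def bob_deinterlace(field):
        frame = []
        for row in field:         # each row is a list of pixel values
            frame.append(row)     # the line the field actually carries
            frame.append(row[:])  # duplicate it to fill the missing line
        return frame

    field = [[0] * 640 for _ in range(240)]     # one 480i field
    print(len(bob_deinterlace(field)), "rows")  # 480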

This is especially critical for displays with fixed, nonstandard resolutions, like most plasma sets, which have to convert all incoming signals to match their native formats. These nonstandard resolutions are normally specified by the pixel array, horizontal by vertical. For example, a 720p screen has a 16:9 array of 1,280 x 720 pixels, whereas a high-def plasma display might have 1,365 x 768.
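To see what the set's scaler is up against, here's a quick Python check of the stretch factors needed to map a 720p frame onto that 1,365 x 768 panel:

    # Stretch factors for fitting a 720p frame to a 1,365 x 768 panel.
    src_w, src_h = 1280, 720
    panel_w, panel_h = 1365, 768
    print(f"horizontal: {panel_w / src_w:.4f}x")  # 1.0664
    print(f"vertical:   {panel_h / src_h:.4f}x")  # 1.0667

The two factors are nearly identical, so the 16:9 shape survives, but every pixel still has to be resampled, and that's where scaler quality shows.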

That's an easy one. But is a 16:9 plasma set with 1,024 x 1,024-pixel resolution high-definition? Some would say it isn't, because it has fewer than 1,280 pixel columns. We're a little more liberal. A 1,024 x 1,024 display actually has slightly more total pixels (and more pixel rows) than a 1,280 x 720 screen; you'd be hard-pressed to see a difference. The industry-standard definition requires only the ability to display a 16:9 image with at least 720 horizontal lines (pixel rows), which would also make a 1,024 x 768 display high-def, but that's borderline. An 852 x 480 screen, on the other hand, clearly is not high-def.
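Here's that rule of thumb expressed as a quick Python sketch; the 720-row test is the industry-standard definition, and the pixel-count column is our looser comparison against a true 1,280 x 720 screen:

    # Standard test: at least 720 pixel rows in a 16:9 image.
    HD_BASELINE = 1280 * 720  # 921,600 pixels

    for cols, rows in [(1365, 768), (1024, 1024), (1024, 768), (852, 480)]:
        verdict = "HD" if rows >= 720 else "not HD"
        ratio = cols * rows / HD_BASELINE
        print(f"{cols} x {rows}: {verdict}, "
              f"{ratio:.2f}x the pixels of 1280 x 720")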

You might also wonder about sets with 4:3 aspect ratios, since the definition above specifies a widescreen display. According to the Consumer Electronics Association, if a 4:3 set can letterbox a high-definition image (creating a 16:9 picture area with bars above and below) and display it with at least 540 progressive or 810 interlaced pixel rows, it can be considered HD-compatible. It's the shape of the image that matters, not that of the screen.
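Those thresholds fall straight out of the geometry: a 16:9 letterbox inside a 4:3 frame occupies exactly three-quarters of the screen's height, so 720 rows become 540 and 1,080 become 810. A quick check:

    # A 16:9 image letterboxed on a 4:3 screen uses 3/4 of the rows:
    # (9/16) / (3/4) = 0.75
    letterbox = (9 / 16) / (3 / 4)
    print(letterbox)           # 0.75
    print(720 * letterbox)     # 540.0 -> the progressive threshold
    print(1080 * letterbox)    # 810.0 -> the interlaced threshold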

-Michael Riggs
