Beating A Dead Horse: Why Test Standard Definition Anymore?

The ability of a display to upconvert standard definition content (like a DVD, or many cable/satellite channels) was once a key component of its overall performance.

But now most TVs do a reasonable job, and more important, I don’t think most people’s TVs ever actually see any SD content.

Should we bother to continue testing it?

(I’m going to keep de-interlacing out of this conversation, as it is inarguably important and can be tested/considered separately from upconversion.)

Let me first explain what I mean. Nearly all modern televisions are 1080p, which is to say their displays measure 1,920 pixels across and 1,080 pixels vertically, for a total of roughly 2.1 million pixels. All of these pixels are active, no matter what the resolution of the original source. So if you’re watching a DVD, which has a resolution of (roughly) 720x480, the TV has to scale or “upconvert” that image to fill the screen. It’s a lot like zooming in on an image on your computer, or, if you want to go analog, sticking your face right into a magazine.
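To put rough numbers on that “zoom,” here’s a quick back-of-the-envelope sketch in Python (mine, purely illustrative, using the 720x480 and 1,920x1,080 figures above):

```python
# Back-of-the-envelope math for stretching a DVD frame across a 1080p panel.
DVD_W, DVD_H = 720, 480      # (roughly) the resolution of a DVD
TV_W, TV_H = 1920, 1080      # a 1080p display

dvd_pixels = DVD_W * DVD_H   # 345,600 pixels in the source
tv_pixels = TV_W * TV_H      # 2,073,600 pixels on the screen (~2.1 million)

print(f"Source pixels:  {dvd_pixels:,}")
print(f"Display pixels: {tv_pixels:,}")
print(f"Horizontal stretch: {TV_W / DVD_W:.2f}x")  # ~2.67x
print(f"Vertical stretch:   {TV_H / DVD_H:.2f}x")  # 2.25x
print(f"Each source pixel covers ~{tv_pixels / dvd_pixels:.0f} screen pixels")  # ~6
```

That roughly sixfold gap is what the scaler has to paper over.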

As the original source has a finite resolution, there’s a limit to what even the best electronic trickery can do. It can add (i.e., create) some sharpness in the image, but even the best-scaled image won’t look as good as your average HD signal. Other aspects, like noise, artifacts, and so on, separate the good scalers from the bad. Of the TVs and projectors I’ve reviewed recently, even the worst scalers have been better than the average scaler of five years ago. They all do a reasonable job of creating detail and limiting noise.
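If you want to see what separates a crude scaler from a decent one, here’s a rough sketch using the Pillow imaging library (my example, not anything a TV actually runs, and the filenames are hypothetical): the same 720x480 frame blown up two ways.

```python
# Illustrative only: a naive upscale (nearest-neighbor, which just repeats
# pixels) versus a better interpolating filter (Lanczos). Real TV scalers
# also layer sharpening and noise reduction on top of something like this.
from PIL import Image

src = Image.open("dvd_frame.png")  # hypothetical 720x480 screen grab

blocky = src.resize((1920, 1080), Image.Resampling.NEAREST)  # crude pixel repetition
smooth = src.resize((1920, 1080), Image.Resampling.LANCZOS)  # interpolated, cleaner edges

blocky.save("upscaled_nearest.png")
smooth.save("upscaled_lanczos.png")
```

Neither version contains any more real information than the source had to begin with; the filter just decides how gracefully those pixels get stretched.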

But here’s the thing: I don’t think most people these days ever connect a standard definition source, which means they never use their TV’s internal scaler.

I bet I’m right, and I’ll gladly take your money. The two most common SD sources are, as mentioned earlier, DVDs and certain cable channels. If you’ve got a Blu-ray player and you’re watching DVDs on it, it’s upconverting the signal for you. And unless you own a really old DVD player, yours upconverts too; nearly all of those produced during the past few years do. If the BD or DVD player is upconverting, the TV is just displaying the image as it receives it. No scaling.

And what about cable/satellite sources? Presuming you’re paying for the HD channels and have an HD-capable box (and if you don’t, I’m not sure you can claim to care about picture quality), the box is upconverting the SD channels to HD. Once again, the TV just displays the image; no scaling. There were a few cable/satellite boxes that passed the native resolution of the signal through, but the vast majority output only one resolution (hopefully you’ve set it to 1080i, or 720p if you have a 720p TV).

There are certain niche exceptions (like Nintendo’s Wii, VHS, LaserDisc, SelectaVision, whatever) that potentially live in people’s homes, but I’d argue that the picture quality of these SD sources is so poor to start with that even the best scaler isn’t going to do much for the image.

So with TVs doing a reasonable job, and the vast majority of people never asking their TV to do any upconversion at all, why should we waste our time testing it, and your time reading about it?

What do you think?
