Was I Wrong?

As many of you know by now, I appear as a weekly guest on The Tech Guy, a nationally syndicated call-in radio program hosted by Leo Laporte. During a recent show, I was explaining the difference between 1080i and 1080p, a confusing subject to be sure. Shortly after the show, I got a rather long e-mail from John Sullivan pointing out what he thought were mistakes in my explanation. I'll interlace my responses with his comments...

Listening to you on Leo's show, I usually feel that you know what you're talking about, but I think you were way off base today! You seem to be confusing fields per second with frames per second.

Your comment that, "a 1080i signal conveys 30 frames per second, whereas a 1080p signal conveys 60 frames per second," is (I believe) incorrect. The end result is displayed at 30 frames per second (fps) regardless of whether it was captured and transmitted as interlaced or progressive.

Nope—I am quite confident that my statement was correct, though incomplete. In fact, 1080p can convey 24, 30, or 60fps, but 30fps is rarely if ever used.

On most modern digital TVs, the end result is not displayed at 30 frames per second, but rather at 60fps. If a modern digital TV receives a 1080i signal, it processes the signal to end up with 60 frames per second. Exactly how it does this depends on whether the signal originated as 24fps film or 1080i video as well as the specific processor in the TV, which is too complicated to explain here.
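Still, to give a rough idea, here is a deliberately oversimplified sketch in Python of one basic technique, often called "weave" deinterlacing, in which two fields are stitched back into a full frame. The data structures and function names are my own illustration, not anything a real TV actually runs, and real processors use far more sophisticated, motion-adaptive methods.

```python
# Oversimplified "weave" deinterlacing sketch (illustration only).
# A 1080i field is modeled as a list of 540 lines;
# a 1080p frame is a list of 1080 lines.

def weave(top_field, bottom_field):
    """Interleave a top (odd-line) field and a bottom (even-line) field
    into one full 1080-line progressive frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)      # lines 1, 3, 5, ... of the frame
        frame.append(bottom_line)   # lines 2, 4, 6, ... of the frame
    return frame

def fields_to_60fps(fields):
    """Turn 60 interlaced fields per second into 60 progressive frames
    per second by weaving each pair of fields and showing the result twice."""
    frames = []
    for i in range(0, len(fields) - 1, 2):
        frame = weave(fields[i], fields[i + 1])
        frames.extend([frame, frame])   # 30 woven frames -> 60 displayed frames
    return frames
```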

The signal originates in the camera, which either scans the image progressively or interlacedly [sic]. I believe the correct explanation is that a progressive-scanned signal sends 30 frames per second, while an interlace-scanned signal sends 60 fields per second.

Most professional HD video cameras capture images at 1080i (60 fields per second) or 1080p at 24 frames per second (aka 1080p/24), the same frame rate as film. ABC, ESPN, and Fox use cameras that capture at 720p/60fps, and these networks broadcast at the same rate. Capturing and transmitting at 1080p/30fps is not normally done, and I know of no cameras that capture at 1080p/60fps.

In order for a progressive signal to display at the normal 30fps, the signal has to travel at 30fps. An interlaced signal has to travel faster, because in order for the final display device to re-compile the two fields into one frame, it has to buffer (store) the first field until the second field arrives. Therefore, in order to display 30fps, the transmission speed of an interlaced signal needs to be twice as fast—that is, 60 fields per second, minimum.

The difference is in the size of the "pipe"—that is, the bandwidth of the signal. An interlaced signal sending one field at a time takes less bandwidth than a progressive signal sending a full frame at a time.

First, as I discussed above, the "normal" display rate these days is 60fps, not 30fps. Otherwise, I understand what you're trying to say here, but I think your statement that an interlaced signal must travel "faster" than an equivalent progressive signal is misleading. Video signals travel at the same speed regardless of whether they are interlaced or progressive; in fact, over-the-air and satellite broadcast signals travel at the speed of light.

However, you are correct that it's really a matter of bandwidth, which determines how much data can be sent in a given amount of time. This is not the same thing as speed, which is defined as how far something travels in a given amount of time. I realize that the word "speed" is often used to denote bandwidth, but I object to this usage as imprecise.
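To put rough numbers on the bandwidth difference, here's a quick back-of-the-envelope calculation in Python. It counts raw pixels only and ignores compression, blanking intervals, and color encoding, so treat it as an illustration rather than a real bit-rate figure.

```python
# Back-of-the-envelope pixel rates (uncompressed) showing that "bandwidth"
# is data per second, independent of how fast the signal itself propagates.

WIDTH = 1920

pixels_1080i = WIDTH * 540 * 60      # 60 fields/sec, each field 540 lines tall
pixels_1080p60 = WIDTH * 1080 * 60   # 60 full frames/sec
pixels_1080p24 = WIDTH * 1080 * 24   # film-rate progressive

print(f"1080i/60: {pixels_1080i:,} pixels per second")
print(f"1080p/60: {pixels_1080p60:,} pixels per second")
print(f"1080p/24: {pixels_1080p24:,} pixels per second")
print(f"1080i/60 needs {pixels_1080i / pixels_1080p60:.0%} of 1080p/60's pixel rate")
```

Run it and you'll see that 1080i/60 carries exactly half the raw pixel rate of 1080p/60.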

As I'm sure you know, interlacing was invented to cut half the bandwidth required to send and receive a video signal. This allowed CRT TVs to display 30fps by "drawing" the first field (all the odd-numbered lines of each frame sequentially from the top of the screen to the bottom) followed by the second field (all the even-numbered lines). Our brain's persistence of vision melded them together into 30 full frames per second.

Normal TV sets refresh at 60Hz, whether you're viewing a 60-fields-per-second interlaced signal or a 30-frames-per-second progressive signal. There is no 60fps TV signal that I know of.

You are correct that most TVs refresh the image at 60Hz (60 times per second), but modern digital TVs refresh entire frames at this rate, not fields. (High-end LCD TVs refresh at 120Hz, displaying 120 frames per second.) As for 60fps TV signals, they do exist—ABC, ESPN, and Fox broadcast at 720p/60—but no one broadcasts 1080p at 60fps or any other frame rate. Instead, HDTV signals—other than those from ABC, ESPN, and Fox—are broadcast at 1080i.

On the other hand, Blu-ray players can easily send 1080p/60. But movies on Blu-ray are normally stored as 1080p/24, so to output 1080p/60, the player must apply 3:2 pulldown, repeating one frame three times, the next frame twice, the next three times, and so on. Thus, the player does not output 60 unique movie frames per second, but it does output 60 full frames per second.
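Here's a tiny sketch of that 3:2 cadence in Python; the frame numbers and function name are purely illustrative.

```python
# Sketch of the 3:2 cadence a Blu-ray player can use to turn 24 film frames
# per second into 60 output frames per second. Each film frame is shown
# 3 times, the next 2 times, and so on: 24 * (3 + 2) / 2 = 60.

def pulldown_3_2(film_frames):
    """Expand a list of film frames (24fps) into a 60fps output sequence."""
    output = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        output.extend([frame] * repeats)
    return output

# One second of film: 24 frames in, 60 frames out, only 24 of them unique.
one_second = pulldown_3_2(list(range(24)))
print(len(one_second))       # 60
print(len(set(one_second)))  # 24
```

Fed one second of film, it produces 60 output frames but only 24 unique images, which is exactly the point.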

However, I'm not the expert that you are, and things may have changed since I learned TV repair back in the 1960s. If I'm wrong, feel free to enlighten me.

Yes, things have changed a lot since the 1960s! I hope I've enlightened you.

If you have an audio/video question for me, please send it to scott.wilkinson@sourceinterlink.com.
