1080i vs. 1080p
In the comments following my blog last week, Neil Richards asked a follow-up question that is the cause of much confusion. I wrote a bit about it in the comments attached to that blog, but I thought it deserved a more thorough treatment this week.
Here's Neil's question:
Please explain the differences between 1080i and 1080p. Is there a huge difference in picture quality?
This is actually a very complex question that requires a longer and more-technical-than-usual blog, so hang onto your hat. The terms "1080i" and "1080p" are most correctly applied to the signal that is sent from a source device (disc player, satellite receiver, etc.) to the display. Moving video images are generated by showing a rapid sequence of still images called frames, much like motion-picture film is a fast sequence of still frames. Each video frame is represented in a video signal as a series of horizontal lines, from the top of the screen to the bottom, each containing a small slice of the whole picture.
A 1080p signal sends all horizontal lines of each frame in a single pass (which is known as "progressive," hence the "p"), whereas 1080i sends each frame in two parts—the odd-numbered lines followed by the even-numbered lines. This is called "interlaced" because the lines in two fields are interwoven with each other to form a complete frame, hence the letter "i"; see the illustration at the top of this blog.
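To make the interlacing idea concrete, here's a toy Python sketch (the function names and the list-of-lines model are my own invention, not anything a real TV runs) showing how a progressive frame splits into odd and even fields and how weaving the fields restores the full frame:

```python
# Toy model: a frame is a list of scan lines, numbered from 1 at the top.
def split_into_fields(frame_lines):
    """Return (odd_field, even_field) for one progressive frame."""
    odd_field = frame_lines[0::2]   # lines 1, 3, 5, ...
    even_field = frame_lines[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

def weave(odd_field, even_field):
    """Interleave two matched fields back into a complete frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

lines = [f"line {n}" for n in range(1, 1081)]  # a 1080-line frame
odd, even = split_into_fields(lines)
assert len(odd) == len(even) == 540   # each field carries half the lines
assert weave(odd, even) == lines      # weaving the two fields restores the frame
```

Each 1080i field carries only 540 of the 1080 lines, which is why a complete frame always requires a matched pair of fields.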
In both cases, the signal represents an image in which each still frame has 1920 pixels horizontally by 1080 pixels vertically, which is often written "1920x1080." In other words, each frame is made up of 1080 horizontal lines, each with 1920 pixels from one side to the other.
Many people, especially manufacturers, also apply the terms 1080i and 1080p to video displays, but this is misleading. A modern TV with 1920x1080 resolution can normally accept either type of signal and display each frame in its entirety. In fact, the electronics within such a TV must end up with a 1080p signal before the image can be displayed. If it gets a 1080i signal, it must convert it to 1080p internally, a process called, logically enough, "deinterlacing."
In some cases, the terms 1080i and 1080p are used to indicate the type of signal a display can accept. For example, a plasma TV with 1366x768 resolution can probably accept a 1080i signal but not a 1080p signal, so it's often referred to as a 1080i display. However, this does not mean it can show all 1920x1080 pixels in the signal—it must first deinterlace the 1080i signal, converting it to 1080p, then "scale" or downsize it to 1366x768.
Even more misleading, many of the early TVs with 1920x1080 resolution were touted as 1080p by their manufacturers, even though they could only accept a 1080i signal. This caught many consumers completely off guard when they tried to send a 1080p signal from, say, a Blu-ray player to a TV they thought could accept it, but the set actually couldn't, leading to much gnashing of teeth.
As to whether 1080i or 1080p is better, that depends on several factors, including how the material is originally captured, stored on a disc, sent to the display, and processed. Movies and many TV shows are captured on film at a rate of 24 frames per second (fps) and stored on a Blu-ray disc as 1080p at 24fps, which is often denoted "1080p/24."
When watching movies on Blu-ray, the best possible picture quality is achieved by sending a 1080p/24 signal from the player to a 1920x1080 display that can flash frames on the screen at a multiple of 24—48, 72, 96, or 120 frames per second. In commercial movie theaters, each film frame is flashed on the screen two or three times, depending on the particular film projector, resulting in a "refresh rate" of 48 or 72Hz. Few video displays can do this, but if you have one that does—such as a Pioneer Kuro plasma—it will provide the smoothest motion from Blu-ray discs played at 1080p/24.
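The arithmetic behind those refresh rates is simple enough to verify directly. In this little sketch (purely illustrative), a display rate that divides evenly by 24 means every film frame gets the same number of flashes, with no uneven cadence:

```python
# Why multiples of 24 give even motion: every film frame is flashed
# the same number of times.
film_fps = 24
for refresh_hz in (48, 72, 96, 120):
    flashes_per_frame, remainder = divmod(refresh_hz, film_fps)
    assert remainder == 0  # exact multiple: all frames treated alike
    print(f"{refresh_hz}Hz: each film frame flashed {flashes_per_frame} times")

# 60Hz does not divide evenly by 24, which is why 3:2 pulldown exists:
assert 60 % film_fps != 0
```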
Unfortunately, most HDTVs can display frames only at a rate of 60 per second. If your TV can accept a 1080p signal with 60 frames per second ("1080p/60"), it's relatively easy to derive this signal from 1080p/24 by repeating one frame twice, the next frame three times, the next one twice, the next one three times, and so on in a process called "3:2 pulldown" (sometimes called "2:3 pulldown"). This results in a 1080p/60 signal, which most modern TVs can accept and display.
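The 3:2 cadence described above can be sketched in a few lines of Python (a toy model, with frames represented as labels rather than actual images):

```python
from itertools import cycle

def pulldown_32(film_frames):
    """Expand 24fps film frames to 60fps by alternately repeating
    each frame two and three times (the 3:2, or 2:3, cadence)."""
    out = []
    for frame, repeats in zip(film_frames, cycle((2, 3))):
        out.extend([frame] * repeats)
    return out

one_second = [f"F{n}" for n in range(24)]  # 24 film frames
video = pulldown_32(one_second)
assert len(video) == 60  # 24 frames x 2.5 average repeats = 60
assert video[:5] == ["F0", "F0", "F1", "F1", "F1"]
```

On average, each film frame is shown 2.5 times, which is exactly how 24 frames fill 60 display slots; it's that alternation between two and three repeats that causes the slight jerkiness mentioned below.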
Virtually all players can apply 3:2 pulldown to a 1080p/24 signal, and an increasing number of TVs can as well, though the process introduces a jerkiness to motion that is absent when 1080p/24 is displayed at a multiple of 24. Also, some players, notably most of the Toshiba HD DVD players (except the HD-XA2), first convert 1080p/24 to 1080i, then deinterlace that to 1080p/60, which results in a somewhat soft picture.
Many older HDTVs, even those with 1920x1080 resolution, are limited to accepting 1080i, which means the player must output a 1080i signal. In this case, the player starts by applying 3:2 pulldown to the movie frames. Then it discards the odd field from one frame and the even field from the next frame, repeating this process to end up with 60 fields per second in a 1080i signal. (A 1080i signal always conveys 60 fields per second, so there's no need to write "1080i/60" or, as some prefer, "1080i/30" to indicate 30 frames per second.) The combination of these processes—3:2 pulldown and interlacing—is sometimes called "telecine."
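The telecine process as described above can be modeled in Python as well (again a toy sketch; each field is tagged with its source frame so you can see where the mismatches come from):

```python
from itertools import cycle

def telecine(film_frames):
    """Turn 24fps film into a 1080i-style stream of 60 fields per second:
    3:2 pulldown to 60 frames, then keep alternating odd/even fields."""
    sixty_p = []
    for frame, repeats in zip(film_frames, cycle((2, 3))):
        sixty_p.extend([frame] * repeats)
    # Each entry is (source_frame, which_field), alternating odd and even.
    return list(zip(sixty_p, cycle(("odd", "even"))))

fields = telecine([f"F{n}" for n in range(24)])
assert len(fields) == 60  # 60 fields per second, as 1080i always conveys
assert fields[:3] == [("F0", "odd"), ("F0", "even"), ("F1", "odd")]
# Pairing fields naively back into frames, two of every five pairs mix
# two different film frames -- the root of the deinterlacing problem below.
```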
When the 1080i signal gets to the TV, it must be converted, or "deinterlaced," back to 1080p before it can be displayed, and the quality of the picture depends on how well this conversion is performed. It's not easy to deinterlace a film-based 1080i signal, since some of the paired fields are not from the same film frame.
The best way to do this is to discard the extra fields, pairing only matched fields into complete frames—a process called "inverse telecine"—and repeat frames in a 3:2 sequence to generate 60 frames per second. The end result is still jerkier than 1080p/24 displayed at a multiple of 24, but at least there are no mismatched fields trying to make a complete frame. In fact, there should theoretically be no difference between 3:2 pulldown performed on 1080p/24 frames and inverse telecine performed on film-based 1080i.
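Here's a sketch of the idea, with the caveat that this toy version cheats: each field is already tagged with its source frame, whereas a real video processor has to detect the cadence by comparing the content of the fields themselves.

```python
from itertools import cycle

def telecine(film_frames):
    """24fps film -> 60 fields/sec via 3:2 pulldown plus interlacing."""
    sixty_p = []
    for frame, repeats in zip(film_frames, cycle((2, 3))):
        sixty_p.extend([frame] * repeats)
    return list(zip(sixty_p, cycle(("odd", "even"))))

def inverse_telecine(fields):
    """Recover the unique film frames by discarding the repeated fields."""
    frames = []
    for source_frame, _field in fields:
        if not frames or frames[-1] != source_frame:
            frames.append(source_frame)
    return frames

film = [f"F{n}" for n in range(24)]
# A perfect round trip: the original 24 film frames survive intact.
assert inverse_telecine(telecine(film)) == film
```

The round trip losing nothing is the point of the paragraph above: done correctly, inverse telecine followed by 3:2 repetition should be indistinguishable from 3:2 pulldown applied directly to 1080p/24.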
If a TV cannot perform inverse telecine, it must deal with the fact that two out of every five pairs of fields do not match when they are combined to form a frame. When the fields do not match, the most common solution is to discard one of them and create a new field that more closely matches the remaining field, a process sometimes called "vertical interpolation" or "vertical averaging." However, this reduces the visible resolution of the image, softening the picture.
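One simple form of vertical interpolation can be sketched like this (lines are modeled as single brightness numbers, and the averaging scheme here is the most naive possible, just to show why resolution is lost):

```python
def interpolate_missing_lines(field):
    """Build a full frame from a single field by inventing each missing
    line as the average of the real lines above and below it."""
    frame = []
    for i, line in enumerate(field):
        frame.append(line)
        if i + 1 < len(field):
            frame.append((line + field[i + 1]) / 2)  # invented in-between line
        else:
            frame.append(line)  # bottom line is simply repeated
    return frame

field = [10, 20, 30]  # three real scan lines from one field
frame = interpolate_missing_lines(field)
assert frame == [10, 15.0, 20, 25.0, 30, 30]
# The frame has the right number of lines, but half of them are guesses,
# which is why the picture looks softer.
```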
If your display is limited to 60Hz and can accept 1080p/60, it might still be better to send Blu-ray movies at 1080i and let the TV's processor do the deinterlacing. After all, which product is likely to have a better processor—a $3000 TV or a $500 player? Try sending both 1080i and 1080p/60 from the player to see which works better with your particular system.

Regarding broadcast HDTV, most signals are 1080i. When a high-def station broadcasts a movie or other film-based material, it applies telecine, but some HDTV shows are captured natively at 1080i, which introduces a different set of problems. For example, capturing fast motion at 1080i can result in "stairstep" or "zipper" artifacts, in which the edges of a moving object appear jagged because the object is in a different position in one field than in the other. Smoothing out these so-called jaggies involves vertical averaging, which some processors do better than others.
One note on LCD TVs with 120Hz operation, which double the 60Hz rate at which frames are normally flashed on the screen. If you send a 1080p/24 signal to such a TV, it can generally do one of two things: repeat each frame five times or create new frames to insert between the actual frames in the signal. Frame creation is done in any of several ways, and each manufacturer has its own name for the process. It can result in smoother, sharper motion, but it can also generate artifacts of its own. In most cases, you can enable and disable new-frame creation and see which way looks better to you.
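Both 120Hz options can be sketched in Python. The interpolation below is a deliberately naive linear blend of neighboring frames (modeled as single brightness values), standing in for each manufacturer's proprietary motion processing, which is far more sophisticated:

```python
def repeat_5x(film_frames):
    """Option 1: flash each 24fps frame five times (24 x 5 = 120)."""
    return [f for frame in film_frames for f in [frame] * 5]

def interpolate_to_120(film_frames):
    """Option 2: insert four synthetic frames between each real pair.
    Frames are modeled as numbers; the blend is a naive linear average."""
    out = []
    for a, b in zip(film_frames, film_frames[1:]):
        out.append(a)
        for step in range(1, 5):                # four invented frames
            out.append(a + (b - a) * step / 5)  # naive linear blend
    out.append(film_frames[-1])
    return out

film = list(range(24))              # one second of film, as brightness values
assert len(repeat_5x(film)) == 120  # 24 x 5 = 120 flashes per second
assert interpolate_to_120([0, 1])[:5] == [0, 0.2, 0.4, 0.6, 0.8]
```

Repetition preserves the film cadence exactly; interpolation is where the smoother-but-sometimes-artificial "video look" comes from, which is why it's worth trying the mode both on and off.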
Bottom line: For the best picture quality on Blu-ray movies and other film-based material, send 1080p/24 from the player if you have a display with a refresh rate that is a multiple of 24. Also, send 1080p/24 if the display has a refresh rate of 60Hz and can accept a 1080p/24 signal. If the 60Hz display cannot accept 1080p/24, try sending it 1080p/60 and 1080i to see which looks better. If you have a 120Hz LCD TV, send it 1080p/24 and try enabling and disabling the "create new frames" mode to see which way looks better to you.
If you have an audio/video question for me, please send it to firstname.lastname@example.org.