1080i vs. 1080p

In the comments following my blog last week, Neil Richards asked a follow-up question that is the cause of much confusion. I wrote a bit about it in the comments attached to that blog, but I thought it deserved a more thorough treatment this week.

Here's Neil's question:

Please explain the differences between 1080i and 1080p. Is there a huge difference in picture quality?

This is actually a very complex question that requires a longer and more-technical-than-usual blog, so hang onto your hat. The terms "1080i" and "1080p" are most correctly applied to the signal that is sent from a source device (disc player, satellite receiver, etc.) to the display. Moving video images are generated by showing a rapid sequence of still images called frames, much like motion-picture film is a fast sequence of still frames. Each video frame is represented in a video signal as a series of horizontal lines, from the top of the screen to the bottom, each containing a small slice of the whole picture.

A 1080p signal sends all horizontal lines of each frame in a single pass (which is known as "progressive," hence the "p"), whereas 1080i sends each frame in two parts—the odd-numbered lines followed by the even-numbered lines. This is called "interlaced" because the lines in two fields are interwoven with each other to form a complete frame, hence the letter "i"; see the illustration at the top of this blog.
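The split-and-weave idea is easy to see in a few lines of code. Here's a minimal Python sketch, with a toy 8-line frame standing in for the 1080 lines of a real frame (the function names are mine, purely for illustration):

```python
# Sketch of interlacing: a progressive frame's scan lines split into an
# odd field and an even field, and weaving the fields restores the frame.
# A toy 8-line frame stands in for the 1080 lines of a real frame.

def split_into_fields(frame):
    odd_field = frame[0::2]    # lines 1, 3, 5, ... (counting from the top)
    even_field = frame[1::2]   # lines 2, 4, 6, ...
    return odd_field, even_field

def weave(odd_field, even_field):
    """Deinterlace by interleaving the two fields back into one frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

frame = [f"line {n}" for n in range(1, 9)]   # toy 8-line frame
odd, even = split_into_fields(frame)
assert weave(odd, even) == frame   # the woven fields rebuild the frame
```

Note that weaving only reconstructs the frame perfectly when both fields came from the same original frame, which is exactly the complication that arises later with film-based 1080i.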

In both cases, the signal represents an image in which each still frame has 1920 pixels horizontally by 1080 pixels vertically, which is often written "1920x1080." In other words, each frame is made up of 1080 horizontal lines, each with 1920 pixels from one side to the other.

Many people, especially manufacturers, also apply the terms 1080i and 1080p to video displays, but this is misleading. A modern TV with 1920x1080 resolution can normally accept either type of signal and display each frame in its entirety. In fact, the electronics within such a TV must end up with a 1080p signal before the image can be displayed. If it gets a 1080i signal, it must convert it to 1080p internally, a process called, logically enough, "deinterlacing."

In some cases, the terms 1080i and 1080p are used to indicate the type of signal a display can accept. For example, a plasma TV with 1366x768 resolution can probably accept a 1080i signal but not a 1080p signal, so it's often referred to as a 1080i display. However, this does not mean it can show all 1920x1080 pixels in the signal—it must first deinterlace the 1080i signal, converting it to 1080p, then "scale" or downsize it to 1366x768.

Even more misleading, many of the early TVs with 1920x1080 resolution were touted as 1080p by their manufacturers, even though they could only accept a 1080i signal. This caught many consumers completely off guard when they tried to send a 1080p signal from, say, a Blu-ray player to a TV they thought could accept it, but the set actually couldn't, leading to much gnashing of teeth.

As to whether 1080i or 1080p is better, that depends on several factors, including how the material is originally captured, stored on a disc, sent to the display, and processed. Movies and many TV shows are captured on film at a rate of 24 frames per second (fps) and stored on a Blu-ray disc as 1080p at 24fps, which is often denoted "1080p/24."

When watching movies on Blu-ray, the best possible picture quality is achieved by sending a 1080p/24 signal from the player to a 1920x1080 display that can flash frames on the screen at a multiple of 24—48, 72, 96, or 120 frames per second. In commercial movie theaters, each film frame is flashed on the screen two or three times, depending on the particular film projector, resulting in a "refresh rate" of 48 or 72Hz. Few video displays can do this, but if you have one that does—such as a Pioneer Kuro plasma—it will provide the smoothest motion from Blu-ray discs played at 1080p/24.

Unfortunately, most HDTVs can display frames only at a rate of 60 per second. If your TV can accept a 1080p signal with 60 frames per second ("1080p/60"), it's relatively easy to derive this signal from 1080p/24 by repeating one frame twice, the next frame three times, the next one twice, the next one three times, and so on in a process called "3:2 pulldown" (sometimes called "2:3 pulldown"). This results in a 1080p/60 signal, which most modern TVs can accept and display.
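The 2-3-2-3 repeat pattern is simple enough to sketch in a few lines of Python; the frame labels stand in for real video frames, and the names are illustrative only:

```python
# Sketch of 3:2 pulldown: each 24fps frame is repeated twice, then the
# next three times, so one second of film (24 frames) fills 60 video
# frames. Labels stand in for real frames.

def pulldown_32(frames):
    output = []
    for i, frame in enumerate(frames):
        output.extend([frame] * (2 if i % 2 == 0 else 3))  # 2, 3, 2, 3, ...
    return output

film = [f"frame {n}" for n in range(24)]   # one second of 24fps film
video = pulldown_32(film)
print(len(video))  # 60: twelve frames x 2 plus twelve frames x 3
```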

Virtually all players can apply 3:2 pulldown to a 1080p/24 signal, and an increasing number of TVs can as well, though the process introduces a jerkiness to motion that is absent when 1080p/24 is displayed at a multiple of 24. Also, some players, notably most of the Toshiba HD DVD players (except the HD-XA2), first convert 1080p/24 to 1080i, then deinterlace that to 1080p/60, which results in a somewhat soft picture.

Many older HDTVs, even those with 1920x1080 resolution, are limited to accepting 1080i, which means the player must output a 1080i signal. In this case, the player starts by applying 3:2 pulldown to the movie frames. Then it discards the odd field from one frame and the even field from the next frame, repeating this process to end up with 60 fields per second in a 1080i signal. (A 1080i signal always conveys 60 fields per second, so there's no need to write "1080i/60" or, as some prefer, "1080i/30" to indicate 30 frames per second.) The combination of these processes—3:2 pulldown and interlacing—is sometimes called "telecine."
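The whole telecine chain (3:2 pulldown followed by alternating field selection) can be sketched the same way, again with toy frame labels rather than real video and with names of my own choosing:

```python
# Sketch of the telecine chain described above: 3:2 pulldown to 60
# frames, then keep one field per frame, alternating odd and even,
# yielding a 1080i-style stream of 60 fields per second.

def telecine(film_frames):
    # Step 1: 3:2 pulldown -- repeat film frames 2, 3, 2, 3, ... times.
    pulled = []
    for i, frame in enumerate(film_frames):
        pulled.extend([frame] * (2 if i % 2 == 0 else 3))
    # Step 2: keep one field per pulled-down frame, alternating parity.
    return [(frame, "odd" if i % 2 == 0 else "even")
            for i, frame in enumerate(pulled)]

film = [f"frame {n}" for n in range(24)]   # one second of film
fields = telecine(film)
print(len(fields))  # 60 fields per second
```

Notice that each film frame ends up contributing two or three fields, which is the cadence the TV must untangle on the other end.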

When the 1080i signal gets to the TV, it must be converted, or "deinterlaced," back to 1080p before it can be displayed, and the quality of the picture depends on how well this conversion is performed. It's not easy to deinterlace a film-based 1080i signal, since some of the paired fields are not from the same film frame.

The best way to do this is to discard the extra fields, pairing only matched fields into complete frames—a process called "inverse telecine"—and repeat frames in a 3:2 sequence to generate 60 frames per second. The end result is still jerkier than 1080p/24 displayed at a multiple of 24, but at least there are no mismatched fields trying to make a complete frame. In fact, there should theoretically be no difference between 3:2 pulldown performed on 1080p/24 frames and inverse telecine performed on film-based 1080i.
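Under the same toy model, inverse telecine is the round trip: collapse the repeated fields back to the 24 original film frames, then repeat them 3:2 to fill 60 progressive slots. A hypothetical sketch (names are illustrative, not any real processor's API):

```python
# Sketch of inverse telecine: pair only matched fields back into
# complete frames, then repeat frames 3:2 for 60 progressive frames.

def telecined_fields(film_frames):
    """Simulate a film-based 1080i stream: 3:2 pulldown, then one
    alternating field per frame (60 fields per second)."""
    pulled = []
    for i, frame in enumerate(film_frames):
        pulled.extend([frame] * (2 if i % 2 == 0 else 3))
    return [(frame, "odd" if i % 2 == 0 else "even")
            for i, frame in enumerate(pulled)]

def inverse_telecine(fields):
    # Consecutive fields from the same source frame collapse into one
    # complete progressive frame; the extra fields are discarded.
    frames = []
    for source, _parity in fields:
        if not frames or frames[-1] != source:
            frames.append(source)
    # Repeat frames 2, 3, 2, 3, ... to reach 60 progressive frames.
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (2 if i % 2 == 0 else 3))
    return out

film = [f"frame {n}" for n in range(24)]
video = inverse_telecine(telecined_fields(film))
print(len(video))  # 60 frames, each built from matched fields only
```

The output matches what 3:2 pulldown on the original 1080p/24 frames would have produced, which is why the two paths should theoretically look identical.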

If a TV cannot perform inverse telecine, it must deal with the fact that two out of every five pairs of fields do not match when they are combined to form a frame. When the fields do not match, the most common solution is to discard one of them and create a new field that more closely matches the remaining field, a process sometimes called "vertical interpolation" or "vertical averaging." However, this reduces the visible resolution of the image, softening the picture.
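One simple form of the vertical interpolation described above can be sketched as follows, using tiny numeric rows to stand in for pixel lines. Real processors are far more sophisticated; this only illustrates why the technique costs vertical detail (the synthesized lines are guesses, not picture information):

```python
# Sketch of vertical interpolation on a single kept field: the missing
# lines are estimated by averaging the real lines above and below.

def interpolate_field(field_lines):
    frame = []
    for i, line in enumerate(field_lines):
        frame.append(line)                       # real line from the field
        if i + 1 < len(field_lines):
            below = field_lines[i + 1]
            frame.append([(a + b) / 2 for a, b in zip(line, below)])
        else:
            frame.append(list(line))             # bottom edge: repeat
    return frame

field = [[10, 10], [20, 20], [40, 40]]   # three real lines, two pixels each
full = interpolate_field(field)
print(full[1], full[3])  # [15.0, 15.0] [30.0, 30.0] -- synthetic lines
```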

If your display is limited to 60Hz, even if it can accept 1080p/60, it might be better to send Blu-ray movies at 1080i and let the TV's processor do the deinterlacing. After all, which product is likely to have a better processor: a $3000 TV or a $500 player? Try sending both 1080i and 1080p/60 from the player to see which works better with your particular system.

Regarding broadcast HDTV, most signals are 1080i. When a high-def station broadcasts a movie or other film-based material, it applies telecine, but some HDTV shows are captured at 1080i, which causes its own set of problems. For example, capturing fast motion at 1080i can result in "stairstep" or "zipper" artifacts, in which edges of a moving object appear jagged because the object is in a different position in one field compared with its position in the other field. Smoothing out these so-called jaggies involves vertical averaging, which some processors do better than others.

One note on LCD TVs with 120Hz operation, which double the 60Hz rate at which frames are normally flashed on the screen. If you send a 1080p/24 signal to such a TV, it can generally do one of two things: repeat each frame five times or create new frames to insert between the actual frames in the signal. Creating new frames is done in one of several different ways, and each manufacturer has its own name for the process. Creating new frames can result in smoother, sharper motion, but it can also generate artifacts of its own. In most cases, you can enable and disable new-frame creation and see which way looks better to you.
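The two 120Hz options can be sketched with numbers standing in for frames. The blending interpolator here is deliberately naive; real sets use motion estimation, and each manufacturer's method is proprietary, so treat this purely as an illustration of the idea:

```python
# Two ways a 120Hz LCD can show a 1080p/24 signal, sketched with
# numbers standing in for frames.

def repeat_5x(frames):
    """Show each 24fps frame five times: 24 x 5 = 120 per second."""
    out = []
    for frame in frames:
        out.extend([frame] * 5)
    return out

def interpolate_120(frames):
    """Insert four synthetic frames between each pair of real ones.
    Naive linear blending stands in for real motion interpolation."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, 5):
            t = k / 5
            out.append(a * (1 - t) + b * t)   # synthetic in-between frame
    out.extend([frames[-1]] * 5)              # pad out the final frame
    return out

film = list(range(24))                        # one second of film frames
print(len(repeat_5x(film)), len(interpolate_120(film)))  # 120 120
```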

Bottom line: For the best picture quality on Blu-ray movies and other film-based material, send 1080p/24 from the player if you have a display with a refresh rate that is a multiple of 24. Also, send 1080p/24 if the display has a refresh rate of 60Hz and can accept a 1080p/24 signal. If the 60Hz display cannot accept 1080p/24, try sending it 1080p/60 and 1080i to see which looks better. If you have a 120Hz LCD TV, send it 1080p/24 and try enabling and disabling the "create new frames" mode to see which way looks better to you.

If you have an audio/video question for me, please send it to scott.wilkinson@sourceinterlink.com.

COMMENTS
Bill Z's picture

While I get the point about expecting a $2k TV to process better than a $500 player, a PS3 would seem to have substantially more processing power than my TV set, and I would have expected it to process better. My TV is a 1366x768 plasma (Panasonic 50PX75U), and I typically have the PS3 output 1080p/60 to the television, but I will check out the options and see if I can output 1080i and if there's a difference one way or the other. Thanks for the article, great stuff as usual.

Eric's picture

I don't understand your conclusion at all. Most players, except the Toshiba models you mention, are able to perform 3:2 pulldown on 1080p/24 material, resulting in a good 1080p/60 image. As soon as you send 1080i, you are at the mercy of your TV's deinterlacing, which, if you study the tests of TVs, is notoriously difficult. Only 40% of new TVs properly deinterlace film-based 1080i, and that number grows every year. A few years ago, 1080i processing was pathetic. If your TV is older than a few years, chances are it sucks at 1080i film detection, and your best bet is to send it 1080p/60 if it can accept it. Most TVs do not accept 1080p/24, so your main recommendation cannot be followed by 95% of your readership. Once you interlace an image, you are at the mercy of your processing, and this is why I am really confused by your recommendations. I thought I had this all figured out, and I'm pretty sure I am correct, so your post has me wondering about the reviews of this publication.

Scott Wilkinson's picture

Where did you get that 40% figure? Also, you say that number increases every year, which means that more and more TVs can properly deinterlace film-based 1080i, so the problem is getting better, not worse, over time. I agree that older TVs often have poor deinterlacing, but most of them cannot accept 1080p/60 either, so sending that signal is not an option in any event. In this case, the best results would be obtained with a good outboard video processor. I also agree that few TVs can accept 1080p/24, but that number is increasing as well; I've seen several 60Hz TVs that can accept 1080p/24. Please don't forget that I suggested trying it both ways (letting the TV deinterlace 1080i and sending 1080p/60 from the player if the TV can accept it) to see which way works better with the equipment you have.

Scott Wilkinson's picture

Dear readers, Paul Matwiy sent me an e-mail pointing out an error in this blog, which I verified. It had to do with my description of the way in which 1080p/24 is converted to 1080i. I have now corrected the error and updated the blog. I apologize for any confusion this may have caused.

Paul Matwiy's picture

I'd just like to support the conclusion that, if properly deinterlaced, sending a 1080i signal to a progressive display can provide a correct image, particularly if done over HDMI, which keeps the data in the digital domain. One unlooked-for side effect is an increase in video latency due to the processing times for interlacing and deinterlacing. This requires an adjustment of your processor's lip-sync parameter to re-establish sync with the picture. The only other side effect I have noticed is typically in supplemental materials, where the video will shift cadence between 24fps film and 30fps NTSC video. Occasionally, the cadence detectors can have difficulty switching for a few frames.

Bruce in CO's picture

Great info as always. I have an early Toshiba 1080i CRT, and will be getting Blu-Ray soon. Given the assumption that it doesn't have the processing horsepower to properly do any cadence adjustments, what should be output from the player to get the best picture? Do most Blu-ray players have a 1080i output?

Bruce in CO's picture

I should clarify - it's my TV that probably doesn't have the ability to properly convert anything, not the player.

Scott Wilkinson's picture

I assume the TV cannot accept a 1080p signal of any type, so your only option is sending 1080i from the player. (All Blu-ray players can output 1080i, so no problem there.) Since the TV is CRT-based, it most likely will not deinterlace the signal, but rather display the fields sequentially. This happens fast enough that your brain fuses the fields together, just as it fuses frames into apparently continuous motion, but I don't think it will ever look quite as good as a proper progressive signal on a good digital display.

Bruce in CO's picture

Thanks Scott. The old TV has served us well since the early days of HD, but it is approaching the time to upgrade to a new digital display. The picture is still nice, but is starting to lose its pop. However, its ability to show subtle details in scenes with shadows is still good and until recently neither plasma nor LCD could come close. Your reviews have been instrumental in convincing me that the newer displays (not all, but some) are beginning to solve the black level and shadow detail problem. Peak contrast is meaningless if the display can't resolve low level details.

Scott Wilkinson's picture

I couldn't agree more; shadow detail and black level are among the most important picture-quality criteria in my book, along with color accuracy. CRT does a great job in these areas, but the latest digital displays are looking mighty good. If you've got the scratch, the Pioneer Kuro plasmas are at the top of my list.

Charles's picture

I just purchased a plasma TV, and this was listed in the specs: "Supported TV formats: 1080i. Native panel resolution: 1024 x 768." Does this mean it is a 1080i TV? The rest of the specs can be found here: http://www.vizio.com/products/detail.aspx?pid=59 Thanks!

Scott Wilkinson's picture

Yes, it is a 1080i TV in the sense that it can accept a 1080i signal but not a 1080p signal. However, its native resolution is not 1080 anything, so calling it 1080i is a bit misleading.

Ed's picture

Thanks heaps for the explanation. So now, being in Australia on PAL at 25fps: if I buy a Blu-ray player, will it be 1080p/24 or 1080p/25? If it's 24 (which I assume it will be), does that mean some further video conversion occurs, and if so, does it cause some (slight) degradation going into a PAL TV? Or is it so transparent that for all intents and purposes there is no difference? Hope this question makes some sense as to what I am after.

D.Sypnier's picture

I know next to nothing about TVs, etc., but I do have a new Samsung 1080p; however, all I see on my cable box or TV is 1080i. The cable company said that's the broadcast station and has nothing to do with my TV settings or cable box. I saw some 1080p broadcasts early on, but not lately. 1080i is just not as sharp as 1080p. Why buy a 1080p if you cannot get the transmission?

JB's picture

I'm about to buy my first LCD HDTV; I want a good-quality set, 1080p, that will do everything right and justify its purchase price. I'm hung up on the still-unanswered question of how to make standard 1080i cable or satellite broadcast material appear with top HDTV resolution. The just-released LG 42LH55 set I've considered failed the 1080i deinterlace test the CNET reviewer gave it. Many reviewed sets fail this test. I simply cannot justify buying a set that won't do a decent HD job with standard broadcast material. As others have stated here, deinterlacing remains a problem. How can this issue with standard broadcast material be dealt with? To extend D. Sypnier's question, why buy any HDTV at all if it can't properly present standard 1080i broadcast material?

Alonza's picture

What complete system would you recommend? TV, DVD player, etc.

lloyd's picture

I have a Sony Wega 50-inch 1080i (projection), connected to a Sony Blu-ray player with HDMI cords. I was told it will never be as clear as 1080p because it's projection. It's about 3 years old now. The salesman said it would put out the best picture. Great picture, but not 1080p quality.
