1080i v. 1080p

There has been a lot of concern and confusion over the difference between 1080i and 1080p. This stems from the inability of many TVs to accept 1080p. To make matters worse, the help lines at many of the TV manufacturers (that means you, Sony) are telling people that their newly bought 1080p displays are really 1080i. They are idiots, so let me say this in big bold print: as far as movies are concerned, THERE IS NO DIFFERENCE BETWEEN 1080i AND 1080p. See, I did it in caps too, so it must be true. Let me explain (if your eyes glaze over, the short version is at the end).

For clarification, let me start by saying that there are essentially no 1080i TVs anymore. Unless you bought a CRT-based TV, every modern TV (LCD, plasma, LCoS, DLP) is progressive scan. They are incapable of displaying a 1080i signal as 1080i. So what we’re talking about here mostly applies to people with 1080p native displays.

Movies and almost all TV shows are shot at 24 frames per second (either on film or on 24fps HD cameras). Nearly every TV in the US has a refresh rate of 60Hz, meaning the screen refreshes 60 times a second. In order to display something that is 24fps on something that is essentially 60fps, you need to repeat frames. This is done using a method called 3:2 pulldown (or more accurately 2:3 pulldown). The first frame of film is doubled, the second frame of film is tripled, the third frame is doubled, and so on, creating a 2,3,2,3,2,3 sequence. It basically looks like this: 1a,1b,2a,2b,2c,3a,3b,4a… Each number is the original film frame. This lovely piece of math allows 24fps film to be converted to be displayed on 60Hz products (nearly every TV in the US, ever).
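
If it helps to see the cadence spelled out, here is a minimal sketch of the repetition pattern. The frame labels and the `pulldown` function are made up purely for illustration; this isn't any player's or TV's actual code.

```python
# Sketch: the 2,3,2,3... repetition described above.
# "frames" are just labels standing in for 24fps film frames.

def pulldown(frames, cadence=(2, 3)):
    """Repeat each source frame according to the cadence (2,3,2,3,...)."""
    out = []
    for i, frame in enumerate(frames):
        repeats = cadence[i % len(cadence)]
        out.extend([frame] * repeats)
    return out

film = ["1", "2", "3", "4"]   # four film frames = 1/6 of a second at 24fps
print(pulldown(film))         # ['1','1','2','2','2','3','3','4','4','4'] = 10 of 60Hz
```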

This can be done in a number of places. With DVDs, it was all done in the player. With HD DVD, it is done in the player to output 1080i. With Blu-ray, there are a few options. The first player, the Samsung, added the 3:2 pulldown to the signal, interlaced it, and then output that (1080i), or de-interlaced the same signal and output that (1080p). In this case, the only difference between 1080i and 1080p is where the de-interlacing is done. If you send the TV 1080i, the TV de-interlaces it to 1080p. If you send your TV the 1080p signal, the player is de-interlacing the signal. As long as your TV is de-interlacing the 1080i correctly, then there is no difference. Check out this article for more info on that.
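
Why is that round trip harmless for film-sourced material? Because both fields of a 1080i frame come from the same film frame, so simply weaving them back together recovers the original exactly. A rough sketch of the idea, with made-up row labels (not any player's or TV's actual code):

```python
# Sketch: interlace a frame into two fields, then weave them back together.
# For film-sourced content both fields come from one frame, so nothing is lost.

def interlace(frame):
    """Split a frame (a list of rows) into a top field and a bottom field."""
    return frame[0::2], frame[1::2]

def weave(top_field, bottom_field):
    """Recombine two fields of the same film frame into one full frame."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.extend([top_row, bottom_row])
    return frame

original = [f"row{n}" for n in range(1080)]   # stand-in for a 1080-line frame
top, bottom = interlace(original)             # what the 1080i signal carries
assert weave(top, bottom) == original         # weaving restores the frame exactly
```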

The next Blu-ray players (from Pioneer and the like) will have an additional option. They will be able to output the 1080p/24 from the disc directly. At first you may think that if your TV doesn't accept 1080p, you'll miss out on being able to see the "unmolested" 1080p/24 from the disc. Well even if your TV could accept the 1080p/24, your TV would still have to add the 3:2 pulldown itself (the TV is still 60Hz). So you're not seeing the 1080p/24 regardless.

The only exception to that rule is if you can change the refresh rate on the TV. Pioneer's plasmas can be set to refresh at 72 Hz. These will take the 1080p/24 and do a simple 3:3 pulldown (repeating each frame 3 times).
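
The 72 Hz case is just the same repetition with an even cadence, so every film frame is on screen for the same length of time (no 2:3 judder). A quick sketch, again with made-up frame labels:

```python
# Sketch: 3:3 pulldown -- each 24fps frame shown three times at 72 Hz.
film = ["1", "2", "3", "4"]                      # 1/6 of a second of film
at_72hz = [frame for frame in film for _ in range(3)]
print(at_72hz)   # ['1','1','1','2','2','2','3','3','3','4','4','4'] = 12 of 72Hz
```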

Short Version
What this all means is this:

• When it comes to movies (as in HD DVD and Blu-ray) there will be no visible difference between the 1080i signal and the 1080p signal, as long as your TV correctly de-interlaces 1080i. So even if you could input 1080p, you wouldn't see a difference (because there is none).

• There is no additional or new information in a 1080p signal from movie-based content.

• The only time you would see a difference is if you have native 1080p/60 content, which at this point would only come from a PC and maybe the PS3. 1080p/60 does have more information than 1080i/30, but unless you're a gamer you will probably never see native 1080p/60 content. It is incredibly unlikely that they will ever broadcast 1080p (too much bandwidth) or that 1080p/60 content will show up on discs (too much storage space and no one is using it to record/film).

So all of you people who bought 1080p displays only to be told by the companies that you had bought 1080i TVs, relax. The TV will convert everything to 1080p. Now if you bought a TV that doesn't de-interlace 1080i correctly, well, that's a whole other story.

COMMENTS
Zoltan Kovacs

Thanks for the great, informative article! Now I understand this whole 1080p stuff much more clearly!

John Pritchard

Thank you!!!!!!!

John Higgins

Matt, I took a look at the Microsoft site you mentioned and unfortunately all the clips are encoded at 24 fps. I'd have to agree with Geoff that there is little chance we'll see 1080p/60 in any great scale.

Matt

Ah you're right. I'm blind. So it sounds like the only potential video source we'll see that could output 1080p/60 is video games.

Geoffrey Morrison

If your TV doesn

Geoffrey Morrison

I just realized that previous post wasn

Aron

Hi Geoff. Good article -- it's important for people to understand that, in principle, you don't lose info with 1080i. However, I think it's also important to emphasize that, in practice, deinterlacing is not trivial, and that therefore virtually all TVs (and progressive scan DVD players) introduce some degree of error into the process (even if it's not the kind of wholesale error that you uncovered in your recent article on bob vs. weave). Further, is it really the case that you don't lose anything if your TV deinterlaces perfectly? I understand that interlacing is even harder than deinterlacing. And in order for your statement to be true, the interlacer would also have to be perfect (which it may not be). So while interlacing and deinterlacing between source and display can be done perfectly in theory, in practice you're probably better off if you can go directly from a 1080p source to a 1080p display without introducing those two extra conversion steps.

TomH

Great article, but how do I know if a TV de-interlaces 1080i correctly?

Aron

Follow-up to my previous post: Suppose you have a TV that accepts 1080p/60, but not 1080p/24. In that case, 2:3 pulldown is unavoidable for film-sourced material. But is it at least possible to avoid interlacing/deinterlacing? I.e., will any of the next-gen HD DVD or Blu-ray players that output 1080p/60 implement 2:3 pulldown using entire frames (rather than by creating interlaced fields, as is done in the current players)? I.e., I'm looking for a player that will do 1080p/24->1080p/60, rather than 1080p/24->1080i/60->1080p/60. OR--does this alternative approach for doing 2:3 pulldown without interlacing (again, with frames instead of fields) create its own problems?

Mike

Loved the article. What about those of us that purchased the 720p DLP TVs that were top of the line last year? How can we take advantage of the additional lines of resolution of the new video formats? Are there any tips for the Toshiba 56hmx85?

Jake

I have a Vizio P50 720p plasma... I notice that if I output 1080i from the Xbox 360 and Sony upscaling DVD player, the image "looks" better/sharper. How is that explained relative to this article? Does 1080i technically & visually look better on 720p displays? Curious to know! :)

Christian Guti

It seems the best thing to do is buy a 72 Hz TV. That is the best way to watch 24fps native content. Since a lot of computer monitors accept that refresh rate, it's weird that so few TVs do.

Dave

So tell me this: on both my LCD and CRT computer monitors I'm sending an HD signal at a refresh of 72Hz. Now why doesn't the image look 'progressive' like some of the plasma/LCD/DLP sets at, say, Futureshop? The motion just doesn't seem as fluid even though I am running at the same or higher resolution!? This has been a huge question that has never been answered... or that I have found :P

Geoffrey Morrison

Wow, you all have really good questions. I'm going to do a new blog post to answer them.

fluppeteer

1) I believe all HD-Ready branded HDTVs are obliged to support both 50Hz and 60Hz signals. The US *could* follow Europe, and play movies (slightly fast) with a 2:1 pull-down.

2) You presume that a 1080p/30 (or 1080p/24) source will reach the TV unmodified. Having a TV accept a 1080p/60 signal means that one could use an external scaler with a different (more expensive) motion prediction implementation, which may well result in a smoother perceived picture - even though the extra information is "made up". Similarly, a television with 1080p/60 *can* use the deinterlacer/scaler in the decoding device, whereas a television which only accepts 1080i forces the owner to accept its implementation, which may be inferior.

3) Although films are usually stored at 1080p/24, not all content has to be. Content aimed at the next generation disk formats could sensibly use 1080p/60 - this would allow the best reproduction on both 1080i and 720p/60 screens. Sony *do* have a HDTV camera that records 1080p/60.

My $.02. :

Geoffrey Morrison

Flup, here you go, in order:
1) I doubt manufacturers are going to implement 50Hz modes in the States when it would be perceived as

Brian W.

In your article, you said: "For clarification, let me start by saying that there are essentially no 1080i TVs anymore. Unless you bought a CRT based TV..." So what does this mean for those of us that DO have 1080i TVs? I own a 34" Sony Wega (model KV-34HS420). It is a CRT TV that supports 1080i, so I'm assuming that my type of TV is what you are talking about in the quote above.

Geoffrey Morrison

If you do have a 1080i TV (either a direct-view CRT or a CRT RPTV), then you don't need to worry about any of this. Just send it 1080i and you're all set.

fluppeteer

Thanks, Geoff. I'd not realized the ATSC hadn't forced 50Hz compatibility in US HDTVs while Europe was requiring both 60Hz and 50Hz. It would've been a way to ensure that programmes imported from Europe didn't need pull-up, and mean one could produce one television world-wide. Since they didn't, I can see 72Hz would appeal!

Re. external scalers: true, I doubt most people will buy boxes just for that (although some will), but it doesn't mean your Blu-Ray player (which has the compressed stream motion vectors to play with) might not do better than the TV. Why isn't this a disadvantage of a 1080i input screen?

Agreed 1080p/60 won't be broadcast/in cinemas, but new disks have spare room. 1080i on 720p and 720p on 1080i lose quality. Why not get the best from both? Space and bandwidth get cheaper; it's only 2x. This could do as much to increase quality on the wrong screen as bandwidth gains over broadcast HDTV or h.264. Once stored, it's worth displaying directly. 1080i doesn't hurt *yet*, but it migh

fluppeteer

Argh. Sorry, everyone. How does one insert line breaks?

Tony

As a PC gamer I do want to use a large-screen TV for games. From the above it seems important to me that I get a true 1080p TV that accepts 1920x1080 resolution from my PC video card (which is able to produce that). Any recommendations for me re DLP vs LCoS? (I figure LCD and plasma are not suitable.) Any recommendations re how to get the best image from my computer onscreen? Thanks all. This is a very useful forum.

Geoffrey Morrison

You want to look for not just a TV that will accept 1080p, but one that will accept a 1080p signal from a computer (over RGB, DVI, or HDMI). Not all will. It will almost always say so in the owner's manual (which you should be able to find online).

hdtvrocks

3:2 pulldown on a 1080p/24 source to 1080p/60 will introduce judder. Something you can't get around if your display can only do 60Hz, BUT there are quite a few displays on the market today and in the future that do support multiples of 24, like 48Hz or 72Hz. Most of the front projectors I've been looking at support multiples of 24. Isn't saying "All TVs have a refresh rate of 60Hz" misleading?

Mel

After reading the article and comments I'm still confused. Should the consumer population be laying out more money to purchase a set capable of supporting 1080p input, or not? If it is not likely in the near future for source content to be in 1080p, what's the big deal?

Geoffrey Morrison

A 1080p input would be ideal, especially if you're a gamer. If all you watch is movies, and the TV de-interlaces 1080i correctly, then it

Mark

A question please from your average consumer. I am about to purchase a projector for my HT. I understand that 1080P presents all pixels simultaneously and 1080i scans and presents several times each second (3/2). Do you suggest buying the 1080P or 1080i? I plan on watching the new HD-DVDs (don't like Sony and won't buy their products ever) along with my present collection of DVDs. Will my old DVDs look better on the 1080P or will I only receive benefit from HD-DVD on the 1080P? The kids probably won't be using the projector for games to keep bulb life extended. After reading the article by Geoffrey I still have to ask that if the 1080i is converting the signal up to the 1080P standard will I really be able to recognize the difference? I plan on using a 120 inch 16x9 screen. Also, does the new 1080P give you less eye fatigue when viewing for extended periods because it may be a purer picture? Thank you.

Geoffrey Morrison

Unless you're buying a CRT projector (which I doubt), then you'll be getting 1080p or 720p. Both are fine. No modern projector is interlaced.

Sonisame

But there are roughly 44% more pixels in a 1080p display compared to a 720p display. That's why having a 1080p display (for all digital displays) is ideal for a 1080i signal. Displaying 1080i on a 720p display is not as good as displaying it on a 1080p display. Hence everyone is running to get a 1080p display so that they can display a 1080i signal in its true originality.

Bradley

So, as I'm sure many of us are wondering... which is better? 1080p DLP, or 720p (768p) plasma like Samsung's new HP-S5072? If we want plasma, should we wait for 1080p plasma (although it probably won't be "affordable" for another year or two)?

Todd R.

I purchased the October HT issue at Barnes & Noble and found it to be very informative. I am going to subscribe. My old Mitsubishi RPTV crapped out on us, so we will be buying a new HD TV very soon. I was close to pulling the trigger on a Samsung HL-S5686W 720p model, but this whole 720p or 1080p dilemma has made me slow down and dive into the technology. Your stance of "You don't need 1080p" has made me flip back to seriously considering buying it (not to mention you have it as a favorably reviewed unit on the website); however, it is not listed as a PASS or FAIL on your list of tested 1080i deinterlacing results. Do you know if this model passes or fails? Thank you for your expertise, and I look forward to reading HT for many years to come.
