More 1080p Questions
Asked by Aron:
Hi Geoff. Good article -- it's important for people to understand that, in principle, you don't lose info with 1080i. However, I think it's also important to emphasize that, in practice, deinterlacing is not trivial, and that therefore virtually all TVs (and progressive scan DVD players) introduce some degree of error into the process (even if it's not the kind of wholesale error that you uncovered in your recent article on bob vs. weave).
Also, suppose you have a TV that accepts 1080p/60 but not 1080p/24. In that case, 2:3 pulldown is unavoidable for film-sourced material. But is it at least possible to avoid interlacing/deinterlacing? That is, will any of the next-gen HD DVD or Blu-ray players that output 1080p/60 implement 2:3 pulldown using entire frames (rather than by creating interlaced fields, as is done in the current players)? I'm looking for a player that will do 1080p/24->1080p/60, rather than 1080p/24->1080i/60->1080p/60.
Answer: You are correct: it is possible to lose something in the interlacing or de-interlacing process. It's going to be very slight unless it's done horribly wrong. While it would surely be ideal to be able to output the source at its native rate and display it at its native rate, for the vast majority of displays this isn't possible (due to technology and $$$). What I hoped to point out is that the difference between 1080p and 1080i is incredibly slight, for the most part. I had been getting emails from people who seemed to think 1080p was twice as good as 1080i, and for movies this just isn't the case. In fact, on most TVs I doubt you'd see the loss of anything from correctly done de-interlacing, given all the other things a TV can do to screw up an image. Does seem worth investigating, though. Perhaps an article in the future.
I believe the next BD players from Pioneer and Panasonic will skip the interlace step and just add the 3:2 to the 1080p/24 to get 1080p/60 directly. The Samsung, being the first player, is doing something rather wacky with that extra step.
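Pulldown on whole progressive frames is simple enough to sketch. The function below is an illustrative model of the cadence, not any player's actual firmware: each 24fps film frame is repeated three times, then two, alternately, which turns 24 frames into exactly 60 without any interlaced fields ever being created.

```python
def pulldown_24_to_60(frames):
    """2:3 pulldown on whole progressive frames: repeat each
    24fps frame 3 or 2 times alternately, yielding 60 output
    frames for every 24 in, with no fields created."""
    out = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        out.extend([frame] * repeats)
    return out

# One second of film: 24 frames labeled A..X
second = [chr(ord('A') + i) for i in range(24)]
video = pulldown_24_to_60(second)
print(len(video))   # 60
print(video[:5])    # ['A', 'A', 'A', 'B', 'B']
```

Doing the same conversion via 1080i/60 means splitting each repeated frame into fields first and weaving them back together later, which is the extra step a good player (or TV) must get exactly right.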
Asked by TomH:
Great article, but how do I know if a TV de-interlaces 1080i correctly?
Answer: Unfortunately, there is no easy way for an end user to check this. You can check for your TV in this article or this article. With any luck, we'll do more of these on a regular basis. Also, we run that test on any 1080p display we get in here. Check back after CES; there may be some news about a way for you to do the test (but still not easily).
Asked by Mike:
Loved the article. What about those of us that purchased the 720p DLP TVs that were top of the line last year? How can we take advantage of the additional lines of resolution of the new video formats?
Answer: Just because your TV is 720p doesn't mean you won't be able to see the benefits of HD DVD and Blu-ray. This is a BIG question, and one I get a lot. If you can see a difference between HD and SD on your TV, then you will absolutely be able to see a difference between HD DVD/Blu-ray and DVD. If you like watching HD (and who doesn't), then you will love these formats, as they are the best looking HD you have ever seen. Vastly superior to the compressed crap we get on cable and satellite. A word of caution, though: the Samsung and Toshiba players have terrible 720p outputs, so you're better off sending your TV 1080i. Of course, that brings up the whole de-interlacing question again… so check which looks better on your TV.
Asked by Jake:
I have a Vizio P50 720p plasma... I notice that if I output 1080i from the Xbox360 and Sony upscaler DVD player, the image "looks" better/sharper. How is that explained relative to this article? Does 1080i technically & visually look better on 720p displays? Curious to know! :)
Answer: I have found similar things in my testing of many TVs, though I'm not entirely sure why this is the case, as 720p and 1080i have roughly the same bandwidth. There are many variables (including the source not doing a good job with 720p), but if it looks better with 1080i, stick with 1080i.
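The "roughly the same bandwidth" claim is easy to check with back-of-the-envelope pixel math. This counts raw active pixels per second and ignores blanking intervals:

```python
# Raw active pixel rates, in pixels per second (ignoring blanking)
rate_720p60 = 1280 * 720 * 60    # 60 full frames per second
rate_1080i60 = 1920 * 1080 * 30  # 30 full frames, delivered as 60 fields

print(rate_720p60)    # 55296000
print(rate_1080i60)   # 62208000  (same ballpark, about 12% more)
```

So 1080i actually carries slightly more pixel data per second, which is one reason neither format is a clear winner on paper.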
Asked by Dave:
So tell me this: on both my LCD and CRT computer monitor I'm sending an HD signal at a refresh of 72Hz. Now why doesn't the image look 'progressive' as it does on some of the plasma/LCD/DLP sets? The motion just doesn't seem as fluid even though I am running at the same or higher resolution!? This has been a huge question that has never been answered… or not that I have found :P
Answer: First, I want to be clear that resolution and interlacing are two completely different things. 1080i and 1080p are exactly the same resolution. This is more a difference of frame rate, really. 1080i has 30 different frames per second, and these are interlaced (split in half, if you will) and flashed on the screen fast enough that your brain combines them into full frames. 1080p is (as far as we're talking) 60 different frames per second. So there are, as far as the format goes, twice as many frames as 1080i. What the last article discussed is that as far as HD DVD and Blu-ray are concerned, those extra frames are created by the player, not present in the source. So in this case there is basically no (OK, Aron, a very, very slight) difference between 1080i and 1080p, as long as the de-interlacing is done correctly.
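The split-and-recombine idea can be shown with a tiny sketch, where strings stand in for scan lines. This is the "weave" recombination of the earlier bob-vs-weave article; it is an illustration of the principle, not any TV's actual circuitry:

```python
def interlace(frame):
    """Split one progressive frame into two fields:
    the even-numbered scan lines and the odd-numbered ones."""
    return frame[0::2], frame[1::2]  # top field, bottom field

def weave(top, bottom):
    """Recombine two fields taken from the same source frame.
    For film-sourced material this round trip is lossless."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = ["line%d" % n for n in range(1080)]  # 1080 scan lines
top, bottom = interlace(frame)
assert weave(top, bottom) == frame           # nothing lost
```

The catch, as Aron points out, is that the TV has to figure out which two fields belong to the same film frame before weaving; pairing the wrong fields is where real-world de-interlacers introduce errors.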
As far as your question goes, there can be a number of variables here. I'll take one that most people overlook. No matter what your refresh rate is, 24fps material (film) is not going to look as smooth as 30fps material (video). Ever notice that your local news looks one way and movies look another? That's video vs. film. You wouldn't want film to look as smooth as video; it wouldn't look right. Philips' PixelPlus circuitry makes film look like video. Some people like this; I for one don't (thankfully, you can turn it off). The advantage of the 72Hz refresh is that it takes the stutter out of the motion that happens when you double certain frames and triple others. The most noticeable example of this is a slow pan across a wide scene (say, a landscape). The camera will look like it's moving, then stutter, then move, then stutter. That's the 3:2. When you do 3:3, it will be smoother, but never as "fluid" as if the same scene had been shot on video. This is a good thing. Years of subconscious learning have trained our brains to equate film with fiction and video with reality. Who wants reality at the movies? It's also why 1080p/24 HD cameras are so prevalent, as film (and most fiction TV) directors want to keep that look.
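The stutter falls straight out of the cadence arithmetic. The sketch below (an illustration, with the hold times derived from the refresh period) shows how long each film frame sits on screen under each scheme:

```python
def frame_hold_times(cadence, refresh_hz):
    """How long each film frame stays on screen, in milliseconds,
    given how many refresh cycles each frame is repeated for."""
    period_ms = 1000.0 / refresh_hz
    return [round(n * period_ms, 1) for n in cadence]

# 60Hz with 3:2 pulldown: frames alternate 50ms and 33ms on screen
print(frame_hold_times([3, 2, 3, 2], 60))  # [50.0, 33.3, 50.0, 33.3]

# 72Hz with 3:3: every frame held the same 41.7ms
print(frame_hold_times([3, 3, 3, 3], 72))  # [41.7, 41.7, 41.7, 41.7]
```

The uneven 50/33ms alternation at 60Hz is exactly the stutter you see in a slow pan; at 72Hz every frame gets equal time, so the judder disappears even though the material is still 24fps film.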
Keep the great questions coming.