Which displays have it and which don't.
The current top HDTV broadcast resolution is 1080i (interlaced). Most broadcast and cable networks use it, including CBS, NBC, the WB, HBO, Showtime, HDNet, The Movie Channel, Starz HDTV, and others. What happens to this HDTV signal when one of the latest digital HDTVs processes it? Does the display use the full 1,080 lines of transmitted resolution, convert the signal from interlaced to progressive (a process called deinterlacing), detect and compensate for motion, and send the result to the screen, as it should? Or does the display's processor cheat you out of seeing all the detail in the broadcast?
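To make the distinction concrete, here is a minimal sketch of what a motion-adaptive deinterlacer does, written in Python with NumPy. The function name, the 8-bit luma-only inputs, and the `threshold` parameter are illustrative assumptions, not any particular chip's design. Where a pixel is static, it "weaves" the two fields together, preserving all 1,080 lines of detail; where it detects motion, it falls back to interpolating within one field to avoid combing artifacts. A processor that cheats skips the motion analysis and interpolates everywhere, throwing away half the vertical resolution.

```python
import numpy as np

def deinterlace_1080i(top_field, bottom_field, prev_bottom_field, threshold=12):
    """Combine one 1080i field pair into a 1080p frame (luma only).

    top_field, bottom_field: (540, W) uint8 arrays holding the even and
    odd scan lines of the current frame.
    prev_bottom_field: the bottom field from the previous frame, used
    only to detect per-pixel motion.
    """
    h, w = top_field.shape[0] * 2, top_field.shape[1]
    frame = np.empty((h, w), dtype=top_field.dtype)
    frame[0::2] = top_field  # top-field lines pass straight to the output

    # "Bob": vertically interpolate within the top field. Safe when there
    # is motion, but it only carries 540 lines of real detail.
    interp = np.empty_like(bottom_field)
    interp[:-1] = ((top_field[:-1].astype(np.uint16) +
                    top_field[1:].astype(np.uint16)) // 2).astype(top_field.dtype)
    interp[-1] = top_field[-1]  # no line below the last one; repeat it

    # Motion detection: where the bottom field changed since the previous
    # frame, weaving it in would produce combing artifacts, so interpolate
    # there instead. Static pixels keep the woven, full-resolution detail.
    motion = np.abs(bottom_field.astype(np.int16) -
                    prev_bottom_field.astype(np.int16)) > threshold
    frame[1::2] = np.where(motion, interp, bottom_field)
    return frame
```

The key design point is the per-pixel decision: a good processor weaves wherever it safely can and bobs only where it must, while a lazy one bobs the entire frame and never delivers the broadcast's full 1,080 lines.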