Musings on HDMI 1.3: Video

HDMI 1.0 was introduced to the market in 2002. As a means of carrying both digital audio and video between the source and the display on a single cable, it offered several advantages over competing technologies, the most prominent being IEEE 1394, commonly known as FireWire. It also offered alluring content-security advantages that appealed, in particular, to Hollywood.

As with any standard, however, many of the reasons for the adoption of HDMI are murky and mired in technology, economics, and industry politics. But it's now well established, and is likely to remain the best (wired) means of conveying high definition video from one place to another in the home for the foreseeable future—and beyond.

But HDMI remains something of a moving target. There have been five updates since its introduction (all backward compatible). HDMI 1.2 is the latest fully implemented version. It can pass the most common standard and high definition digital video formats available to consumers, up to and including 1080p/60. It can pass Dolby Digital and DTS audio, plus PCM digital audio from CD, DVD-Audio, and SACD (SACD via HDMI is typically converted from its native DSD form to PCM in the player). It can also pass along other PCM formats, such as multichannel PCM from a high definition Blu-ray player.

But never assume specific hardware capabilities based on reading the specs for any version of HDMI. Remember that HDMI is merely a connection format with certain specified capabilities. The only things that those who established the HDMI standards can control are the HDMI transmitter at the output of the source, the HDMI cable (sometimes—not all HDMI cables have been tested for full compliance), and the HDMI receiver at the input of the display or switcher. HDMI has little control over what the source or display manufacturer chooses to do outside of those boundaries. An AV receiver manufacturer, for example, may choose not to make use of the multichannel PCM audio that arrives at its HDMI video switcher. Or a manufacturer of a universal DVD player may choose not to pass SACD via HDMI.

Always check to make sure that the capabilities you want from HDMI are available in the specific product you are considering. Your best source of this information is likely to be the manufacturer, not the salesman at your neighborhood Best Circuit Shack.

Now we have HDMI 1.3, and consumer confusion is increasing exponentially, even as many early adopters insist on this new format in their next AV purchase. At present (April 2007), however, only a few products on the market have HDMI 1.3 outputs, and none have HDMI 1.3 inputs. (Fortunately, and as noted earlier, all generations of HDMI are backward compatible!) This source-receiver disconnect exists because HDMI transmitters and receivers are different devices, and the transmitters were available first. (The manufacturer of these chips is Silicon Image—the first HDMI 1.3 transmitter is the Silicon Image SiI9134 and the first receiver the SiI9135, for those who are fascinated by such details.)

But what can we expect from HDMI 1.3 once it becomes more commonplace? More important, should you hold off on any purchase decision until HDMI 1.3's enhanced capabilities show up in products you're interested in?

Let's look at some of the claimed benefits of HDMI 1.3, and what they mean in practice. I'll start with video, and to keep this blog from becoming a novella will follow up with audio in a Part 2, coming to a computer screen near you very soon now.

HDMI 1.3 video promises increased bandwidth, support for color depths of up to 16 bits per color, support for the new xvYCC color space, and increased refresh rates, such as 120Hz.

Let's begin with that 120Hz refresh rate. It's just starting to show up in displays, and does offer benefits, including less jerky motion (no 3/2 pulldown required) and, in some types of displays, reduction of motion blur. Both benefits, of course, depend on the implementation. For example, if you feed the display a 1080p/24, film-based source from, say, a Blu-ray player, and the display repeats each frame five times to reach 120Hz, no 3/2 pulldown will be needed. But the display might, instead, do its own internal 3/2 pulldown by converting the 24fps input to 60fps, and then frame double that to 120Hz. Is that a bug or a feature?

But I digress. Let's assume that a 120Hz refresh rate is a Good Thing. Keep in mind that source material is most often mastered at either 24Hz or 60Hz. No 120Hz sources exist; the 120Hz refresh is produced inside the display by converting the 24Hz or 60Hz input to 120Hz. No HDMI link in the system will need to carry 120Hz!
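To make the cadence arithmetic concrete, here's a toy sketch (the function names are my own, not from any real video API) comparing the two paths from 24fps to 120Hz:

```python
# Toy illustration of 24fps -> 120Hz conversion cadences.

def repeat_each(frames, n):
    """Repeat every frame n times (e.g., a 5:5 cadence for 24 -> 120)."""
    return [f for f in frames for _ in range(n)]

def pulldown_3_2(frames):
    """Classic 3:2 pulldown: alternate 3 and 2 repeats (24 -> 60)."""
    out = []
    for i, f in enumerate(frames):
        out.extend([f] * (3 if i % 2 == 0 else 2))
    return out

film = ["A", "B", "C", "D"]  # four film frames = 1/6 second at 24fps

direct = repeat_each(film, 5)                      # even 5-5-5-5 cadence
via_pulldown = repeat_each(pulldown_3_2(film), 2)  # uneven 6-4-6-4 cadence

print(direct)        # A x5, B x5, C x5, D x5: smooth motion
print(via_pulldown)  # A x6, B x4, C x6, D x4: film judder survives at 120Hz
```

Both paths arrive at 120Hz, but only the first removes the uneven cadence that causes film judder.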

An HDMI 1.3 link at 120Hz might be useful if we start to see external video processors capable of generating refresh rates up to 120Hz and video displays capable of accepting these higher rates. (Just because a display can show the 120Hz signal that it generates internally does not mean that it will accept a native 120Hz signal at its input.) But in the final analysis few of us will ever have reason to pass 120Hz video through an HDMI 1.3 connection.

Bandwidth in a digital system is most often specified as a data transfer rate—the number of bits of data per second that it can pass. HDMI 1.3 will pass a maximum of 10.2 gigabits per second. The specified maximum data rate for Blu-ray disc, the fastest home video format available, is 36 megabits per second. Faster speeds are possible and anticipated, but primarily for computer applications, not home video. (Keep in mind, too, that a disc's data rate describes compressed video; the HDMI link carries the video after the player has decoded it, which takes considerably more bits.)

HDMI 1.3's superfast data rate capability is, therefore, far more than will be needed for any current or anticipated consumer format. It's certainly future-proof, but even the data rate available on HDMI 1.1 and 1.2 is enough for any home video format available now, or anything likely to turn up in the foreseeable future.
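Here's the back-of-the-envelope math (my numbers, using the standard 1080p/60 raster timing) for what an uncompressed signal actually needs on the link:

```python
# Rough bandwidth math for uncompressed 1080p/60 over HDMI.
# The visible picture is 1920x1080; the transmitted raster, including
# blanking intervals, is 2200x1125 pixels per frame.

total_w, total_h, fps = 2200, 1125, 60
bits_per_pixel = 3 * 8            # RGB or YCbCr 4:4:4 at 8 bits per color

pixel_clock = total_w * total_h * fps      # 148,500,000 Hz (148.5 MHz)
payload = pixel_clock * bits_per_pixel     # ~3.56 Gbps of video data

print(f"pixel clock: {pixel_clock / 1e6:.1f} MHz")
print(f"video payload: {payload / 1e9:.2f} Gbps")

# HDMI 1.1/1.2 top out at a 165 MHz pixel clock; HDMI 1.3 allows 340 MHz.
# So 1080p/60 at 8 bits per color fits the older spec with room to spare;
# 1.3's extra headroom is there for Deep Color and higher refresh rates.
```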

Of course, if we add in the data throughput required by some of the other new features supported by HDMI 1.3, our data transfer rates could increase dramatically. How about HDMI 1.3's much trumpeted ability to provide greater color depth and a wider color space?

All consumer video formats are, at present, limited to an 8-bit color depth. But HDMI 1.3 supports 10-, 12-, and 16-bit color. Even moving up by just one step, to a 10-bit system, will increase the number of available colors from the 17 million available to us now to one billion. This new wrinkle is called "Deep Color." (The bit depths referenced here are bits per color. You'll often see articles or manufacturers' specs on this subject that refer to 24-, 30-, 36-, and 48-bit color. Each channel in a 3-channel color system can carry 8, 10, 12, or 16 bits, so those other references are talking about the same thing; they are simply adding up the total number of bits required for the separate red, green, and blue channels. The bigger numbers are also more impressive in advertising copy!)
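The arithmetic behind those color counts is simple enough to check for yourself:

```python
# Available colors as a function of bits per channel (3 channels: R, G, B).
for bits_per_channel in (8, 10, 12, 16):
    total_bits = 3 * bits_per_channel       # the "24/30/36/48-bit" figure
    colors = 2 ** total_bits
    print(f"{bits_per_channel:2d} bits/channel = {total_bits} bits total "
          f"-> {colors:,} colors")

# 8 bits/channel  -> 16,777,216 colors (today's ~17 million)
# 10 bits/channel -> 1,073,741,824 colors (Deep Color's "one billion")
```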

Deep Color may be used within the current high definition (ATSC) color space. But there's also a new color space in town. It's generically known as xvYCC (extended YCC colorimetry for video applications). You can bet, however, that manufacturers will come up with their own proprietary names for it. Sony, in fact, already has: x.v.Color.
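The trick behind xvYCC, in rough terms, is that it reuses YCbCr code values the old standard declared out of bounds. A toy sketch of the idea (my own code, using the standard BT.709 conversion coefficients, not anything from the spec itself):

```python
# How xvYCC widens the gamut: YCbCr -> RGB with BT.709 coefficients.
# Values are normalized: Y in [0, 1], Cb and Cr in [-0.5, 0.5].

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return r, g, b

# A code value that is legal in xvYCC but unused in conventional video:
r, g, b = ycbcr_to_rgb(0.5, -0.45, 0.45)
print(r, g, b)  # r > 1 and b < 0: a color outside the standard RGB gamut
```

Conventional video simply never uses YCbCr combinations that decode to RGB values below 0 or above 1; xvYCC assigns real, more saturated colors to them, which is how it extends the gamut without adding bits.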

I recall a Sony xvYCC demo at last fall's CEATEC in Japan. They had two monitors set up, one with xvYCC, the other without. The source was nearby: a colorful, still life scene captured by a video camera. I suspect there was something not quite right with the setup—perhaps something in the chain did not preserve xvYCC's supposed benefits—because the monitor with the standard colors clearly looked more like the source! The xvYCC monitor didn't look bad. In fact all eyes were drawn to its clearly brighter and more saturated colors. Was it appealing? Yes. Was it accurate? No.

That doesn't mean that xvYCC can't produce both compelling and accurate color, only that getting there is full of pitfalls. And judging by history and current consumer sets, display manufacturers don't have a very confidence-inspiring track record in getting the color correct—even with current technology.

But let's assume that both xvYCC and Deep Color are great things—done right, they may well be—and that we want them. While they may be implemented independently, let's also assume, to simplify the discussion, that they go together like chips and dip. What are the odds that we'll actually be enjoying them soon on our home systems?

Not good. For starters, no broadcast standard includes them. And neither of the new HD disc formats has them in its specifications. In theory, either source could carry them if the creators of the programming chose to ignore the standards. But even after scaling this particular hurdle, what else might stand in the way?

For starters, broadcast bandwidth is limited. And even Blu-ray, the disc format with the most storage capacity, is a finite bit-bucket. With xvYCC and Deep Color, even at a 10-bit color depth, every pixel needs 25 percent more bits before compression—30 instead of 24—to deliver 64 times as many possible colors: one billion instead of 17 million. Add a wider color gamut that must be preserved. As for downloading an HD movie with this added data, got a week?
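For the record, the raw numbers (pre-compression; how much of the increase survives onto a disc depends on how well the extra bits compress):

```python
# What 10-bit Deep Color does to raw (pre-compression) video data.
bits_8, bits_10 = 3 * 8, 3 * 10                # bits per pixel
growth = bits_10 / bits_8                      # 1.25x the data...
color_growth = (2 ** bits_10) / (2 ** bits_8)  # ...for 64x the colors

payload_8bit = 2200 * 1125 * 60 * bits_8       # uncompressed 1080p/60
print(f"data growth: {growth:.2f}x, color growth: {color_growth:.0f}x")
print(f"1080p/60 payload: {payload_8bit / 1e9:.2f} Gbps -> "
      f"{payload_8bit * growth / 1e9:.2f} Gbps")
```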

The only way to accommodate this added data within the available pipelines and storage devices would be to save on bits elsewhere. Reduced luminance resolution, anyone? Or how about highly compressed, multichannel MP3 audio? Instead of true high definition video, we'd have "HD-Quality" video. I think I've heard that one before.

Even if the storage problem could be solved—and some day it will be, though not likely with our current storage and transmission devices—we must have a source that has both of these new color enhancements to begin with. This extends all the way back to the film-to-video telecine step for movies, or the video camera for video. The master must be created in at least 10-bit color, with a wider color gamut, and preserved in that form every step of the way to the user's HD receiver or disc player. Then the link to the consumer's display must be able to pass it, and the display must be able to make full use of it. As stated earlier, the only links in this entire chain that HDMI 1.3 can control are the HDMI 1.3 transmitter at the player or other home source, the HDMI cable (perhaps), and the HDMI 1.3 receiver at the display. Everything else is up for grabs.

The film studios have transferred most of their important titles to high definition video over the past few years. How many of these titles do you suppose have been mastered in xvYCC? Best guess: none. On the plus side, however, most HD video masters are produced in 10-bit video, which is currently downconverted to 8-bit for commercial release. So masters are available to support, at the least, Deep Color at 10 bits. 12-bit, however, is unlikely, and 16-bit is a technological pipe dream, even assuming that the human eye could distinguish that many colors.

The bottom line is that most of the video benefits of xvYCC and Deep Color are a very long way from commercial realization on normal film and video program material. Their use in camcorders, however, is likely, and xvYCC-capable camcorders are just starting to appear. And video games could provide a possible application as well, though I wonder if gamers will really appreciate deeper, richer colors as they blast that next alien coming around the corner! In my opinion, the video benefits of HDMI 1.3 aren't likely to affect most of your home video watching for many years, if ever.

But HDMI 1.3's video capabilities may well affect the way manufacturers promote their products. Feeding any current video source (standard or high def) into a display with xvYCC and/or Deep Color engaged (will these features be defeatable?) will result in the wrong colors. They may be pleasing (we haven't yet seen enough such products to say for certain), but they will still be wrong. That won't stop manufacturers from trumpeting that their sets (or HD disc players), fully equipped with HDMI 1.3, produce colors that are "longer, lower, and wider."
