UHD Blu-ray vs. HDMI: Let the Battle Begin (Page 2)

With this in mind, I performed a test to see which of the supplied cables could handle these high-bandwidth signals. I had 10 cables in total: six from AudioQuest (from their Forest and Carbon lines, at different lengths) and one each from Monoprice, Wireworld, MyCableMart.com, and Monster. Lengths ranged from 25 feet to 35 feet. To conduct the tests, I started by checking whether each cable would sync on the Samsung's home splash page menu as the baseline. Then I queued up a 4K/60Hz YouTube video—if the cable could play a 5-minute, 20-second video about Costa Rica without any dropouts, it received a passing grade. I also tested each cable with the less bandwidth-intensive 2160P/24 output of a UHD Blu-ray disc. Each test was conducted twice—once through my pre/pro and once direct to the projector. My findings are summarized in this table:

Through my testing, I discovered a couple of things. First, some cables hooked directly from the Samsung to the JVC projector would not sync. In each case, these were the active cables that have a transmitter on the source side and a receiver on the display side of the cable. But when these same cables were used through the Marantz pre/pro, they were at least able to lock on to the signal temporarily (with varying degrees of “sparkles” on the display and dropouts occurring during the YouTube 4K/60Hz video). This leads me to believe that the 5-volt output that the Samsung is spec'd to provide via its HDMI port may be underpowered, considering these cables did work when connected through the pre/pro. I don’t have the equipment necessary to test this theory, so it's still conjecture. But that's a logical conclusion.

Of all the cables, I was particularly impressed with the AudioQuest models for a number of reasons. First, the build quality is excellent. They provide a snug fit in the HDMI port, and both the Forest and Carbon cables were able to lock in on a high-bandwidth signal at an 8-meter (26-foot) length, which is very impressive. At 10 meters, the Forest was able to negotiate a handshake for an instant, but there were ample “sparkles” on the screen and it would lose connectivity every 60 seconds or so. That wasn’t the case with the 10-meter Carbon cable—it was able to sync and never lost it. AudioQuest chalks this up to the fact that the Carbon uses 5 percent silver in and on its conductors (versus 0.5 percent in the Forest), and the added silver does make a difference. But silver isn’t cheap—which is why the Carbon cables cost more than three times as much as the Forest cables at my tested lengths, peaking at $889 for the 10-meter size. That's more than twice the cost of Samsung's UHD Blu-ray player at its current $399 asking price. I guess you get what you pay for—in this case, it’s silver.

Can't Live With It, Can't Live...
In the end, I’ve come to the following conclusion: HDMI sucks. First, it’s a pain to pull through walls and conduit given its large connector. Second, it comes unplugged too easily. And third, it’s not very install-friendly over longer lengths because the signal degrades rather substantially.

My testing highlights the worst-case/most-demanding scenario for HDMI bandwidth with signals of 3840x2160P YCbCr 4:4:4 at 60Hz—which we really won’t see in commercially produced material for years to come, so I wouldn’t panic just yet. On the plus side, the rear of my pre/pro isn’t nearly as cluttered as it used to be—I’m not sure that’s a fair trade, though!

In closing, I’d like to point out that what I could be facing here is an EDID (Extended Display Identification Data) issue specific to my projector that's rearing its ugly head. Samsung has told me that its player’s default output for its home splash screen is supposed to be 2160P/60Hz 4:2:0 8-bit, but the JVC may be requesting a 2160P/60Hz 4:4:4 8-bit signal in the handshaking process and the player is just honoring the request. I've contacted JVC about this as well, and they're looking into their projector's interaction with the new Samsung player. Before I go through the process of snaking a new (and potentially expensive) cable through my wall and ceiling, I’m going to wait a bit and hope that firmware updates from JVC and Samsung will resolve this issue. If not, well, competitive players are said to be due this year from several other manufacturers. Let the UHD Blu-ray upgrade cycle begin!

Update: Following the posting of this story, we were contacted by David Salz, president and founder of Wireworld, saying that the company has recently changed the active chip in the Starlight 7 cable used for this test from the older 10.2 Gbps version to a new 18 Gbps version with full HDMI 2.0 capability.

COMMENTS
brenro's picture

I despise HDMI. If it were designed more for passing a signal instead of copyright protection, there would be no problems. Most AV gear can be connected with component video and analog audio, so why do we need this extra layer?

David Vaughn's picture
Totally agree.
Mark Tiras's picture

Another thing that I hate is that you have no idea what you are buying and installing, because neither the package nor the item itself tells us whether we are getting a 1.4 or a 2.0 product. No markings on the product or packaging. This should be against electronics standards law. The cable should carry markings showing it is HDMI, which series it is, at least what it can do, and a manufacturing date. What is happening is that manufacturers can dump old products on installers and buyers who cannot know which series they are installing. It should be against the law.

Deus02's picture

Maybe I misinterpreted the information I saw, but during the CES show, in viewing several YouTube videos interviewing a number of "experts" from the various companies, I understood that the new UHD players would be able to automatically adjust their output depending on the connections and the monitor's video capability. Based on this experience, with the player reverting to its one maximum video output resolution without any option, that does not seem to be the case.

Perhaps just another reason never to buy a first-generation product.

nathan_h's picture

Phew, I guess the good news is that some high end cables work -- and that some vendors are admirably frank about the limits of their cables.

I wonder whether something like Blue Jeans Cable is able to handle it at a lower price point.

David Vaughn's picture
When sourcing cables, I went by claims made on their websites. Blue Jeans stated that only their 5-meter cable was certified for 4K, and I didn't see a longer option available.
nathan_h's picture

Whoops, my mistake. You were testing only 10-meter (approx.) length cables, and Blue Jeans doesn't have an 18 Gbps option at that length yet.

TheJoBoo's picture

Ever heard of DPL Labs? They offer the most stringent third-party certification for 10.2 and 18 Gbps HDMI cables. (I have no financial affiliation with them.) Currently the ONLY cable for distance runs of 18 Gbps is Tributaries' Aurora OM3-based optical solution. HDBaseT chipsets are solely based on a 10.2 Gbps architecture. Due to the costs involved with developing a next-generation version, we are unlikely to see an 18 Gbps HDBaseT chip in a VERY long time, if ever. The dirt-cheap, buy-them-by-the-container-with-no-quality-control-and-don't-bother-to-certify-them HDMI cable companies like Monoprice will NEVER be the answer to such high demands on quality and data throughput. If you want real UHD/4K, you have to step up. This is not an "everybody gets one" situation. Don't you want to feel special? Don't you want to be the only kid on the block with that cool new toy? Buck up and pay for it.

David Vaughn's picture
I agree that optical is the best solution, but it's very expensive. AudioQuest was up front in stating that their cables might not pass the test, but they did (except for the 10-meter Forest cable), and I think it goes to show that the added silver does make a difference. As you pass 8 meters, your costs are going to go up substantially.
countrybread's picture

This is useful information, but meanwhile there are 4K discs to be bought and very little mentioned about them. They don't even have a demo set up yet at my local Best Buy. The format definitely needs more time in the oven.

David Vaughn's picture
We have a big feature coming up soon in the print magazine.
mikem's picture

In over 55 years of A/V exposure, nothing is worse than HDMI. If I had a penny for every time the HDMI cable became disconnected, froze up, or the inevitable 'handshaking' gremlin reared its ugly head, I'd be a millionaire. What a piece of crap technology.

hbomb7's picture

I totally agree that HDMI is useless. I know it's too late, but it would have been nice if HDMI had been researched/studied a little more closely, because I believe it would not have been given the green light for public consumption. What a big mistake by the A/V industry!!!!!

specialk's picture

Since your projector is not true 4K, I wonder if a native 4K projector from Sony or a 4K TV would help.

Since I assume you don't have a 4K Sony projector lying around, did you test any of this when hooked up to a 4K TV?

David Vaughn's picture
It has nothing to do with the display not being native 4K...it accepts a 4K signal, it just uses e-shift for the resolution. HDR and WCG are both supported. Also, if it were a case of not being native 4K, then NO cable should work or ALL of them should work. Not a mix of cables working and not working. It's a cable bandwidth issue.
Rodolfo's picture

My Sony 4K VW1100 projector works well with an AudioQuest Cinnamon 10-meter cable and this player, but I added another 2-meter high-speed cable to it for a better rack position and it stopped working.

The Sony 4K player has been working with the same installation for almost 4 years, although it is 8-bit 4:2:0, and the 2-meter extension does not affect it; but if I put an HDMI switcher in the path to manage the 4K sources, even the Sony player does not work.

However, there is no need for consumer content, which is typically stored or streamed as 4:2:0, to be upscaled for purposes of local HDMI delivery to the display device.

All consumer 4K and non-4K content has been (and for the near future will be) the same way: 4:2:0.

Upscaling to a color subsampling of 4:4:4 or even 4:2:2 is a waste when the display has been doing that for years and will keep doing it; in upscaling, the player is misusing HDMI bandwidth that is needed for 10-bit 4K Blu-ray.

Thanks for the article David, and for the experiment. The picture quality of the 4K Blu-ray Disc is excellent; streaming 4K is a mixed bag depending on the camera, the source, the ability of the cameraman, and of course the butchering compression applied by most services, which also affects the 4K content I download from Sony's Entertainment Service.

Best Regards,

Rodolfo La Maestra

MMK's picture

The one advantage of HDMI that was noted in the article is the lack of clutter, which also means ease of hookup for people who aren't in the least technical. A few years ago I was in Australia (and would be for the next 8 weeks) when my wife e-mailed me from home in the US saying that our AVR had died and she couldn't watch TV anymore. Because everything was HDMI, I was able to talk my daughter through the changes to the cabling, which I would never have been able to do with DVI/RCA/optical.
Still, HDMI could be much better.

dmiller68's picture

I'm a little confused by this article; I have had no issues with my HDMI cables. I run this cable with zero issues, straight from the player to my TV, and I believe it is one of the cables he listed as a fail. Is the real issue the projector, or is something else causing his problem? http://www.amazon.com/Monoprice-Cabernet-Certified-Supports-Ethernet/dp/...

David Vaughn's picture
You may have a cable that works; in my case it didn't. It could be due to quality control on the cables. Also, what signal is the Samsung putting out? In my case, the JVC, via its EDID information, is asking for 4K/60 4:4:4, which is a bandwidth hog...in your case, it may be asking for a lower resolution depending on your display. Recently JVC updated their firmware and the Monoprice cable now works--at 1080P only! The splash screen comes up at 1080P, but when you put a 4K movie in, the signal is locked at 1080P too! Switch to one of the AudioQuest cables and it syncs at 4K/60 4:4:4 for the splash screen, and the 4K movies play at 2160P/24 as they should. I surmise that with the Cabernet cable, a speed test is done during the EDID handshake and it locks in at 1080P and won't come out of that mode, no matter what.
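For the technically curious, here's a rough sketch (my own illustration, not any player's actual firmware) of the capability exchange being described: the source reads the display's EDID and looks up the maximum TMDS character rate the sink advertises, which is what ultimately determines whether a 4K/60 4:4:4 handshake is even attempted. Offsets follow the CEA-861 extension block and HDMI Forum Vendor-Specific Data Block layouts:

```python
# Sketch only: how a source might read the sink's advertised max TMDS
# character rate from a 256-byte EDID. Not any product's actual code.

def sink_max_tmds_mhz(edid: bytes):
    """Return the advertised max TMDS character rate in MHz, or None."""
    if len(edid) < 256 or edid[126] == 0:       # byte 126 = extension count
        return None                              # no CEA-861 extension block
    ext = edid[128:256]                          # first extension block
    i = 4                                        # data blocks start at byte 4
    while i < ext[2]:                            # byte 2 = detailed-timing offset
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        if tag == 3 and ext[i + 1:i + 4] == b"\xd8\x5d\xc4":  # HDMI Forum OUI
            return ext[i + 5] * 5                # rate is stored in 5 MHz units
        i += 1 + length
    return None

# A sink returning 600 here supports the full 18 Gbps HDMI 2.0 rate;
# older 10.2 Gbps (HDMI 1.4) sinks typically omit this block entirely.
```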
dmiller68's picture

Maybe it is because the JVC projector is not a true 4K projector? It is using e-shift, not true 4K chips. I'm using a Samsung 85" 4K TV (UN85HU8550) plus the Samsung Evolution Kit (SEK-3500U/ZA), which gives me 4K plus HDR10. Just a thought. Anyway, good luck. I just think it could be misinformation to say the cable is at fault.

David Vaughn's picture
This has absolutely nothing to do with native 4K or not. It has to do with the length of the cable and the EDID information that is being exchanged between the player, pre/pro, and projector. Nothing more, nothing less. As for your particular setup, how long of a cable are you using? Also, this is not misinformation in any way, shape, or form. Check out the Samsung player thread at AVS...there are plenty of people having issues with multiple displays, not just JVC projectors.
dmiller68's picture

I posted it in my first post. It is a Monoprice Cabernet Ultra Certified High Speed Active HDMI cable, 30 feet, supporting Ethernet, 3D, and Audio Return, and CL2-rated. However, I also had no issues running it over my older Monoprice non-Cabernet 30-foot 10 Gbps cable (my guess, since I bought it in 2011). You're the first person I have seen blame their 4K issues on cables. Normally it is a non-HDCP 2.2 device somewhere in the path. I'm active on several Facebook home theater sites and home theater forums and haven't seen any complaints of these issues. Since I'm normally the one with the issues (my just-over-a-year-old receiver didn't support HDCP 2.2, so I had to trade it in for one that did), I have now switched all my cables over to 18 Gbps-rated ones to be safe. But for years all "you" experts have been telling us that the cables and ratings don't matter... Now all of a sudden they do? Heck, I know several people with your projector and the Samsung player running Amazon Basics cables with no issues. Good luck with your articles, but it is articles like this, based on little real-world data, that are scaring people away from 4K. If you want me to take your article seriously, interview a few thousand people and get real data.

David Vaughn's picture
All of my equipment is HDCP 2.2 and HDMI 2.0a compliant. Explain to me how one cable works and others don't. That excludes the equipment; the only variable is the cable. HD Fury tests cables to sell and certify at 18 Gbps...their failure rate is about 80%--so only 2 out of every 10 cables pass. The longer you go with your cables, the more problems you are going to have. I have some cheap 6-foot Monoprice cables, and every one of them has worked from the Samsung player to the JVC projector as well as through my pre/pro, as long as a high-quality longer cable was going from the pre/pro to the projector. Again, it's not the equipment, it's the cables. Have a nice day.
David Vaughn's picture
Read this thread over at AVS (budget a lot of time...it's quite long) and you'll see a plethora of posts of people having HDMI issues (some even using the same type of cables...one works, one doesn't): http://www.avsforum.com/forum/149-blu-ray-players/2325089-official-samsung-ubd-k8500-4k-hdr-ultra-hd-blu-ray-player-owner-s-thread-120.html#post43112722
mikem's picture

Since new technologies become obsolescent as quickly as new cars, I have no intention of buying any 4K UHD BD player, since that would mean having to buy a new TV and receiver, and I'm just not ready to shell out that kind of money or other resources to do so. By the time my two-year-old Oppo, Pioneer Elite receiver, and Panny plasma bite the dust, I'll consider new components. By that time perhaps we will be rid of HDMI. I know I'm dreaming, but.... Regarding analog connections, a friend of mine just did that and got rid of all HDMI connections. He also has a $12K HT system. His one reason for doing so was not the poor HDMI connectors; he just could not stand the whole HDMI 'handshaking' crap. I am sorely tempted. I was watching the Masters yesterday, and every time I switched channels I had to wait upwards of 6-8 seconds.

WildGuy's picture

Shouldn't 18 Gbps HDMI be able to handle a 3840 x 2160p x 60fps x 10-bit 4:4:4 YCbCr source?

Because 3840 x 2160 x 60 x 10 x 3 = 14,929,920,000 bits, or roughly 15 Gbps, and HDMI 2.0 can support transfers of up to 18 Gbps.

(The 10 x 3 means 10 bits per channel times 3 channels of YCbCr, btw.)

So how come when an HDMI source detects 4K @ 60fps at full 4:4:4 YCbCr sampling, it's limited to 8 bits per channel?

David Vaughn's picture
2160P/60 4:4:4 at 10-bit is not in the spec. Venture over here to see what's included: http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx
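To put numbers behind that answer, here's a quick back-of-the-envelope check (my own sketch; the timing totals are the standard CTA-861 values for 2160P/60). The 18 Gbps figure is the raw TMDS line rate, and both the blanking intervals and the 8b/10b coding overhead count against it, which the active-pixels-only math above leaves out:

```python
# Sketch: why 2160P/60 4:4:4 at 10-bit doesn't fit in HDMI 2.0's budget.
# 18 Gbps is the raw line rate: a 600 MHz TMDS character clock x 10 bits
# x 3 channels. The usable payload is 600 MHz x 8 bits x 3 = 14.4 Gbps,
# already below the ~15 Gbps active-pixel estimate above.

H_TOTAL, V_TOTAL, FPS = 4400, 2250, 60   # active 3840x2160 plus blanking
PIXEL_CLOCK = H_TOTAL * V_TOTAL * FPS    # = 594,000,000 Hz (594 MHz)
MAX_TMDS_CLOCK = 600e6                   # HDMI 2.0 character-clock ceiling

def fits_hdmi_2_0(bits_per_channel):
    # At 4:4:4, the TMDS clock scales with color depth (x10/8 for 10-bit).
    return PIXEL_CLOCK * bits_per_channel / 8 <= MAX_TMDS_CLOCK

print(fits_hdmi_2_0(8))    # True:  594 MHz, just under the limit
print(fits_hdmi_2_0(10))   # False: 742.5 MHz, well over it
```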
dnoonie's picture

Have you considered an EDID manager?

Does your pre/pro have built-in EDID management?

There are EDID managers that you can dial in to whatever resolution you want to have or want to fake. They are very handy and can improve setup time between cycles. I've used very simple and inexpensive ones that look like a female-to-female connector, as well as larger little black boxes that are powered; they've all worked fine. Sorry, I don't have any brand recommendations.

Good luck!

Cheers,

David Vaughn's picture
I believe the HD Fury does this...it can set the EDID information to "trick" the source into outputting the correct format. Some guys over at AVS are using these to try to strip the HDR information, since the projectors aren't handling it properly (although I've found my own workaround). Also, the really-large-screen guys don't have enough light output to utilize HDR properly...in my case, my screen is only 76.5" wide, so HDR looks great on the "postage stamp" 88"-diagonal screen.
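For anyone curious what that "trick" looks like at the byte level, here's a minimal sketch (my own, not HD Fury's actual implementation) of the kind of edit an EDID manager performs: cap the max TMDS rate the display appears to support, then fix the extension block checksum so the source will accept the doctored copy:

```python
# Sketch of an EDID manager's trick: hand the source a modified copy of
# the display's EDID that advertises a lower max TMDS character rate,
# so the source negotiates a less demanding format.

def cap_tmds_rate(edid: bytearray, mhz: int) -> bytearray:
    ext = bytearray(edid[128:256])               # CEA-861 extension block
    i = 4
    while i < ext[2]:                             # walk the data block collection
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        if tag == 3 and ext[i + 1:i + 4] == b"\xd8\x5d\xc4":  # HDMI Forum VSDB
            ext[i + 5] = min(ext[i + 5], mhz // 5)  # stored in 5 MHz units
        i += 1 + length
    ext[127] = (256 - sum(ext[:127])) % 256       # recompute block checksum
    edid[128:256] = ext                           # splice the block back in
    return edid
```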
dnoonie's picture

Hi,

Understanding EDID wasn't as necessary before devices tried to do too much for you. What a pain.

I'm not sure if you're interested or even need the info, but here are some resources on understanding EDID:
http://www.analogway.com/en/products/software-and-tools/aw-edid-editor/#...

Click on the Download tab; there's a white paper. And actually, even though you might not want or need to program your own EDID, running and exploring the EDID editor gave me a better understanding of EDID. The EDID editor is not product-specific and can be used to edit and create EDIDs for any device.

Cheers,

David Vaughn's picture
Thanks for the link...I'll be sure to check it out.
Rodolfo's picture

David,

Rather than fighting a battle with Samsung or trying your luck with the next 4K Blu-ray player, crossing your fingers that 2160/60p is output as 4:2:0 and not irrationally upscaled to 4:4:4 as you said the Samsung does, you may want to consider installing the HDFury Integral, which will bring the 4:4:4 2160/60p high-bandwidth output of the Samsung player back to its original 4:2:0 state so that all of your equipment, HDMI chipsets, and cable investment can be reused.

This unit is a 2x2 HDMI 2.0a matrix switcher/splitter that also converts HDCP 2.2 to legacy HDCP to allow existing non-HDCP 2.2 AV receivers to be in the 4K HDCP 2.2 signal path, which greatly simplifies the connectivity hurdles 4K brings to most legacy HT systems.

I just ordered a unit to review for my magazine (and test with my two 4K players).

https://www.hdfury.com/shop/splitter...60-444-600mhz/

Good luck David,

Rodolfo La Maestra

Raphael's picture

I understand this is an old thread, but here's my comments anyways...
My TV is a regular HDTV (1080P, Sharp Aquos Quattron). I've never had any issues with HDMI at all for the 6 years I’ve had my system set up. I have a Yamaha AVReceiver and all is connected via HDMI, of course, except the loudspeakers that use regular wires. I am now beginning to start the process of upgrading my living room AV setup. I bought a Jamo S810 subwoofer (delivered and installed already), and their set of speakers (5.0) that are supposed to arrive next week. The Yamaha AVR is quite old (10 years), and it already has one channel burned (I rewired and I use the B zone ports to bypass the burned channel in zone A). But, not only the AVR is old and with a burned channel but also it does not support all the good stuff that nowadays we have (eARC, Dolby Atmos, 4K, HDR, etc.) So, I am prepared to shell out another 600 dollars to upgrade the AVR. Now here's how my post is relevant to this thread: I am concerned that the HDMI cables I have, passed through (inside) the wall are not going to be able to support all these new technologies. The setup that I have is super neat! There are NO wires whatsoever dangling from my TV. The power outlet is behind the TV, and all cables go into a port also behind the TV, and come out lower in the same wall, connecting to the AVR. So, all cables are completely hidden behind the wall, the TV, and the cabinet where the AVR is. I am DREADING the day I find out that the HDMI cables are no longer good for all that is to come. I'm not one of those "handy" guys that can fumble around with cables inside walls and get things working. But, since I'll probably be out of other options, I will try and fish the cables by using the existing cables. I should be able to tie them tight enough so I don't lose a cable mid-way through the wall... :-/ Hopefully.. Wish me luck. Maybe I luck out and the old HDMI cables I have will be able to handle it.. P.S. Yes, the TV will be replaced with a 4K HDR soon as well, hence the potential need for new HDMI/higher rated HDMI cables..

Jackson143's picture

Not promising for those of us who use projectors that are often a good distance from the rack.
