Updated October 11, 2012


Right now you can buy a 6-foot-long HDMI cable for $3.50. Or $19.99. Or $99.99. Or $699.99. Salespeople, retailers, and especially cable manufacturers want you to believe that you'll get better picture and sound quality with a more expensive HDMI cable.

They're lying. You see, there's lots of money in cables. Your money.

Dozens of reputable and disreputable companies market HDMI cables, and many outright lie to consumers about the "advantages" of their product.

Worse, the profit potential of cables is so great, every retailer pushes high-end HDMI cables in the hopes of duping the buyer into spending tens, if not hundreds, of dollars more than necessary.

Here's the deal: expensive HDMI cables offer no difference in picture quality over cheap HDMI cables. CNET has mentioned this before, but here's the science of why.

The signal

The first thing to understand is what's transmitted over the cable in the first place. HDMI uses Transition Minimized Differential Signaling, or TMDS.

TMDS has two basic aspects. The first is that the ones and zeros at the source (a Blu-ray player or HD cable/satellite box) are not exactly the ones and zeros your TV uses to create a picture -- at least, not in exactly the same order. Before sending the signal out via the HDMI output, the ones and zeros are rearranged to minimize how many transitions there are. So instead of 10101010, the transmission may look like 11110000. If you really like math, how it does this is cool, but it's not really important to understanding the concept as a whole.
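For the curious, here's a simplified sketch of that first TMDS encoding stage in Python. This is my own illustration of the published algorithm, not production HDMI code: depending on how many ones are in the byte, each bit is XORed or XNORed with the previous output bit, and a ninth flag bit records which operation was used so the decoder can reverse it. (The real encoder adds a second, DC-balancing stage for 10 bits total.)

```python
# Simplified sketch of TMDS's first encoding stage: transition minimization.
# Illustration only -- the real HDMI encoder adds a DC-balancing second stage.

def transitions(bits):
    """Count 0->1 and 1->0 transitions in a bit sequence."""
    return sum(a != b for a, b in zip(bits, bits[1:]))

def tmds_stage1(byte_bits):
    """Encode 8 bits into 9, choosing XOR or XNOR chaining to
    minimize transitions, per the spec's selection rule."""
    ones = sum(byte_bits)
    use_xnor = ones > 4 or (ones == 4 and byte_bits[0] == 0)
    out = [byte_bits[0]]
    for b in byte_bits[1:]:
        if use_xnor:
            out.append(1 - (out[-1] ^ b))  # XNOR with previous output bit
        else:
            out.append(out[-1] ^ b)        # XOR with previous output bit
    out.append(0 if use_xnor else 1)       # flag bit: tells decoder which was used
    return out

raw = [1, 0, 1, 0, 1, 0, 1, 0]            # alternates every bit: 7 transitions
enc = tmds_stage1(raw)
print(transitions(raw), transitions(enc[:8]))   # prints: 7 3
```

The alternating input that would toggle on every bit comes out with far fewer transitions, which is exactly the point: fewer transitions mean a cleaner, easier-to-recover signal on the wire.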


Even though this conversion is weird, it makes it much more likely that the data transmitted can be rebuilt on the other end (as in, at the display).

The second part of TMDS (the DS part) is the HDMI cable itself. Each HDMI cable is actually multiple small, copper wires. Two versions of the data are sent over different wires. One of these is out of phase with the "real" signal. The TV receives all the data, puts the out-of-phase signal back in phase, then compares it to the "real" signal. Any noise picked up along the way will now be out of phase, and as such it is effectively negated and ignored.

If you're an audio person, this is similar to how balanced (XLR) cables work.
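A toy model makes the cancellation obvious. The numbers below are invented for illustration; the point is that noise hitting both wires equally drops out when the receiver takes the difference:

```python
# Toy model of differential signaling (the "DS" in TMDS).
# Two copies of the signal travel on separate wires, one inverted.
# Interference hits both wires roughly equally; subtracting at the
# receiver recovers the signal and cancels the common-mode noise.

signal = [1.0, -1.0, 1.0, 1.0, -1.0]   # original levels
noise  = [0.3, -0.2, 0.4, 0.1, -0.3]   # interference picked up along the cable

wire_pos = [ s + n for s, n in zip(signal, noise)]   # in-phase copy
wire_neg = [-s + n for s, n in zip(signal, noise)]   # inverted copy

# Receiver: subtract the wires and halve. The noise term is identical
# on both wires, so (s + n) - (-s + n) = 2s and the noise vanishes.
recovered = [(p - q) / 2 for p, q in zip(wire_pos, wire_neg)]
print(max(abs(r - s) for r, s in zip(recovered, signal)))  # essentially zero
```

Real interference isn't perfectly identical on both conductors, which is why the wires are twisted together, but the principle is the same.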

TMDS works really well, allowing both short and fairly long cables to carry what is a pretty intense amount of data. It also means you can have inexpensive cables that work just as well as expensive ones.

More important to our discussion, it means that when something goes wrong, it goes really wrong. It's often said that with an HDMI signal, you either get everything and it's perfect, or it isn't perfect and you get nothing. In fact, I've said this. If you're getting an image that looks correct, and there are no dropouts in the audio or video, then you're getting everything that's being sent. If the cable is faulty, or it's a really long run with an under-built cable, most of the time you'll just get nothing. No picture at all.

The question I've often gotten is what if you're right on that digital precipice? That teetering space between "everything's good" and "I got nothin'."

I'm glad you asked.

Video

As you've read, the ones and zeros of an HD image trot happily along, more or less, from your source to your TV. Over short runs, there really isn't anything other than a faulty cable (which itself isn't that likely) that would cause any issue. Over long runs, it's possible that interference of some kind, or a poorly made cable (more on this later), can reduce the "quality" of the signal to the point where the TV can't make heads or tails of it. Heads or tails -- that's a digital joke.

At this point, you're on the edge of the digital precipice. The most likely outcome is sparkles. Here's what they look like:

(Sparkle images: Geoffrey Morrison/CNET)

It looks a lot like snow, or static. The data received by the TV wasn't enough to figure out what those failed pixels are supposed to be. Your TV likes you, though, and it really wants to show you an image. So it builds the rest of the video, minus the failed pixels.

It's important to note that this artifact is pretty unlikely, even over long runs. You are way more likely to just not get anything at all.

If it's so unlikely, why do I bring it up? Because it's important to understand that it is impossible for the pixel to be different. It's either exactly what it's supposed to be, or it fails and looks like one of the images above. In order for one HDMI cable to have "better picture quality" than another, it would imply that the final result between the source and display could somehow be different. It's not possible. It's either everything that was sent, or full of very visible errors (sparkles). The image cannot have more noise, or less resolution, worse color, or any other picture-quality difference. The pixels can't change. They can either be there (perfect, yay!) or not (nothing, errors, boo!).

All the claims about differences in picture quality are remnants of the analog days, which were barely valid then and not at all valid now. There is no way for different cables to create a different color temperature, change the contrast ratio, or anything else picture-quality-wise.

At this point some of you are saying "but sparkles are noise." No. Sparkles are a signal failure, not noise, and if you see them, you need a different cable.

Another potential "fail" is a failure of the HDCP copy protection, which shows up as a total snowy image, a blinking image, or something else hard to miss. This is actually even less likely, as the TMDS is more likely to fail than the channel HDCP requires for its handshake. I have seen this in my testing, though, so it's worth mentioning.

Audio

Several companies claim that their HDMI cables sound better than other HDMI cables. One in particular claims this is because there is no error correction on the audio and its cables are more likely to transmit all the data.

First of all, this is untrue. Audio over HDMI actually has more error correction than the video signal. But even if this weren't the case, it's still utter nonsense. Dolby has extensive error correction built into its codecs. In other words, if you are sending a Dolby Digital Plus, TrueHD, or other bitstream over HDMI from your Blu-ray player, the data going into the DAC in your receiver is bit-for-bit the same as what's on the disc. DTS presumably works the same way, though the company ignored my repeated requests for info. Cheap or expensive, the cable is irrelevant when it comes to transmitting Dolby or DTS.

If the cable is faulty or if there is some cataclysm causing data to be lost between the player and the receiver, the decoders are designed to mute instead of blasting out compromised data. There is no such thing as an audio version of "sparkles." Instead, you just get a total dropout of the audio. So if you're getting audio dropouts, it's possible it's the HDMI cable. But if you're not getting video issues as well, the problem is likely elsewhere. If the audio isn't muting, then as long as you're outputting an audio codec, you're getting exactly what's on the disc.

If you're playing a CD on a Blu-ray player, the output is PCM to the receiver. This data is packetized, just like the rest of the audio and video signal. As such, it is error-corrected. However, jitter is far more likely than with an optical or coax connection. In discussions with several audio equipment manufacturers since the original publication of this article, I've been told by all of them that the DAC in the receiver is going to have a far greater effect on the sound than the jitter in the transmission. Before you leap on that, keep in mind that the DAC has a smaller effect on the sound than the amp or the speakers, and far less than the room itself.

Oh, and in case that wasn't clear, the jitter is inherent in the HDMI transmission itself. The cable isn't going to have any effect.

Likely transmission

The big "if" that I've been repeating is "if the signal gets there." Over short runs -- a few meters, say -- it is incredibly unlikely that even the cheapest HDMI cable won't work perfectly. Over longer runs, the answer is less clear-cut. The variables of the transmitter and receiver combo in the source and display, plus any repeaters you have in the mix (like a receiver), mean that not every long HDMI cable can handle all the data. By long, I mean 50 feet or more.

If you need to run long HDMI cables, it's a safe bet you're going to run them through a wall. If so, it is vital that you test the cable with all your equipment before you install it. And as tempting as it is to get the cheapest cable that will work, just because a cable works with all your current gear doesn't mean it will work with your future gear.

If you need a long HDMI cable, check out the tests I did over at HDGuru.com. I tried out several brands of 50-foot-plus HDMI cables, including Monoprice, Monster, and Straight Wire, and got some interesting results.


The cable lies

In the home, there are only four basic types of HDMI cables:



High-speed (also called Category 2)

High-speed (also called Category 2) with Ethernet

Standard-speed (Category 1)

Standard-speed (Category 1) with Ethernet

That's it. Standard-speed cables are rated to carry up to 1080i. Many standard-speed cables can probably handle 1080p; they're just not rated for it.

High-speed cables can do well beyond 1080p (up to 4K, so you don't need "4K HDMI cables"), including 3D. Check out my article on how 3D content works for more info on that.

Honestly, though, if you're buying the right kind of cables (i.e. as cheap as possible), there won't be enough of a price difference to justify not buying a high-speed cable. Any high-speed cable should work with 3D and Audio Return Channel (ARC).

When cable manufacturers claim their cables are "Made for 240 Hz" they are lying to you. The conversion to 120 or 240 Hz is done inside the TV. There is no such thing as a 120 Hz or 240 Hz signal. Blu-ray content is 1080p/24, though your player likely converts this to 1080p/60. This is the highest-bandwidth, non-computer source you can have, and even it is only 60 Hz (check out 1080i and 1080p are the same resolution and What is refresh rate? for more info).
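You can check the arithmetic yourself. The constants below come from the HDMI 1.x spec: each of the 3 TMDS data channels carries a 10-bit character per pixel clock, 1080p/60 uses a 148.5 MHz pixel clock (blanking included), and Category 1 and Category 2 cables are tested at 74.25 MHz and 340 MHz respectively.

```python
# Back-of-the-envelope HDMI bandwidth arithmetic, using HDMI 1.x spec figures:
# 3 TMDS data channels, 10 bits per TMDS character, one character per pixel clock.

def tmds_bitrate_gbps(pixel_clock_mhz, channels=3, bits_per_char=10):
    """Total TMDS link rate in Gbit/s for a given pixel clock."""
    return pixel_clock_mhz * 1e6 * channels * bits_per_char / 1e9

print(tmds_bitrate_gbps(74.25))   # Category 1 test rate (1080i/720p): ~2.23 Gbps
print(tmds_bitrate_gbps(148.5))   # 1080p/60: ~4.46 Gbps
print(tmds_bitrate_gbps(340.0))   # Category 2 (high-speed) rating: 10.2 Gbps
```

Note that 1080p/60, the most demanding non-computer signal you'll feed a cable, uses less than half of what a high-speed cable is rated for. There is no extra "240 Hz" data for a special cable to carry.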

More expensive cables can be more rugged, with thicker casings, a beefier connector, and potentially higher durability. How much that's worth is up to you. Personally, I find the bulky plugs of many "high-end" HDMI cables to be a nuisance: they either fall out or pull on the connector in a way that could cause problems in the long run.

Bottom line

OK, so not all HDMI cables are literally the same. Differences in manufacturing quality can have a slight effect on a cable's ability to carry the signal over long distances (50-plus feet). Better-made cables may even last longer. But "better made" doesn't have to mean more expensive.

No matter what, though, there is absolutely no picture or sound quality difference between a $3.50 cable and a $1,000 cable.

Most of you reading this only need a few feet of HDMI cabling to run from your Blu-ray player and cable/satellite box to your TV. Over these short distances, even the cheapest HDMI cables are going to work. And if they work, as you've read, it means you're getting perfect image and sound. Even over long runs, most cheap cables can do the job just fine. Don't let a salesman try to up-sell you on $300 HDMI cables as the "only way to make your new 240 Hz TV work." Politely tell him he is incorrect and to move on with the sale.

In the year and a half since we first published this article, the most common misunderstanding has come from those used to an analog-cable mentality. They assume that over any cable there is some signal degradation: that the signal received by the television isn't as strong, or exactly the same, as what leaves the source.

However, unlike analog cables, there is no linear correlation between signal degradation and picture degradation. The picture will be perfect up to the point where there's not enough signal to create the image. At that point, you'll have nothing. No picture at all. In the occasional situation where you get sparkles (as mentioned above), this is proof that the system works (but the cable doesn't). You can't change what the pixel is. It can only be exactly the right pixel as sent by the source, or no pixel at all.

So my original conclusion is still apt: If you're paying more than $5 for a 2-meter HDMI cable, you're overpaying.

Continue on to Why all HDMI cables are the same, part 2, Still more reasons why all HDMI cables are the same, the HDMI Cable Buying Guide, and 4K HDMI Cables are nonsense (yeah, there's a lot to cover).

You don't have to take my word for it:

HDMI.org

EETimes.com, "HDMI: The digital display link"

HowStuffWorks, "How HDMI works"

Wikipedia HDMI entry

Wikipedia TMDS entry

Wikipedia 8b/10b encoding entry

HDGuru.com, "All HDMI cables are the same! Or are they -- full test"



Got a question for Geoff? First, check out all the other articles he's written on topics like LED LCD vs. plasma, Active vs Passive 3D, and more. Still have a question? Send him an e-mail! He won't tell you which TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter @TechWriterGeoff or Google+.