For HDMI in particular, there's no error correction at all on the video data (though there is on the audio and control data).
So if one bit gets flipped in the video data as it goes through the cable, you're going to see that wrong bit on screen. You'd have to have an insanely high bit error rate for your eyes to notice it, but it's there, and it gradually gets worse as the error rate increases.
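To put rough numbers on "gradually gets worse," here's a back-of-the-envelope sketch in Python. The 1080p geometry, 24-bit pixels, and the BER values are my own assumptions for illustration, not anything out of the spec:

```python
# Estimate how many pixels per frame carry at least one flipped bit,
# given a raw bit error rate (BER) on uncorrected video data.
WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 24  # 8 bits per R/G/B channel

for ber in (1e-9, 1e-7, 1e-5):
    # A pixel is visibly corrupted if any of its 24 bits flips; assume
    # bits flip independently with probability `ber`.
    p_bad_pixel = 1.0 - (1.0 - ber) ** BITS_PER_PIXEL
    expected_bad = WIDTH * HEIGHT * p_bad_pixel
    print(f"BER {ber:.0e}: ~{expected_bad:.2f} corrupted pixels per frame")
```

At a BER of 1e-9 that works out to a stray pixel a few times a second at 60 fps; at 1e-5 it's hundreds per frame, squarely in "sparkles" territory.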
What could flip a bit in a cable? Any number of things. EM interference from outside a badly shielded cable. EM crosstalk between conductors inside a badly designed cable. Signal reflections at badly designed connectors. Basically anything that can happen to any other cable, whether the signal it carries is nominally analog or digital.
I assume the reason HDMI applies EC to the audio and control data is that errors in those channels are more noticeable to the user, whereas errors at reasonable rates in the video will mostly sneak by.
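For a concrete sense of what error correction buys those channels, here's a generic single-error-correcting Hamming(7,4) demo in Python. To be clear, this is not HDMI's actual scheme (the spec defines BCH codes for the data island packets that carry audio and control data); it just shows how a receiver can repair a flipped bit instead of passing it through:

```python
# Hamming(7,4): 4 data bits + 3 parity bits, corrects any single-bit error.
def hamming_encode(nibble: int) -> int:
    """Encode 4 data bits into a 7-bit codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]  # d0..d3, LSB first
    p0 = d[0] ^ d[1] ^ d[3]
    p1 = d[0] ^ d[2] ^ d[3]
    p2 = d[1] ^ d[2] ^ d[3]
    # 1-indexed positions: p0=1, p1=2, d0=3, p2=4, d1=5, d2=6, d3=7
    bits = [p0, p1, d[0], p2, d[1], d[2], d[3]]
    return sum(b << i for i, b in enumerate(bits))

def hamming_decode(codeword: int) -> int:
    """Correct up to one flipped bit, then return the 4 data bits."""
    bits = [(codeword >> i) & 1 for i in range(7)]
    # The XOR of the (1-indexed) positions of all set bits is zero for a
    # valid codeword, and points at the flipped bit otherwise.
    syndrome = 0
    for pos in range(1, 8):
        if bits[pos - 1]:
            syndrome ^= pos
    if syndrome:
        bits[syndrome - 1] ^= 1  # repair the error
    return bits[2] | (bits[4] << 1) | (bits[5] << 2) | (bits[6] << 3)

word = hamming_encode(0b1011)
corrupted = word ^ (1 << 4)                 # flip one bit "in the cable"
assert hamming_decode(corrupted) == 0b1011  # receiver recovers the data
```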
I've had this happen. You see what are called "sparkles" when you get a lot of failures (and it truly looks like sparkling glitter on your TV). It turned out to be the TV's connector more than the cable itself. The same $4 cable worked fine on a replacement set.
Totally true. HDMI takes many steps to try to reduce errors to a reasonable rate. In practice, this rate is usually low enough to ignore, even with cheap cables.
I was just making the point that there is indeed a progressive degradation in image quality with increasing error rate, even in HDMI.
IMHO, seeing sparkles on the screen means 'does not work.' Hearing audio errors means 'does not work.' When pushing a digital signal around, if you have significant errors you'll know quickly at the receiver's end that the cable is bad. If you hook up your cheap cable and don't notice any of these errors, getting a better cable isn't going to make the picture sharper or the sound clearer like in the days of pushing analog signals around.
The HDMI spec at http://www.hdmi.org/download/HDMI_Spec_1.3_GM1.pdf is a great source of information about this.