HDMI cable quality is a big deal. You're talking about four signals (three TMDS data channels plus a clock) that have to be synchronized to within a fraction of a bit period, running at potentially a GHz or so for 1080p60. Double that for 3D. That is surprisingly hard to transmit over a wire of any appreciable length (more than a few inches). The quality of the cable affects all of this and can result in bit errors or clocking errors, and the effects of bad cabling get worse on longer runs. For a typical 6-10 ft home A/V center run, the cheap $3-5 cables are generally fine. For a 100 ft run, you'll probably need to spend more than 10x what the short one costs.
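The "GHz or so" figure falls straight out of the standard 1080p60 timing. A quick sketch, assuming the common CEA-861 total frame size of 2200x1125 (active 1920x1080 plus blanking):

```python
# Back-of-envelope TMDS bit rate for 1080p60 (CEA-861 timing assumed).
# The total frame, including blanking, is 2200 x 1125 pixels.
H_TOTAL, V_TOTAL, FPS = 2200, 1125, 60

pixel_clock_hz = H_TOTAL * V_TOTAL * FPS  # 148.5 MHz pixel clock

# TMDS encodes each 8-bit byte as a 10-bit symbol, so each of the
# three data channels runs at 10x the pixel clock.
bits_per_channel = pixel_clock_hz * 10

print(pixel_clock_hz)     # 148500000
print(bits_per_channel)   # 1485000000 -> ~1.5 Gb/s per data channel
```

That's roughly 1.5 Gb/s on each of three parallel channels, all of which have to stay aligned at the receiver.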
However, the nature of the problem is such that, while the probability of error is small, any given bit is just as likely to be affected as any other, since bits are sent one after another (this is a simplification, since TMDS symbols are what's actually sent, but that doesn't change things much). An error in the MSB causes a grossly wrong pixel color (changing that channel by half its full-scale value), while an error in the LSB will likely be imperceptible except on a deliberately rigged test image (e.g. a perfect gradient). Clocking issues are immediately apparent as image or line jitter.
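To make the MSB-vs-LSB point concrete, here's a minimal sketch of a single-bit error on one 8-bit color channel (the `flip_bit` helper is just for illustration):

```python
def flip_bit(value, bit):
    """Flip one bit (0 = LSB, 7 = MSB) of an 8-bit channel value."""
    return value ^ (1 << bit)

original = 0x80                    # mid-gray, 128 out of 255

msb_err = flip_bit(original, 7)    # 0x00: channel goes fully dark
lsb_err = flip_bit(original, 0)    # 0x81: 129, indistinguishable by eye

print(abs(msb_err - original))     # 128 -> half the full 0-255 range
print(abs(lsb_err - original))     # 1
```

Same physical event on the wire, wildly different visual impact depending on which bit it lands on.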
In other words, if it looks right and isn't "snowy", it is right; your $5 cable is sufficient (and they always seem to be for me). In fact, the HDMI guys won't let you sell a cable with their connector (it's patented) unless it passes the relevant tests. I'm sure not every cheap fly-by-night cable maker in China abides by this, but I suspect most do, since the penalty amounts to "your products can no longer be imported into the USA". The tests include quite a bit of margin beyond the spec, and most receivers beat the spec's requirements anyway (they're more tolerant than they need to be).
Contrast this with analog transmission methods (component YPbPr, RGB, S-Video, composite): errors, which tend to be small, always have a correspondingly small effect on the received video. You may not notice, but you have a "suboptimal" picture. Honestly, for short runs on reasonable cable, the error introduced by the ADC and its power supplies in your TV is probably worse, but inordinately bad cable is unsuitable for anything above 480i (and barely suitable even for that).
A few notes to correct some incorrect information that's shown up in this thread:
HDMI does not include error checking or correction (though some errors can be detected in a non-robust way as an invalid TMDS bitstream).
HDMI does not include retransmission of bad data: it doesn't know it's bad in the first place, and there's no real reverse channel to request a retransmission, anyway.
The bitrate on HDMI is based on the resolution you run it at (unlike DisplayPort or SDI), and frames are not really packetized beyond the SAV/EAV framing that's present for blanking/sync information, anyway. Audio is inserted during the blanking period.
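As a rough illustration of why the blanking period has plenty of room for audio, here's a sketch using the same 1080p60 timing as above. This deliberately ignores data-island packet and guard-band overhead, so treat the blanking fraction as an upper bound, not real usable capacity:

```python
# How much of a 1080p60 frame is blanking (where audio data lives)?
# Assumes CEA-861 timing; ignores packet overhead, so it's an upper bound.
H_ACTIVE, V_ACTIVE = 1920, 1080
H_TOTAL, V_TOTAL, FPS = 2200, 1125, 60

total_px = H_TOTAL * V_TOTAL
active_px = H_ACTIVE * V_ACTIVE
blanking_frac = 1 - active_px / total_px
print(round(blanking_frac, 3))   # 0.162 -> ~16% of each frame is blanking

# Even a worst-case 8 channels of 24-bit / 192 kHz PCM is modest:
audio_bps = 8 * 24 * 192_000
print(audio_bps)                 # 36864000 -> ~37 Mb/s
```

Tens of Mb/s of audio against multiple Gb/s of link capacity, with ~16% of every frame being blanking: audio is a rounding error in the budget.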
And yes, I am an EE (well, Computer Engineering - similar qualifications).