
Great HDMI cable


Vigo:

--- Quote from: shateredsoul on March 11, 2011, 03:58:23 pm ---maybe there is a difference, but I guess the point is that most ppl wouldn't really be able to tell the difference. In all my visits to best buy and other stores, only once did I feel the quality of the tv changed the experience.. but not sure if for the better. They were showing the fist pirates of the Caribbean and you could see all the make up, the hairs and stuff on their face, you could even tell which of the props were fake (i.e. fake wood).

--- End quote ---

The Fist Pirates of the Caribbean? Is that some gay porno movie Best Buy was showing you?  :lol


You bring up a good point about things looking too clear sometimes. I remember a setup at Best Buy where they were trying to sell Blu-ray with a side-by-side comparison of the Blu-ray and DVD versions of a battle scene. In the Blu-ray shot you could see every little person in the background in perfect detail, as clear as the main characters fighting each other in the foreground. I ended up preferring the DVD version because I didn't know what I was supposed to focus on in the Blu-ray shot and ended up getting a bit dizzy from all that was going on. It looked pretty, but I lost some enjoyment trying to catch every detail.

MonMotha:
HDMI cable quality is a big deal.  You're talking 4 signals which have to be synchronized to within a fraction of a bit period running at potentially a GHz or so for 1080p60.  Double that for 3D.  That is surprisingly hard to transmit over a wire of any appreciable length (more than a few inches).  The quality of the cable will affect all sorts of things and could result in bit errors or clocking errors.  The effects of bad cabling also get worse on longer runs.  For a typical 6-10ft home A/V center run, the cheap $3-5 cables are generally fine.  For a 100ft run, you probably need to spend more than 10x what the short one costs.
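To put a number on that "GHz or so" figure, here's a quick sanity check. It assumes the standard 1080p60 timing with 2200x1125 total pixels (an assumption about which CEA-861 timing is in use), and the fact that TMDS sends 10 bits on the wire for every 8 data bits:

```python
# Back-of-the-envelope TMDS rate for 1080p60 (assumes the common
# CEA-861 timing: 2200x1125 total pixels including blanking).
h_total, v_total, fps = 2200, 1125, 60
pixel_clock = h_total * v_total * fps       # Hz
print(pixel_clock)                          # 148500000 -> 148.5 MHz

bits_per_symbol = 10                        # TMDS: 8 data bits -> 10 wire bits
per_channel_bps = pixel_clock * bits_per_symbol
print(per_channel_bps)                      # 1485000000 -> ~1.5 Gbps per channel
```

So each of the three data channels carries roughly 1.5 Gbps at 1080p60 — comfortably in the "GHz or so" territory, before you even get to 3D or deep color.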

However, the nature of the problem is such that, in general, while the possible error is small, any given bit is just as likely to be affected as any other bit, since each bit is sent one after another (this is a simplification, as TMDS symbols are actually what's sent, but that doesn't really change things much).  An error in the MSB will cause a majorly wrong pixel color (changing it by half its total possible value), while an error in the LSB will likely be imperceptible except on a deliberately rigged test image (e.g. a perfect gradient).  Clocking issues will be immediately apparent as image or line jitter.
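The MSB-vs-LSB point is easy to see with a trivial bit-flip sketch on one 8-bit color channel:

```python
# A single bit error hits very differently depending on its position.
# Flip the MSB vs. the LSB of an 8-bit color channel value.
value = 0x80               # mid-gray channel, 128

msb_error = value ^ 0x80   # flip bit 7 -> 0, off by 128 (half the full range)
lsb_error = value ^ 0x01   # flip bit 0 -> 129, off by 1 (imperceptible)

print(msb_error, abs(msb_error - value))  # 0 128
print(lsb_error, abs(lsb_error - value))  # 129 1
```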

In other words, if it looks right and isn't "snowy", it is right; your $5 cable is sufficient (and they always seem to be for me).  In fact, the HDMI guys won't let you sell a cable with their connector (it's patented) unless it passes the relevant tests.  I'm sure that not every cheap fly-by-night cable maker in China abides by this regulation, but I suspect most do, since the penalties amount to "your products can no longer be imported into the USA".  The tests include quite a bit of margin over the spec, and most receivers beat the requirements of the spec anyway (they're more tolerant than they need to be).

This contrasts with analog transmission methods (component YPbPr video, RGB, S-Video, composite): there, errors, which tend to be small, always have a small effect on the received video.  This means that you may not notice, but you have a "suboptimal" picture.  Honestly, for short runs on reasonable cable, the error introduced by the ADC and its power supplies in your TV is probably worse, but inordinately bad cable is unsuitable for anything above 480i (and barely suitable for that).

A few notes to correct some incorrect information that's shown up in this thread:
HDMI does not include error checking or correction (though some errors can be detected in a non-robust way as an invalid TMDS bitstream).
HDMI does not include retransmission of bad data: it doesn't know it's bad in the first place, and there's no real reverse channel to request a retransmission, anyway.
The bitrate on HDMI is based on the resolution you run it at (unlike DisplayPort or SDI), and frames are not really packetized beyond the SAV/EAV framing that's present for blanking/sync information, anyway.  Audio is inserted during the blanking period.
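For illustration, here's a sketch of the TMDS data-symbol decode, following the algorithm published in the DVI 1.0 spec (treat it as illustrative rather than conformance-grade).  It shows why error detection is non-robust: many corrupted 10-bit words still decode cleanly to *some* byte, so the sink often has no way to know anything went wrong.

```python
def tmds_decode(sym: int) -> int:
    """Decode one 10-bit TMDS data symbol into an 8-bit byte.

    Per the DVI 1.0 decode algorithm: bit 9 flags that the low 8 bits
    were inverted for DC balance, and bit 8 selects whether neighboring
    bits were combined with XOR or XNOR.
    """
    d = sym & 0xFF
    if sym & (1 << 9):          # undo the DC-balance inversion
        d ^= 0xFF
    use_xor = bool(sym & (1 << 8))
    out = d & 1                 # bit 0 passes through unchanged
    for i in range(1, 8):
        bit = ((d >> i) & 1) ^ ((d >> (i - 1)) & 1)
        if not use_xor:
            bit ^= 1            # XNOR variant
        out |= bit << i
    return out

# 0x100 is the encoding of byte 0x00.  Flip its lowest wire bit and
# you still get a perfectly decodable value -- the sink can't tell.
print(hex(tmds_decode(0x100)))  # 0x0
print(hex(tmds_decode(0x101)))  # 0x3 (looks like valid data)
```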

And yes, I am an EE (well, Computer - similar qualifications).

shmokes:
Heh, that makes sense.  Thanks.  Btw, if there's no reverse channel, how do various devices turn each other on and off?

ChadTower:

--- Quote from: shmokes on March 14, 2011, 12:29:39 pm ---Heh, that makes sense.  Thanks.  Btw, if there's no reverse channel, how do various devices turn each other on and off?

--- End quote ---


Stuff like that is the crappy side of HDMI.  When I got the only TV I have with HDMI, I tried to set it up with a DirecTV DVR.  HDMI, brand new DVR, one cable, woo!  Except not woo.  Every other time I used the remote's "all on" macro button, I would get no sound on the TV.  I spent an hour debugging and an hour on the phone with DirecTV until finally I just switched to component/AV audio.  The problem never occurred again.  Searching AVSForum turned up many reports of people who can't use macros on their remote with HDMI but can with other wiring types.  I didn't mind going with AV audio because I don't have a receiver for that TV, but if I did, I'd still have to turn everything on separately and in a specific order.

MonMotha:
There's the "CEC" ("consumer electronic control") channel, which allows the various interconnected devices to turn each other on and off, set inputs, etc.  It's a single-wire bus that predates HDMI.  It goes by "Anylink", "VieraLink", and similar names from various vendors.  Its operation is 100% independent of the video signals.

There is also the DDC channel.  This is the VESA comm channel also used by DVI and VGA connections to identify the monitor's capabilities.  It can in theory be run in both directions, but in practice, only the source sends data to the monitor.  DDC operation is also 100% independent of video signals.
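As a concrete example of what travels over DDC: the monitor's capabilities come back as 128-byte EDID blocks, and every block's bytes must sum to zero mod 256 (the last byte is a checksum).  A minimal validity check might look like this sketch:

```python
# EDID sanity check: DDC reads return 128-byte EDID blocks whose bytes
# must sum to 0 mod 256 (the final byte is a checksum).  Base blocks
# additionally begin with the fixed header 00 FF FF FF FF FF FF 00.
def edid_block_ok(block: bytes) -> bool:
    return len(block) == 128 and sum(block) % 256 == 0

print(edid_block_ok(bytes(128)))               # True (trivial all-zero block)
print(edid_block_ok(bytes([1]) + bytes(127)))  # False (bad checksum)
```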

Neither channel is fast enough for any sort of acknowledgement/retry system, even if you wanted to use it for such a thing.  DDC is run at 100kHz (optionally 400kHz) and has substantial overhead resulting in a usable bit rate of typically <75kbps.  The CEC channel is even slower, IIRC.
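Those rate claims are easy to sanity-check (the 2.4 ms CEC bit period is the nominal figure as I recall it, so treat that number as an assumption):

```python
# Rough DDC and CEC throughput estimates from the clock rates above.
scl_hz = 100_000                  # standard-mode I2C clock used by DDC
raw_payload_bps = scl_hz * 8 / 9  # 9 SCL cycles move 8 data bits (8 + ACK)
print(round(raw_payload_bps))     # 88889 -> before start/stop/address overhead

# CEC's nominal data-bit period is about 2.4 ms (assumed figure),
# so the raw channel rate is only a few hundred bits per second.
cec_bit_s = 2.4e-3
print(round(1 / cec_bit_s))       # 417 raw bits/s
```

Knock off the I2C addressing and start/stop overhead and DDC lands well under 75 kbps usable — and CEC is three orders of magnitude slower still.  Neither comes anywhere near what an ack/retry scheme for a multi-Gbps video stream would need.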

The DisplayPort auxiliary channel is quite a bit faster, but there's still no ack/retry on DisplayPort, at least not that I remember from reading the spec.

Oh, and if the HDMI CEC stuff bothers you (it bothered me when my TV randomly turned on in the middle of the night), many devices will let you disable it.
