Thread: HDMI problem
Old April 7th 18, 09:54 PM posted to alt.windows7.general
Paul[_32_]

Wolf K wrote:
> On 2018-04-07 11:42, Seymore4Head wrote:
>> [...]
>> The problem turned out to be the 25 ft HDMI cable. I have used the
>> cable before to connect my laptop to my 50 inch TV, but I quit using
>> it when I bought the WDTV player.
>>
>> I tried using the HDMI cable with the WDTV player to just test it, and
>> I could see the opening screen, but the picture would flash in
>> and out. I bought a 10 ft cable and all is well.
>>
>> I sure am glad my video card did not get fried.
>>
>> Thanks everyone
>
> Good to know. Thanks for reporting back. Sounds like at 25 ft the
> impedance was just too much for the video card to handle.


The parameter you want is likely to be "loss" or "dielectric loss".

The AC impedance is actually constant, whether the cable is one foot
long or twenty five feet long. HDMI cable is 100 ohms
differential on R,G,B,CLK. A resistor on each of those
signal pairs in the monitor, terminates the cable in
its characteristic impedance of 100 ohms. In fact, the
video card detects it should output a signal on the
cable, by noticing a 100 ohm load on each pair.
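The termination is what keeps reflections off the line. The standard
transmission-line formula (not anything HDMI-specific) shows why a load
matching the 100 ohm characteristic impedance matters:

```python
# Reflection coefficient of a terminated transmission line. A load equal
# to the 100 ohm differential characteristic impedance reflects nothing;
# a mismatch bounces part of the wave back toward the video card.
def reflection_coefficient(z_load, z0=100.0):
    return (z_load - z0) / (z_load + z0)

print(reflection_coefficient(100.0))  # matched: 0.0, no reflection
print(reflection_coefficient(50.0))   # mismatch: a third of the wave comes back
```

So a properly terminated cable looks purely resistive to the driver,
which is also what makes the load-detection trick work.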

The launch amplitude might be on the order of 1V,
just making up a value.

The "loss" in the cable, might cause the amplitude at
25 feet, to be only 25mV. And that's sort of the
sensitivity limit at the receiver. The signal
is too weak then, to be cleanly thresholded
and turned into ones and zeros. Either the
picture has colored snow (if slightly degraded),
or eventually the synchronization information
is lost, and the monitor switches to a black screen.
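As a back-of-the-envelope check on those made-up numbers (1 V launched,
25 mV arriving), the implied cable loss works out like this:

```python
import math

# Illustrative figures only: ~1 V launch amplitude, ~25 mV left
# after 25 feet of cable. Neither is a measured HDMI value.
launch_v = 1.0
received_v = 0.025

loss_db = 20 * math.log10(launch_v / received_v)
print(f"End-to-end loss: {loss_db:.1f} dB")          # about 32 dB
print(f"Loss per foot:   {loss_db / 25:.2f} dB/ft")  # about 1.3 dB/ft
```

At that per-foot rate, a 10 ft cable would deliver well over 200 mV,
comfortably above the receiver's floor, which is consistent with the
short cable working.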

Now, the really funny part is HDMI cables come in
different material compositions. They might have to
meet some minimum level of performance. But in
some cases, the dielectric loss of one cable
might be significantly better than another cable.
If you bought ten 25 foot cables and compared, maybe
one or two of them actually worked at that distance.

And we're not talking Monster cables here either.
Spritzing 10u of gold plating on the connectors
on either end, doesn't convert a cable into a "hero".
But a change of "plastic" dielectric in the cable,
could make a difference (polystyrene or teflon).
Some of the dimensions of things inside the cable
might have to change, if the dielectric is modified,
so such a change is not without some visible side
effects. Cutting samples of good and bad cables,
and looking at a cross-section, might reveal differences.

By comparison, as far as I know, all SATA cables are
made the same way. But things like HDMI, who knows
what they stuffed in there.

All that the cable has to do, is present the signals
with enough amplitude and good enough group delay
that the digital signal can be recovered. If the
cable is made "even better", there's no difference
in the picture or sound. Once all bits are received
error free, getting 100mV of amplitude instead of
50mV, doesn't "make the picture brighter or the
sound sparkle". This is digital transmission, not
analog, so the degradation tends to be "make or break",
rather than "fuzzy or soft".
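A toy simulation shows the cliff. The amplitudes and the 10 mV noise
floor here are invented for illustration, not measured from real HDMI
hardware:

```python
import random

random.seed(0)

def recover_bits(bits, amplitude_v, noise_v=0.010):
    """Threshold a differential signal: +amplitude for a 1, -amplitude
    for a 0, plus Gaussian receiver noise. Decode by sign."""
    out = []
    for b in bits:
        level = (amplitude_v if b else -amplitude_v) + random.gauss(0, noise_v)
        out.append(1 if level > 0 else 0)
    return out

bits = [random.randint(0, 1) for _ in range(1000)]

# 50 mV and 100 mV both sit far above the noise: error-free either way,
# so the extra margin buys nothing visible.
errs_50  = sum(a != b for a, b in zip(bits, recover_bits(bits, 0.050)))
errs_100 = sum(a != b for a, b in zip(bits, recover_bits(bits, 0.100)))
# 5 mV is buried in the noise: hundreds of errors. That's the cliff.
errs_5   = sum(a != b for a, b in zip(bits, recover_bits(bits, 0.005)))
print(errs_50, errs_100, errs_5)
```

Doubling a signal that is already cleanly decoded changes nothing;
dropping below the noise ruins everything at once.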

https://www.cnet.com/news/why-all-hd...-are-the-same/

Some actual tests.

https://web.archive.org/web/20110525...ull-test/4373/

*******

The VGA era was different, in that as the cable length
went up, we would adjust the resolution down to compensate.
At zero feet, maybe the setup supported 1600x1200, while at
100 feet, a setting of 1024x768 is usable, and 1600 would
be out of the question (too fuzzy). The degradation was
a kind of "continuous variable" with VGA. Not the abrupt
"make or break" of HDMI.

Some effects like that are possible with HDMI, but
the user isn't always at liberty to switch down to 640x480
in order to make a long cable work. Maybe an output device
only outputs HD 1920x1080 (say, a DVD player).

*******

There may be various proposals to make "smart cables", and
have the destination end send a signal back to the driving
end, to adjust the line build-out. But thinking along these
lines works best if the idea is standardized, so that all
equipment supports it.

https://patents.google.com/patent/US9042437

For a price, you can probably send an HDMI signal with a
fiber optic cable. And then the reach might be kilometers.
But the simple wire cable is cheaper.

You can also send HDMI wirelessly, at 60GHz. As long as
the transmitter and receiver are in the same room, in
line of sight, and preferably less than 25 feet apart.
The fiber optic transmitter might be cheaper :-)

Paul