Blue, white, and HDMI cables for a 23-inch diagonal monitor

Jean Fredette wrote:
Paul posted:

You can then decide which cable is cheaper, between DVI and HDMI,
as electrically they could work with a single lane.


If that's the choice then I'll get HDMI since I might need it in the
future. The price of the cable isn't an issue I care about. I just want to
choose the right cable type since the GeForce 210 and Flatron E2341 have 3
choices.

It's probably a GeForce 210.


You are right. I must have transposed the letters.
I checked GPU-Z again where it's a GeForce 210 as you said.

The two digital standards tend
to keep their quality (with a short, good-quality cable).


I was thinking six feet or maybe ten feet, as four is too short, I think.

The desktop will sit on the floor to the side of the monitor where the
cable has to snake around the desk a bit to get to the monitor down below.

The review here seems to be claiming it is HDMI 1.3a, and has
8-channel LPCM audio capability over HDMI.


I think what you're saying is that HDMI will carry the audio to the
headphone jack of the LG Flatron E2341 monitor but DVI will not?

With that out of the way, you can use either the DVI or HDMI.


Thank you for that research showing both will work.
Did I read you correctly that DVI does not carry the audio?

I think from what you wrote, I'll buy a 6- or 10-foot HDMI cable, where
the main difference is only that HDMI carries audio?

My main confusion is that I think you said DVI has slightly better
resolution under some circumstances? But I am just using it for normal
things, where basic good resolution should be OK for me.


1) All ports have roughly the same resolution choices,
give or take a bit. At least in the current situation,
all are theoretically better than is needed for the
1920x1080 application at 60Hz. I'd have to be more careful
shooting from the hip if your monitor were a 144Hz gamer monitor.

We don't like to push VGA too far, because the cabling is the
issue with VGA. The connector design isn't suited to "high frequency
signaling". So just picking a figure out of the air, I suggest
that maybe 1600x1200 is the point at which the digital ones
might start looking better, and VGA is running out of steam.
At 1024x768, you likely couldn't tell the difference between
VGA and HDMI.
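
If you want rough numbers behind that, the pixel clock is what
climbs with resolution and stresses the VGA cable and connector.
A quick Python sketch (the horizontal/vertical totals are the
standard VESA DMT / CEA-861 timings for 60 Hz, so treat the exact
mode choices as assumptions):

    # Back-of-the-envelope pixel clocks. Totals = active + blanking,
    # taken from the standard VESA DMT / CEA-861 60 Hz timings.
    modes = {
        "1024x768":  (1344,  806),   # DMT, 65 MHz nominal
        "1600x1200": (2160, 1250),   # DMT, 162 MHz nominal
        "1920x1080": (2200, 1125),   # CEA-861, 148.5 MHz nominal
    }
    for name, (htotal, vtotal) in modes.items():
        clock_mhz = htotal * vtotal * 60 / 1e6
        print(f"{name} @ 60 Hz  ~{clock_mhz:.1f} MHz pixel clock")

1600x1200 is already pushing roughly 162 MHz through the analog
path, which is why that's about where VGA runs out of steam.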

2) HDMI appears to have audio in this case. But for its time,
this might have been the first generation of low-end card
with the audio integrated. A previous generation used SPDIF
passthrough, with the HDMI standard unaltered and having "slots"
for 8-channel audio.

3) I can see reports of "funny things happening" with the audio
over DVI. It appears it can work.

https://forums.tomsguide.com/threads...gh-dvi.233582/

The hard part would be digging up technical backing for it.

The "swapping connectors thing" started before audio carriage
existed.

https://en.wikipedia.org/wiki/Digital_Visual_Interface

"For example, an HDMI display can be driven by a DVI-D source
because HDMI and DVI-D both define an overlapping minimum set
of supported resolutions and frame buffer formats.

Some DVI-D sources use non-standard extensions to output HDMI
signals including audio (e.g. ATI 3000-series and
NVIDIA GTX 200-series).[9] Some multimedia displays use a
DVI to HDMI adapter to input the HDMI signal with audio.
Exact capabilities vary by video card specifications."

To me, where this might "break" is if the monitor were
2560x1600 and used dual-link DVI. I doubt the audio would
work over that, because the "overlap of standards" no longer
holds when dual lanes are needed on the DVI carriage. But in
your case, a 165 MHz clock on a single TMDS (transition minimized
differential signaling) interface means "easy swapping" via passive
connector conversion. There's sufficient overlap of standards
at 1920x1080 for this to be possible. The monitor also plays a
part if, say, the designer chose to be picky and "only supported
legacy (no audio) data extraction" on the DVI.

If you had an Apple 30" Cinema display with speakers, this
would be a much more iffy proposition, because you'd need two
lanes on the DVI to make it work.

But the above information suggests "it could happen" at 1920x1080.
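
Just to show the lane arithmetic: single-link DVI tops out at a
165 MHz TMDS pixel clock, and above that dual-link DVI brings in
the second set of data pairs. A small Python sketch (the nominal
clocks are the usual CEA-861 / CVT reduced-blanking figures, so
they're assumptions about the exact mode the monitor requests):

    # Single-link DVI carries one TMDS link, good to a 165 MHz pixel
    # clock; dual-link DVI adds a second link for faster modes.
    SINGLE_LINK_MAX_MHZ = 165

    modes_mhz = {
        "1920x1080 @ 60 Hz (Flatron E2341)": 148.5,   # CEA-861
        "2560x1600 @ 60 Hz (30-inch panel)": 268.5,   # CVT-RB
    }
    for name, clock in modes_mhz.items():
        link = "single-link" if clock <= SINGLE_LINK_MAX_MHZ else "dual-link"
        print(f"{name}: {clock} MHz -> {link} DVI")

At 148.5 MHz the E2341 fits comfortably on the one link that a
passive DVI-to-HDMI adapter carries; the 2560x1600 case does not.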

The Wikipedia article refers to EDID as well, and both HDMI and
DVI carry EDID (the monitor declares what it supports), so that
should work on both of them. HDMI has additional functions
on the extra wires it's got, such as CEC for switching
equipment off. On a TV set, if you hit the power button,
CEC lets the TV power down the Blu-ray player too, because
the TV tells the player "we won't be needing
you now". That channel doesn't exist on DVI, so a DVI to
HDMI adapter would have no signal to drive that pin. There
might be a similar issue with the Audio Return Channel,
which isn't a concern in this case; ARC is more of a
home theater feature. These are examples of pins that aren't
driven when a DVI to HDMI adapter is used. (If you're curious
what the Flatron actually declares, see the EDID sketch after
the pin list below.)

HDMI connector (pins of interest)

Pin 13  CEC (Consumer Electronics Control, e.g. power down)
Pin 14  Reserved (HDMI 1.0–1.3a);
        Utility/HEAC+ (HDMI 1.4+, optional, HDMI Ethernet Channel
        and Audio Return Channel)
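
And if you ever want to see what the Flatron itself declares, the
EDID is the place to look. Here's a rough Python sketch (not a full
EDID parser; "edid.bin" is just a placeholder for a raw dump, which
on Linux sits under /sys/class/drm/*/edid and on Windows under the
monitor's Device Parameters key in the registry). It checks the
CEA-861 extension block for the basic-audio flag, an Audio Data
Block, and the HDMI vendor-specific block:

    # Minimal sketch, not a production EDID parser: report the audio
    # hints in the CEA-861 extension block of a raw EDID dump.
    def audio_hints(edid: bytes) -> dict:
        hints = {"basic_audio": False, "audio_block": False, "hdmi_vsdb": False}
        if len(edid) < 256 or edid[126] < 1 or edid[128] != 0x02:
            return hints                  # no CEA-861 extension block
        ext = edid[128:256]
        hints["basic_audio"] = bool(ext[3] & 0x40)   # byte 3, bit 6
        i, dtd_start = 4, ext[2]          # data blocks: bytes 4..dtd_start-1
        while i < dtd_start:
            tag, length = ext[i] >> 5, ext[i] & 0x1F
            if tag == 1:                  # Audio Data Block present
                hints["audio_block"] = True
            elif tag == 3 and ext[i+1:i+4] == bytes([0x03, 0x0C, 0x00]):
                hints["hdmi_vsdb"] = True # HDMI Licensing OUI 00-0C-03
            i += 1 + length
        return hints

    with open("edid.bin", "rb") as f:     # placeholder dump file name
        print(audio_hints(f.read()))

If the DVI input hands back an EDID without any of those, that's the
"picky designer" case above, and audio over the DVI-to-HDMI route
won't happen.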

Paul

