Thread: HDMI problem
Old March 28th 18, 03:53 PM posted to alt.windows7.general
Paul[_32_]

Seymore4Head wrote:
> I use two monitors. I have one monitor with HDMI and another monitor
> that uses DVI. The HDMI is primary.
>
> I was using the computer and both monitors went blank. I found the
> problem. My dog chewed the plastic off the HDMI end next to the
> computer.
>
> I replaced the HDMI cable and neither monitor came on. I pulled out
> the Graphics card.
>
> I can plug the DVI into the mobo and it works. When I try to plug in
> the HDMI the second monitor switches from primary to secondary, but
> the HDMI monitor won't come on. I can unplug the HDMI and the other
> monitor becomes primary again.
>
> How do I get both monitors working again and then how do I tell if my
> Graphics card is bad?


The HDMI cable has a +5V wire on it. If the +5V wire touches
the I/O pairs, it could damage either a transmitter pad
driver or a receiver pad.

You did the test with a new HDMI cable, the monitor didn't work.
That means something on the video card end is blown.

In a situation like this, you should do as much testing as possible
with the "damaged" goods, so you don't endanger "new" goods just yet.

Use the working outputs on some video source, and the
ports on the monitor, to assure yourself that the monitors
haven't been damaged in any way. You at least want to verify
the monitors, before jumping to any conclusions.

This means trying permutations and combinations using the
existing goods, to work out whether the video card has
a defective output, or the monitors have something blown.
Individual tests, one test case at a time, should suffice.
(I.e. an electrical failure should not be causing
desktop monitor order issues). Just make sure each
port works individually, as part of your damage
assessment. You can test the monitor ports, one at
a time, using the chipset/motherboard connectors.
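The damage-assessment above amounts to walking a small test matrix,
one source-to-monitor pairing at a time. As a rough checklist sketch
(the port names here are assumptions; substitute your own gear):

```python
from itertools import product

# Hypothetical inventory -- edit these lists to match your actual ports.
sources = ["motherboard DVI", "motherboard HDMI",
           "video card DVI", "video card HDMI"]
monitors = ["monitor 1 (HDMI input)", "monitor 2 (DVI input)"]

# Enumerate every individual test case: one source, one monitor,
# one cable at a time. Record pass/fail for each to isolate the
# blown port.
for source, monitor in product(sources, monitors):
    print(f"Test: {source} -> {monitor}")
```

Working through the list in order tells you which side is dead:
a monitor that fails on every source is a bad monitor; a source
that fails on every monitor is a bad output.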

*******

Onboard (so-called "chipset") graphics can have usage
limitations. The HDMI and DVI-D could actually share
the same wire lanes, and the manufacturer "intended" the
user to use only one at a time. On a chipset DVI-I,
a manufacturer sometimes splits the connector and
makes a DVI-D connector and a VGA connector. The driver
may not support using both at once, depending on how
it routes the crossbar on the GPU output.

These details are not documented anywhere. The hardware
maker is "ashamed" to admit to such things. It is left
to people in forums to document "I can't run DVI and
VGA at the same time on this motherboard".

Newer gear doesn't have these limits. You can run three
monitors off the back of the motherboard.

You're going to need to datamine, using CPU/chipset/motherboard
details, to get some usage rules.

*******

Now, pop another video card in there, and move on.
Nobody really likes chipset graphics all that much,
especially when driving multiple monitors.

One disadvantage of modern video cards is that they've lost
the VGA connector. You might find DVI, HDMI, DisplayPort,
but we appear to have passed the VGA era. In an emergency,
you can convert DisplayPort to VGA with an adapter,
so that remains as an option, but increases the
price of the "repair".

A low-end card now can be two slots wide, even though
it could have been made a bit thinner. You may face both
a connector issue (not the right mix for your existing
monitors) and the thickness of the card perhaps
interfering with other stuff in the PC.

Good luck,
Paul