Tim Slattery wrote:
4K 3840 x 2160 === from a spec web page
Hi-DPI (it's 2X of an HD monitor at 1920x1080).
So you're driving it with the wrong video card and it's
not running native for starters.
It is at its native resolution, no doubt of that.
Have a look through Wikipedia in the HDMI and DisplayPort
sections for details. VGA can't go that high; it typically
stops at 2560 or so (the VGA spec itself sets no limit,
but the DACs don't go that high). Dual link DVI won't
go that high either.
DVI dual link 2560x1600 at 60Hz
We're using a DVI connection. The resolution is the full 3840 x 2160.
What's your refresh rate ?
Go back to the Windows Display panel and check.
A 4K panel might support 24Hz (not because the LCD
panel works at 24Hz, but because it can fake it digitally).
I don't recollect anyone mentioning refresh rates below that.
Even at 24Hz, I'm not sure the reduction in clock rates
makes this possibility a reality. Maybe someone in
avforums.com knows, as I'm not up on the hardware
details at all. If the specs were open distribution,
I might have collected and read some of this.
This is dual link DVI
R 1650mbit/sec \
G 1650mbit/sec \
B 1650mbit/sec \
CLK 165MHz  \___ 2560x1600 at 60Hz CVT-RB reduced blanking
R 1650mbit/sec / One clock cycle equals 10 bits, 8B10B code,
G 1650mbit/sec / 8 bit per pixel output from decoder.
B 1650mbit/sec / 24 bit total on three guns.
CLK 165MHz / Clock equals pixel rate.
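To sanity-check that arithmetic, here's a small Python sketch. The blanking
totals are approximate CVT reduced-blanking figures I've filled in myself
(they aren't in the post), so treat the output as illustrative:

```python
# Rough sanity check: does dual link DVI have the headroom for each mode?
# Timing totals (active pixels + blanking) are approximate CVT-RB figures.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz = total pixels per frame * frames per second."""
    return h_total * v_total * refresh_hz / 1e6

# 2560x1600 at 60Hz, reduced blanking: roughly 2720 x 1646 total
clk_2560 = pixel_clock_mhz(2720, 1646, 60)
print(f"2560x1600: {clk_2560:.1f} MHz total, {clk_2560 / 2:.1f} MHz per link")
# Split across the two links, each side stays under the 165 MHz TMDS limit.

# 3840x2160 at 60Hz, reduced blanking: roughly 4000 x 2222 total
clk_4k = pixel_clock_mhz(4000, 2222, 60)
print(f"3840x2160: {clk_4k:.1f} MHz total, {clk_4k / 2:.1f} MHz per link")
# Even split over two links, each side needs ~267 MHz, far past 165 MHz.
```

So 2560x1600 fits dual link DVI with room to spare, while 4K at 60Hz
overshoots it even with both links working together.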
Even if you plugged a (single link) HDMI source into
a DVI adapter, the numbers are going to look like this.
I got the 12.54 number from Wikipedia.
R 12540 mbit/sec \
G 12540 mbit/sec \
B 12540 mbit/sec \
CLK 313.5MHz  \___ 3840x2160 at 60Hz CVT-RB reduced blanking
/ 12.54 Gbit/s per channel.
R - / When clock rate goes over 340MHz, the
G - / clock switches to 40 bit mode and four
B - / symbols are sent per clock. And the clock
CLK - / is still well above the DVI limit.
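The per-channel figures in both diagrams fall out of one multiplication:
bit rate = TMDS clock times bits carried per clock cycle. A tiny sketch,
assuming the HDMI 2.0 behavior described above (one 10-bit symbol per clock
below 340MHz; four symbols, i.e. 40 bits, per clock above it):

```python
# Per-channel TMDS bit rate = clock * bits per clock cycle.
# Below 340 MHz: one 10-bit symbol per clock (DVI / older HDMI).
# Above 340 MHz: clock runs at 1/4 rate, four 10-bit symbols per clock.

def channel_rate_mbit(clk_mhz, bits_per_clock):
    return clk_mhz * bits_per_clock

# Dual link DVI case: 165 MHz clock, 10-bit mode.
print(channel_rate_mbit(165, 10))     # 1650 Mbit/s per channel

# 4K case from the diagram: 313.5 MHz clock, 40-bit mode.
print(channel_rate_mbit(313.5, 40))   # 12540 Mbit/s per channel
```

Both results match the figures in the diagrams above, which is why the
4K numbers are so far outside what a DVI receiver is built for.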
Is the DVI block inside the monitor going to be expecting
the 40 bit mode ? Just look at the numbers involved.
Think about the loss in the DVI cable at that baud rate,
and how small the receive amplitude is going to be at
the DVI receiver.
While they could stuff an HDMI PHY on the DVI port,
there would still be "front end" issues... gear
grinding that makes me suspicious. At those
kinds of data rates, you'd expect equalization of
some sort. Would the DVI block normally need
equalization ? Probably not. But they could
use an HDMI 2.0+ PHY and "fake the rest of it"
with software modes. Who can really guess what
"outside the spec" behaviors exist ?
If you gave me more details, like the video card
involved, maybe we'd find it doesn't even have
HDMI 2.0, in which case native is definitely
impossible. That would save me having to cook
up technical reasons :-)