A Windows XP help forum. PCbanter

Windows XP Font Rendering Issue



 
 
  #1
July 24th 16, 10:05 AM, posted to microsoft.public.windowsxp.general
gargoyle60 (Posts: 69)

Need some help with my desktop computer...

Previously I had been using an AOC Razor e2343F (23") monitor attached via a DVI-D (single
link) cable and have always had a clear monitor image with sharp edges.

I have just swapped to a larger screen, an LCD 32" TV (LG 32LD450). This doesn't have a DVI socket
so I am using good old VGA. Graphics and colour appear crystal clear but I have noticed that
rendering of fonts (or text generally) is not so clear under Windows XP (but fine under Linux). I
have reinstalled the display driver but this hasn't cleared up the font rendering issue. There are no
reported font problems as far as I can tell.

Any suggestions please?

The graphics card is an ATI Sapphire Radeon X300.
Screen dimensions/resolution settings: Windows: 1360x768 (16:9), Linux: 1920x1080 (16:9).
Incidentally, I did purchase a full DVI-I (dual link) to HDMI converter cable (the TV has 2 HDMI
sockets) but the TV didn't perform too well: I got a lot of screen size distortions and very
limited available resolution settings under both Windows and Linux, hence my now using a VGA
cable.
  #2
July 24th 16, 10:55 AM, posted to microsoft.public.windowsxp.general
Mike S[_4_] (Posts: 496)

On 7/24/2016 2:05 AM, gargoyle60 wrote:
[]
rendering of fonts (or text generally) is not so clear under Windows XP (but fine under Linux). I
have reinstalled the display driver but this hasn't cleared up the font rendering issue.
[]


Have you tried this?

Improving the font quality in Windows XP

By default, Microsoft Windows XP has smooth fonts enabled; however,
adjusting this setting can sometimes help improve the look and quality
of your text. To adjust or change settings within this feature, follow
the steps below.
1. Open the Control Panel.
2. Double-click the Display icon.
3. Click the Appearance tab.
4. Click the Effects button.
5. Verify that the 'Use the below method to smooth edges of screen fonts'
   box is checked. We recommend that you utilize the standard method, but
   some users may find it better to use ClearType.
The appearance of a font can also be improved by increasing your video
resolution, and by verifying that your computer has the latest drivers
for the video card.

http://www.computerhope.com/issues/ch000557.htm
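
For anyone who would rather script this than click through the dialog, the checkbox and the
Standard/ClearType choice are generally understood to be backed by two per-user registry values
under HKCU\Control Panel\Desktop. A minimal Python sketch, assuming a Python with the standard
winreg module is available on the machine (the Python 2.x builds of the XP era call it _winreg):

import winreg  # named _winreg on the Python 2.x builds of the XP era

def enable_cleartype():
    # FontSmoothing is a *string* value: "2" = smoothing on, "0" = off.
    # FontSmoothingType is a DWORD: 1 = Standard smoothing, 2 = ClearType.
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                        0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "FontSmoothing", 0, winreg.REG_SZ, "2")
        winreg.SetValueEx(key, "FontSmoothingType", 0, winreg.REG_DWORD, 2)
    # A log off/log on (or a SystemParametersInfo broadcast) is needed before
    # the desktop picks the change up.

enable_cleartype()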


  #3
July 24th 16, 11:21 AM, posted to microsoft.public.windowsxp.general
gargoyle60 (Posts: 69)

On Sun, 24 Jul 2016 02:55:22 -0700, Mike S wrote:

On 7/24/2016 2:05 AM, gargoyle60 wrote:
[]
Have you tried this?
[]
We recommend that you utilize the standard method, but some users may find it better to use
ClearType.
[]


I have switched to using ClearType and it made quite a difference. I hadn't thought of that.
Thanks for the help.
  #4
July 24th 16, 03:56 PM, posted to microsoft.public.windowsxp.general
JJ[_11_] (Posts: 744)

On Sun, 24 Jul 2016 11:21:47 +0100, gargoyle60 wrote:

I have switched to using ClearType and it made quite a difference. I hadn't thought of that.
Thanks for the help.


You might also want to try tuning it up using ClearType Tuner.

https://www.microsoft.com/typography/ClearTypePowerToy.mspx
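
(The tuner's sliders reportedly come down to a couple more values in the same registry key; a
rough sketch, assuming the usual FontSmoothingGamma (ClearType contrast, roughly 1000-2200) and
FontSmoothingOrientation (1 = RGB sub-pixel order, 0 = BGR) value names:)

import winreg

def tune_cleartype(gamma=1400, rgb=True):
    # The orientation must match the panel's sub-pixel layout or colour fringes appear.
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                        0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "FontSmoothingGamma", 0, winreg.REG_DWORD, gamma)
        winreg.SetValueEx(key, "FontSmoothingOrientation", 0, winreg.REG_DWORD,
                          1 if rgb else 0)

tune_cleartype()  # like the other font smoothing values, takes effect at the next logon
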
  #5
July 25th 16, 01:42 AM, posted to microsoft.public.windowsxp.general
VanguardLH[_2_] (Posts: 10,881)

gargoyle60 wrote:

[]
rendering of fonts (or text generally) is not so clear under Windows XP (but fine under Linux).
[]
The graphics card is an ATI Sapphire Radeon X300.
Screen dimensions/resolution settings: Windows: 1360x768 (16:9), Linux: 1920x1080 (16:9).
[]


Configure your video card to use the SAME resolution as the NATIVE
resolution of the monitor. Having to use interpolation to determine
what pixels to illuminate, how much, and for which color means
inaccurate guesses that account for artifacts, like color tinging and
loss of focus.

If your video card/controller cannot go as high as the native resolution
of the monitor then you're stuck with interpolation artifacts until you
also upgrade to a better video card.

Also, you got a larger monitor but how much increase did it have in its
native resolution over your old monitor? Twice the surface area does
not equate to twice the resolution. As the monitor gets bigger, and
because resolution does not increase linearly, you need to increase the
distance between you and the monitor. Sitting up close to a monitor
means you eventually get too close and can physically distinguish the
boundaries of the pixels. It will get grainy because you are too close.

According to its manual, your new monitor is only capable of 1920 x
1080p for its native resolution. I can get that in a 22" monitor, so
obviously increasing its size to 32" would mean the pixels get bigger
with more space between them so it would look more grainy unless I moved
farther away from it.
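
To put numbers on "the pixels get bigger": pixel density in pixels per inch (PPI) is the diagonal
pixel count divided by the diagonal size. A quick worked sketch using the two sizes in this thread:

from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    # pixels along the diagonal divided by inches along the diagonal
    return hypot(width_px, height_px) / diagonal_inches

for size in (23, 32):
    print('%d" at 1920x1080: %.1f PPI' % (size, ppi(1920, 1080, size)))
# 23" at 1920x1080: 95.8 PPI
# 32" at 1920x1080: 68.8 PPI  -- the same pixel grid, but each pixel is roughly 40% wider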

Is your video card capable of 1920 x 1080? "ATI Sapphire Radeon X300"
only tells me it is circa 2004-2005 hardware (over 10 years old), plus
it is an entry-level chip in their X300-X600 series. Look at
https://en.wikipedia.org/wiki/Radeon_X300-X600_Series and you'll see
that native resolution is not mentioned. I found an X300 card online
with specs saying it supported 2048 x 1536 at 85 Hz. So it should be
able to handle your new and bigger monitor and even at its native
resolution of 1920 x 1080.

Did you configure your X300 video card to use the maximum and native
resolution of 1920 x 1080 for the new monitor? Going higher just
because the card can do so means artifacts from interpolating a screen
resolution, like 2048 x 1536, down to the native resolution (1920 x
1080). I found an online listing for your old monitor. Its max and native
resolution is 1920 x 1080. So your old and new monitors have the SAME
max and native resolution. Well, you got a bigger monitor but the
resolution did not change. What did you think would happen when the
screen got bigger but the total number of pixels remained the same?

You are running the monitor at its max & native resolution of 1920 x
1080 when running Linux. Why are you using a smaller screen resolution
when running Windows (which will incur interpolation artifacts)? You
always want to run the monitor at its native resolution.

With DVI or HDMI, the video card and monitor will negotiate to use the
monitor's native resolution unless you override that setting.
Apparently you only have a VGA connector (15-pin Dsub) on your old X300
video card. You can get a VGA-to-DVI or VGA-to-HDMI adapter to use on
the end of your old VGA cable but you will only get the VGA signals to
the monitor. There will be no backwards communication from monitor to
video card. That means *YOU* must properly configure the screen
resolution to match the monitor's native resolution. You did so under
Linux but for some reason chose not to do so under Windows.
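
As an aside, that manual step can also be scripted on XP through the Win32 display-settings calls;
a minimal sketch, assuming the third-party pywin32 package is installed:

import win32api, win32con

def set_native_mode(width=1920, height=1080):
    # Start from the current mode and ask the driver for the monitor's native resolution.
    mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
    mode.PelsWidth = width
    mode.PelsHeight = height
    mode.Fields = win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT
    # Returns win32con.DISP_CHANGE_SUCCESSFUL (0) if the driver accepts the mode.
    return win32api.ChangeDisplaySettings(mode, 0)

set_native_mode()  # same effect as Display Properties > Settings, just without the dialog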

Your old and new monitors both have a native (max) resolution of 1920 x
1080. Your video card could go higher but that would incur
interpolation artifacts. You should NOT go lower than the monitor's
native resolution, so configure your Windows to also use 1920 x 1080.
If text gets too small for your comfort at the higher resolution then
change the DPI setting in Windows (to use more pixels per character
height). That will enlarge the text and can also make it look smoother
because more pixels are used to render each character.
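
The DPI knob works on the text rather than on the screen: Windows converts font point sizes to
pixels using the logical DPI (1 point = 1/72 inch), so moving from the default 96 DPI ("Normal
size") to 120 DPI ("Large size") gives each glyph more pixels. A quick worked example:

def points_to_pixels(points, dpi):
    # 1 point = 1/72 inch, so pixel height = points * dpi / 72
    return points * dpi / 72.0

for dpi in (96, 120):   # XP's Normal size vs Large size settings
    print("10 pt at %d DPI -> %.1f px tall" % (dpi, points_to_pixels(10, dpi)))
# 10 pt at 96 DPI -> 13.3 px tall
# 10 pt at 120 DPI -> 16.7 px tall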

Can't do anything about you going to a bigger monitor that still has the
same 1920 x 1080 native resolution as your old monitor. Bigger means the
pixels and the spacing between them are bigger, so the screen looks more
grainy, and you'll have to sit farther away from the monitor for the same
smoothness to which you were accustomed with the old, smaller monitor of
the same resolution. If you had gone to a bigger monitor with a higher
resolution, the graininess would have been partly compensated for.
However, a change from 1920 x 1080 (2,073,600 pixels) to 2048 x 1536
(3,145,728 pixels) would give an increase of about 52% in pixel count for
an increase of ??? in surface area (I don't know the paintable dimensions
of your old and new monitors). Another problem also crops up: a change in
aspect ratio. 1920/1080 ≈ 1.78 (16:9) while 2048/1536 ≈ 1.33 (4:3), so you
would be changing from a wide display to a much taller one. Usually people
go to a higher resolution but pick one that is 16:9, and then wonder why
all the characters in their games turned into midgets. That's why some
folks won't buy widescreen monitors. However, you used to be at 16:9, so a
higher resolution at 4:3 would make everything look much taller: 1920 to
2048 is not much of a change in width, but 1080 to 1536 is a significant
change in height.
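
The arithmetic above, spelled out:

old = (1920, 1080)    # native mode of both the 23" AOC and the 32" LG
high = (2048, 1536)   # highest mode the X300 reportedly supports

old_px = old[0] * old[1]      # 2,073,600 pixels
high_px = high[0] * high[1]   # 3,145,728 pixels
print("pixel count increase: %.0f%%" % (100.0 * (high_px - old_px) / old_px))   # ~52%
print("aspect ratios: %.2f vs %.2f" % (old[0] / old[1], high[0] / high[1]))     # 1.78 (16:9) vs 1.33 (4:3)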

Your new monitor uses 16:9 just like your old monitor, so that did not
change. Your video card will support the old and new monitors' native
resolution (which is 1920 x 1080 for both). However, you changed from a
23" to 32" monitor. The new monitor is bigger without any increase in
resolution. Even if you did change Windows to use the 1920 x 1080
native resolution of your new monitor, and even if you enable ClearType
to fuzzy up the characters to make them look more contiguous between
pixels, there is still the problem that the amount of "paint" used to
coat your old smaller monitor is the same amount of "paint" used to coat
your new bigger monitor. Alas, the "paint" is not analog but pixelated
(a bunch of distinct dots). Dot paintings don't look as sharp when you
stretch the canvas unless you also step back. Pointillism doesn't work
when the canvas is enlarged unless the number of dots also increased.

http://divisionismgeorgesseurat.weeb...00692_orig.jpg
What do you think this painting would look like if you stretched the
canvas to twice its size but kept the number of dots the same? It would
get harder for you to connect the dots to recognize the content.

http://www.lg.com/us/products/docume..._32LD450-4.pdf
Max/native resolution = 1920 x 1080p (same as your old monitor)

Too bad that when you spent the money on a new and bigger monitor you
did not get one with a higher resolution (within the capabilities of
your video card). Typically when I upgrade a video card or monitor, I
do them together so I get the max benefits of both. With a bigger
monitor, you want more pixels. With more pixels, you want a better
video card to handle that many.
  #6
July 25th 16, 09:32 AM, posted to microsoft.public.windowsxp.general
gargoyle60 (Posts: 69)

On Sun, 24 Jul 2016 19:42:04 -0500, VanguardLH wrote:

[]
Configure your video card to use the SAME resolution as the NATIVE
resolution of the monitor.
[]
You should NOT go lower than the monitor's native resolution, so configure
your Windows to also use 1920 x 1080. If text gets too small for your
comfort at the higher resolution then change the DPI setting in Windows.
[]
Typically when I upgrade a video card or monitor, I do them together so I
get the max benefits of both. With a bigger monitor, you want more pixels.
With more pixels, you want a better video card to handle that many.


Wow. That's a lot of info to take in on a Monday morning. Thanks.
Now I will rub the sleep from the corners of my eyes...

I have always been very pleased with my old AOC 23".
My only reason for switching to this 32" size was because I inherited the TV, so it was free.

After posting my original message I had already switched to 1920x1080 for Windows as well.
With a few adjustments (ClearType, DPI, etc.) I am now getting a good quality picture with clear
edges. In all cases I personally prefer the 16:9 aspect ratio. 4:3 seems so antiquated to me.

Yes, the card can go up to 85 Hz at the larger resolution. I tried 2048x1536 @ 85 Hz but 1920x1080 is
adequate for my needs. I don't do gaming, so presently I see no reason to upgrade to a better
graphics card.

The card has both VGA and DVI-I (Dual Link) 29-pin. As I mentioned, I tried using a DVI-HDMI
converter cable but for some reason the 32" didn't perform well. It's old, as is most of my
equipment, as you pointed out.

From your advice I will certainly consider upgrading both card and monitor together in the future
should I ever need to. Thanks for all the information. Very interesting.

Curiously, when I tried 2048x1536 @ 85 Hz I noticed some elements disappearing beyond the edges of
the screen - task bar, desktop icons, etc. This happened in both Windows and Linux. I didn't try to
correct this but I assume that changes to certain settings could fix this problem?
  #7
July 25th 16, 09:53 AM, posted to microsoft.public.windowsxp.general
VanguardLH[_2_] (Posts: 10,881)

gargoyle60 wrote:

In all cases I personally prefer the 16:9 aspect ratio. 4:3 seems so
antiquated to me.


With my limited experience (under 250 users), those that like 16:9
mostly use graphical applications (watch movies, play games, image or
video editing, etc) and they like more screen real estate to pollute
their desktops with lots more icons. Those that like 4:3 work mostly on
documentation. They like the more square layout and especially
appreciate the higher vertical resolution so characters are more crisp.
It also depends on what you were used to before. I had a 4:3 monitor,
went to a 16:9 one, and stuff looked squashed in games until I upgraded or
moved to later games that adapted to the changed aspect ratio. The
monitor changed and I (and software) adapted, so now 16:9 is acceptable.
Those that had a 4:3 monitor often did not like going to a 16:9 monitor.
In fact, for some users, I had to specifically get them a 4:3 monitor so
they were comfortable with the aspect ratio to which they had become
accustomed. They wanted to continue working rather than spend time and
effort to adapt.

It's all part of tweaking the feature set of your computer. Some users
like lots of noise for the desktop. They pick background images and
even include slide shows. I prefer a single color for the background
(in fact, I went to black and use a dark blue for the Windows taskbar
since I want functionality over glitz). Tweaking is why so many users
like Windows: you can almost always find a way to do whatever you want rather
than having to adapt to a specific shaped box. Punch a hole there,
squish out the walls to conform to your shape, add padding, etc.

Yes, the card can go up to 85 Hz at the larger resolution. I tried
2048x1536 @ 85 Hz but 1920x1080 is adequate for my needs. I don't do
gaming, so presently I see no reason to upgrade to a better graphics
card.


You wouldn't want to go above 1920 x 1080. That is the native
resolution of both your old and new monitors. While 2048 x 1536 would
give a big jump in the number of pixels your video card could display,
it cannot make the monitor magically acquire any more pixels. You want
to use a video card resolution that matches the monitor's native
resolution.

Actually you would see much less change in quality of picture in games
than in documents or static graphics. Your eyes (brain) accommodate
movement with blur to allow for "seeing" continuous movement. That is
why stores will always have moving pictures or movies showing on the
computer monitors, because it makes it more difficult for you to see what
the monitor can actually do with static text and static images. What
looks good when playing a movie may be poor quality when viewing a
static screen.

Curiously, when I tried 2048x1536 @ 85 Hz I noticed some elements
disappearing beyond the edges of the screen - task bar, desktop
icons, etc. This happened in both Windows and Linux. I didn't try to
correct this but I assume that changes to certain settings could fix
this problem?


You are trying to use a screen resolution beyond the monitor's
capability. It was not designed to operate at the frequencies or screen
size of the higher resolution. Your video card, although old, can
support a higher screen resolution than your old and new monitors can.
  #8
July 25th 16, 10:13 AM, posted to microsoft.public.windowsxp.general
gargoyle60 (Posts: 69)

On Mon, 25 Jul 2016 03:53:02 -0500, VanguardLH wrote:

Your eyes (brain) accommodate
movement with blur to allow for "seeing" continuous movement. That is
why stores will always have moving pictures or movies showing on the
computer monitors, because it makes it more difficult for you to see


Yes, I noticed that. Cheeky tactic.
  #9
July 25th 16, 11:46 PM, posted to microsoft.public.windowsxp.general
J. P. Gilliver (John) (Posts: 5,291)

In message , VanguardLH
writes:
[]
Configure your video card to use the SAME resolution as the NATIVE
resolution of the monitor. Having to use interpolation to determine
what pixels to illuminate, how much, and for which color means
inaccurate guesses that account for artifacts, like color tinging and
loss of focus.


In theory, using an exact _sub_multiple (i. e. half, one-third,
one-quarter) doesn't cause interpolation; however, in practice these
make for such low resolutions that nobody would want to use them -
except in some cases of people with very bad eyesight, who _need_ the
low resolution. They also make for unusual modes (e.g. 960 × 540) which
a lot of video cards can't do.
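
For the record, the exact sub-multiples of a 1920 × 1080 panel are easy to enumerate; each logical
pixel then covers a whole n × n block of physical pixels, so no interpolation is needed:

# Integer sub-multiples of a 1920x1080 panel.
native_w, native_h = 1920, 1080
for n in range(2, 5):
    if native_w % n == 0 and native_h % n == 0:
        print("1/%d scale: %d x %d" % (n, native_w // n, native_h // n))
# 1/2 scale: 960 x 540
# 1/3 scale: 640 x 360
# 1/4 scale: 480 x 270
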
[]
Apparently you only have a VGA connector (15-pin Dsub) on your old X300
video card. You can get a VGA-to-DVI or VGA-to-HDMI adapter to use on
the end of your old VGA cable but you will only get the VGA signals to
the monitor. There will be no backwards communication from monitor to
video card. That means *YOU* must properly configure the screen
resolution to match the monitor's native resolution. You did so under
Linux but for some reason chose not to do so under Windows.


VGA does have _some_ reverse data (e. g. whatever is needed to give
"plug-and-play"), but you are right, there's not enough in the standard
to do a resolution handshake.
[]
--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Illinc fui et illud feci, habe tunicam? ("Been there and done that; got the T-shirt?")
  #10
July 25th 16, 11:59 PM, posted to microsoft.public.windowsxp.general
J. P. Gilliver (John) (Posts: 5,291)

In message , gargoyle60
writes:
[]
After posting my original message I had already switched to 1920x1080

[]
Curiously, when I tried 2048x1536 @ 85 Hz I noticed some elements
disappearing beyond the edges of
the screen - task bar, desktop icons, etc. This happened in both
Windows and Linux. I didn't try to
correct this but I assume that changes to certain settings could fix
this problem?


Possibly, but it'd be non-optimum: the screen displays 1920×1080,
whatever you feed it with, so anything else will cause it to blur across
the pixels. Also, you'd be feeding a 16:9 monitor with a 4:3 signal, so
it would struggle, or produce a distorted picture, possibly with bits
cropped off.

If you told Windows to use that resolution, it would, I'm pretty sure,
still include the task bar and - since it is a _higher_ resolution -
all the icons you had at the lower one; so they're in what's coming out
of the computer. Whether you could adjust the monitor's width and height
controls to bring them back on screen, I'm not sure, but you wouldn't
get any better a picture - almost certainly worse (blurred, and
stretched horizontally if it showed the full video signal).
--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

"Tolerating intolerance is not a virtue." - Barry Shein
  #11
July 26th 16, 01:24 AM, posted to microsoft.public.windowsxp.general
VanguardLH[_2_] (Posts: 10,881)

J. P. Gilliver (John) wrote:

VGA does have _some_ reverse data (e. g. whatever is needed to give
"plug-and-play"), but you are right, there's not enough in the standard
to do a resolution handshake.


When I first switched from VGA out of the video card to DVI (after
getting a monitor with DVI in), I noticed the resolution was
automatically selected for me. Some entries in the monitor's OSD got
disabled and some entries in the display settings within Windows got
disabled. Huh? Then I figured out that the monitor was giving info to
the video card telling it what it could handle, and the video card honored
those settings.

Previously I had to retrieve an .inf file for the monitor to install
into Windows so Windows knew what resolutions and frequencies the
monitor could handle. The user had to tell Windows via INF file what it
could tell the video card to use with that monitor. Not all monitors
had their .inf files included in an install of Windows so quite often I
had to go to the monitor maker's web site to get the .inf file. I was
still in the habit of doing that after switching to DVI, only to be baffled
as to why the monitor makers weren't supplying the .inf files that I thought
were required. They aren't required anymore (with DVI or HDMI).

The pin-out for a 15-pin Dsub VGA connector is:

Top row            Middle row              Bottom row
Pin 1: Red         Pin 6: Red ground       Pin 11: ID0 (gnd=color, NC=mono)
Pin 2: Green       Pin 7: Green ground     Pin 12: ID1 (unused; or gnd=mono, NC=color)
Pin 3: Blue        Pin 8: Blue ground      Pin 13: Horizontal sync
Pin 4: reserved    Pin 9: unused           Pin 14: Vertical sync
Pin 5: Ground      Pin 10: Ground          Pin 15: unused
(where NC means no connection, i.e. left to float/pull high)

Horiz and vert sync were not communicated two-way. It was the video card
sending a sync signal one way to the monitor; i.e., just more dumb
analog signaling. The only information that Windows knew about was that you
had a monitor attached, and it listed it as a standard [CRT] monitor. ID0
let Windows know if it was a color or monochrome monitor. Not much of
plug-n-play functionality there. It is possible to employ the unused
pins with a matched monitor and video card to pass more info but this
would be vendor specific and require a matched set of devices and often
required mfr-supplied specialty cables. Obviously that was not
Plug-n-Play functionality (to simply see a monitor was connected or not,
that it was color or monochrome, and maybe a matched vendor-specific
monitor-video-cable set might provide more info to a vendor-specific
specialty driver).

So how did VGA ever get Plug-n-Play functionality? At some point, VGA
got redefined to use the previously unused pins of 11, 12, 4, and 15 as
a matrix for monitor identification. Alas, there were lots of monitors and
video cards that did not support this matrix, so you got stuck with
Windows only knowing a color/mono monitor was attached. Then came the
DDC spec.

https://en.wikipedia.org/wiki/Display_Data_Channel
https://en.wikipedia.org/wiki/Extend...ification_Data

The 2nd article has a table showing EDID values. Lots more
information was available, but that doesn't mean the monitor supported
them all, that it provided them in a standard presentation format, that
the video card would pass them, or that Windows knew what to do with all the values.
S.M.A.R.T. is a standard, too, but HDD makers were too, um,
"imaginative" in their own definitions.
  #12
July 26th 16, 10:57 PM, posted to microsoft.public.windowsxp.general
J. P. Gilliver (John) (Posts: 5,291)

In message , VanguardLH
writes:
J. P. Gilliver (John) wrote:

VGA does have _some_ reverse data (e. g. whatever is needed to give
"plug-and-play"), but you are right, there's not enough in the standard
to do a resolution handshake.

[]
So how did VGA ever get Plug-n-Play functionality? At some point, VGA
got redefined to use the previously unused pins of 11, 12, 4, and 15 as
a matrix for monitor identification. Alas, there lots of monitors and
video cards that did not support this matrix so you got stuck with
Windows only knowing a color/mono monitor was attached. Then came the
DDC spec.

[]
Thanks, you've explained how I've plugged in VGA monitors and had the
computer know something about them.
--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Anybody can garble quotations like that -- even with the Bible... Er... "And he
went and hanged himself (Matthew 27:5). Go, and do thou likewise (Luke 10:37)."
 



