Thread: Avoid 10 !
Old January 14th 18, 04:28 AM posted to alt.comp.os.windows-10
Paul[_32_]

Neil wrote:
On 1/13/2018 7:54 PM, Paul wrote:
Neil wrote:


I see. What people in the world are doing only matters if it matches
your opinion of what is worth doing. That's a ****-poor test design.


Bud, if you add too many layers to a software
stack, what are you testing exactly? Which layer
is broken? Which layer sucks?

(rest snipped for irrelevance)

That's a pretty off-the-wall attempt at obfuscation to avoid the point
at hand. There are *many* things that *can* be done in Win10 that *can
not be done in Win7*. Whether you or I want or need to do those things
doesn't matter, because there are increasing numbers of people that do.

Why not just hang out in a Win7 ng rather than impose such unjustifiable
insults on people?


I'm here to correct misimpressions.

These OSes are mostly the same on *compute* performance.

Time and again I see claims of clever optimization
that simply don't exist.

If you have an expensive enough platform, a multi-socket
server motherboard, an Epyc, I'm willing to bet there
is a difference between Win10 and Win7. For ordinary
computers (quad core), the difference is zilch. One
*computes* at the same speed as the other. By computing
I mean numerical calculation, which the OS plays no part in.
The OS schedules the process and hands it time slices, and
the process does the rest. If the OS is not generous with
time slices (it reserves cycles for itself), your
compute speed suffers. On Windows 10 this can be worked
around by forking more threads than there are virtual cores
(a.k.a. over-subscription), a technique the Chromium
build environment uses in spades.
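
Just to illustrate the mechanism, here is a rough sketch of
over-subscription in Python. The "crunch" workload and the 2x
factor are placeholders of mine, not anything out of Chromium,
and I'm using processes rather than threads because Python
threads won't run numeric code in parallel.

    import os
    import time
    from concurrent.futures import ProcessPoolExecutor

    def crunch(n):
        # Placeholder CPU-bound job, standing in for a compile
        # step or a numerical kernel.
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        cores = os.cpu_count() or 4
        jobs = [2_000_000] * (cores * 4)

        # First run matches the core count; the second run
        # over-subscribes by 2x, to keep the cores fed even
        # when the scheduler is stingy with time slices.
        for workers in (cores, cores * 2):
            start = time.time()
            with ProcessPoolExecutor(max_workers=workers) as pool:
                list(pool.map(crunch, jobs))
            print(workers, "workers:", round(time.time() - start, 2), "s")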

For other parts, again, there can be differences.
DX12 is claimed to be 30% faster than DX11. I have no
games and haven't tested that claim. But it makes sense,
since one version of DirectX managed individual "objects"
(many context switches), while a later one switched
back to game-designer-controlled display lists. If you
have a game that supports both DX flavors, you can switch
between them and check for a speed difference. The difference
could be significant. As I'm not a gamer, I don't
particularly care at the moment which DirectX version
was the mistake, but for a gamer, this does give an
incentive to move to an OS that supports the "good"
version of DirectX.

For other stuff, you'd have to craft a pretty careful
test case to state with any authority that the new
OS is actually better. It's merely different, a few
deck chairs moved around and so on. Buttery-smooth
animation is great for content consumption (copying
what smartphones look like), but unnecessary for
content creation, or anything requiring close-to-the-iron
performance.
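
A careful test case, to my mind, looks something like the
sketch below: the identical script run on both OSes, a fixed
deterministic workload, several repetitions, and the median
reported so a background task stealing one slice doesn't skew
the result. The workload itself is just a placeholder.

    import time
    import statistics

    def workload():
        # Fixed, deterministic compute kernel; run the very
        # same code on Win7 and Win10.
        total = 0.0
        for i in range(1, 5_000_000):
            total += 1.0 / i
        return total

    def bench(repeats=9):
        times = []
        for _ in range(repeats):
            start = time.perf_counter()
            workload()
            times.append(time.perf_counter() - start)
        # The median is less sensitive to the odd interruption
        # than the mean is.
        return statistics.median(times)

    if __name__ == "__main__":
        print("median:", round(bench(), 3), "s")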

I'm from the club that, years ago when Apple had ZoomRects,
turned it off, as did other users. That was an example of
a (not very smooth) window-opening animation. The joke
at the time was "I got a second of my life back", referring
to the time it took for the animation to play and for the
window to open. I'm consistent today in the belief
that buttery animations aren't necessary, but I
guess somebody likes them. They don't work well on
computers that use the Basic Display Adapter driver (which
includes virtual machines, where they make the Guest slower).
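
The usual way to shut the animations off is the Performance
Options dialog (SystemPropertiesPerformance.exe). If you want
to do it programmatically, something like the sketch below
should work; it calls SystemParametersInfo with the
SPI_SETUIEFFECTS master switch, constants taken from
WinUser.h. Untested here, so treat it as a starting point.

    import ctypes

    SPI_SETUIEFFECTS = 0x103F     # master switch for UI effects
    SPIF_UPDATEINIFILE = 0x01     # persist the change
    SPIF_SENDCHANGE = 0x02        # broadcast WM_SETTINGCHANGE

    def set_ui_effects(enabled):
        # For this action, pvParam carries the BOOL value.
        ok = ctypes.windll.user32.SystemParametersInfoW(
            SPI_SETUIEFFECTS,
            0,
            ctypes.c_void_p(1 if enabled else 0),
            SPIF_UPDATEINIFILE | SPIF_SENDCHANGE,
        )
        if not ok:
            raise ctypes.WinError()

    if __name__ == "__main__":
        set_ui_effects(False)     # animations off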

Paul