March 12th 07, 12:10 PM posted to microsoft.public.windows.vista.general,microsoft.public.windowsxp.customize
HEMI-Powered
Does Vista Actually Want 18 Gigabytes Of Disc Space?

Today, cquirke (MVP Windows shell/user) made these interesting
comments ...

Nobody debugs until the code is bug-free, because that can be
shown mathematically to be impossible, never mind the cost.


Yep, it's a Goedel thing. It's debatable whether there really
is a schism between determinism and chaos, or whether it's
just a matter of computational power limitations.

With enough energy, can you break molecules, atoms,
electrons...?

With enough computational power, can you break DES, or the
mysteries of turbulence, or predict the stock market?
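The DES one, at least, you can put a number on: the key is only
56 bits, so brute force is just a counting exercise. Here's the
back-of-the-envelope sum in Python; the keys-per-second rate is
my own assumption, purely for illustration.

# Rough brute-force arithmetic for the DES question.  DES uses a
# 56-bit key; the trial rate below is an assumed figure for
# illustration, not a benchmark of any real hardware.
keyspace = 2 ** 56                       # ~7.2e16 possible keys
keys_per_second = 1e9                    # assumed: a billion trial keys/sec
worst_case_years = keyspace / keys_per_second / (3600 * 24 * 365)
print(f"worst case: about {worst_case_years:.1f} years at 1e9 keys/sec")
print(f"expected:   about {worst_case_years / 2:.1f} years (half the keyspace)")
# Dedicated hardware (EFF's Deep Crack, 1998) found keys in days,
# so "enough computational power" really does settle this one.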


In producing anything, hard or soft, the Law of Diminishing Returns
is what determines how many bugs you will squash. You CAN rid the
product of more, but it will take longer and cost more, both of
which are denied to you as the developer by your manager.

If you take the "bugs tend towards zero but never really reach
zero" perspective, then how far down the slope you need to go
depends on how complex the system you're debugging is. A
1:100 error rate may be low enough for 5-line batch
files, and a 1:1,000,000 error rate may be OK for DOS and small
apps... but for today's code, processed at over a million
instructions a millisecond, you're unlikely to beat Murphy.
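That arithmetic is easy to check. A quick sum in Python, using
the defect rates quoted above; the line count for a modern OS is
my own round figure, not a real measurement.

# Expected residual bugs at a given defect rate and code size.
def expected_bugs(lines_of_code, defects_per_line):
    return lines_of_code * defects_per_line

print(expected_bugs(5, 1 / 100))                 # 5-line batch file: 0.05
print(expected_bugs(100_000, 1 / 1_000_000))     # small DOS-era app: 0.1
print(expected_bugs(50_000_000, 1 / 1_000_000))  # ~50M-line OS (assumed): 50
# At "over a million instructions a millisecond" (1e9 per second),
# code paths are exercised so often that even very rare residual
# bugs eventually surface.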


Exactly. Suppose, for example, that operating systems and apps
were sold Sears-style: good, better, and best. You could then
choose by price alone, from the version with the most bugs and
bloat down to the one with the least. Wish it worked that way,
but it doesn't. The degree of bug-freeness and bloat is an
economic compromise, struck to bring the product in on time at
the price target selected by management. And, yes, computer-aided
testing is a great help, as is statistical analysis to predict
the impact of remaining problems and to predict how the product
will work. So, too, is alpha and beta testing, including beta
testing by the Visa cards of the early adopters.
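As an aside, one common form that "statistical analysis to
predict the impact of remaining problems" takes is a
capture-recapture estimate from two independent test passes;
that's one textbook method, not necessarily what any particular
shop uses, and the defect counts below are made up for the demo.

# Lincoln-Petersen capture-recapture estimate of total defects,
# based on two independent test teams.  Counts are invented.
found_by_team_a = 60        # defects team A reported
found_by_team_b = 45        # defects team B reported
found_by_both = 30          # defects both teams reported

estimated_total = found_by_team_a * found_by_team_b / found_by_both
found_so_far = found_by_team_a + found_by_team_b - found_by_both
print(f"estimated total defects: {estimated_total:.0f}")                 # ~90
print(f"estimated still hiding:  {estimated_total - found_so_far:.0f}")  # ~15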

optimization is just one of many criteria used in development
of ANY software, and again is a compromise based almost
exclusively on the express purpose of the company - to
generate profits.


Optimization is fading from significance as hardware
capabilities improve, and reliability needs increase.


That is true, but just think of the performance you would see if
developers didn't take this approach!

The beauty of an elegant algorithm is still there, but of all
the tricks and tweaks one used to dance around the lumpy
limitations of hardware, I doubt if any remain relevant a
decade later.

Hand-crafted, bug-free assembler code is like diamonds:
essential in a narrow range of contexts, but far too costly to
use as the sole material for buildings, oil tankers, aircraft,
etc. That's why Win95 kept some 16-bit code for the UI; it
was exactly this quality of highly-optimized, reliable code
that was well worth keeping.
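For a flavour of the kind of trick that has lost its relevance,
here's my own toy example in Python (nothing to do with the actual
Win95 sources): the old habit of writing a multiply as shifts and
adds because integer multiplication used to be slow.

# Old-school strength reduction: x * 10 spelled out as shifts.
def scale_old_school(x):
    return (x << 3) + (x << 1)   # 8x + 2x == 10x

def scale_plainly(x):
    return x * 10

assert all(scale_old_school(n) == scale_plainly(n) for n in range(1_000))
# Today the compiler applies this transformation automatically when
# it even matters, so the obfuscated version only costs readability.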

The challenge is how to accept that code will always be buggy,
and design reliable systems anyway. I think a problem we have
is that devs are in denial on this; they still reckon that if
they "try harder", they can use code as if it were reliable
and trustworthy.

This is like trying to divide by zero by Thinking Really Hard.
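Agreed on designing around the bugs rather than wishing them
away. In practice that tends to mean containment: validate,
retry, degrade gracefully, and never simply trust the component.
A toy sketch of that stance in Python; the failure rate and the
retry/fallback policy are invented for the demo, not a
prescription.

import random

# Assume the worker can and will misbehave, so wrap it in
# containment (validation, retries, a fallback) instead of
# trusting it.  flaky_lookup stands in for any real component.
def flaky_lookup(key):
    if random.random() < 0.2:            # invented 20% fault rate
        raise RuntimeError("transient fault")
    return len(key)                      # placeholder "result"

def reliable_lookup(key, retries=3, fallback=None):
    for _ in range(retries):
        try:
            result = flaky_lookup(key)
            if isinstance(result, int) and result >= 0:   # sanity-check output
                return result
        except RuntimeError:
            pass                         # contain the fault and try again
    return fallback                      # degrade gracefully, don't crash

print(reliable_lookup("vista"))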


--
HP, aka Jerry