#106
Does Vista Actually Want 18 Gigabytes Of Disc Space?
Today, cquirke (MVP Windows shell/user) made these interesting comments ...

> Nobody debugs until bug free because it can be shown mathematically to be impossible, never mind its cost.

Yep, it's a Goedel thing. It's debatable whether there really is a schism between determinism and chaos, or whether it's just a matter of computational power limitations. With enough energy, can you break molecules, atoms, electrons...? With enough computational power, can you break DES, or the mysteries of turbulence, or predict the stock market?

In producing anything, hard or soft, the Law of Diminishing Returns is what determines how many bugs you will squash. You CAN rid the product of more, but it will take longer and cost more, both of which are denied to you as the developer by your manager.

> If you take the "bugs tend towards zero but never really reach zero" perspective, then how far down the slope you need to go depends on how complex the system you're debugging is. A 1:100 error rate may be low enough for 5-line batch files, and a 1:1000000 error rate may be OK for DOS and small apps... but for today's code, processed at over a million instructions a millisecond, you're unlikely to beat Murphy.

Exactly. Suppose, for example, that operating systems and apps were sold Sears-style: good, better, and best. You could then choose by price alone, from the ones with the most bugs and bloat to those with the least. Wish it worked that way, but it doesn't. The degree of bug-freeness and least bloat is an economic compromise, struck to bring the product in on time at the price target selected by management. And, yes, computer-aided testing is a great help, as is statistical analysis to predict the impact of remaining problems and how the product will behave. So, too, is alpha and beta testing, including beta testing by the Visa cards of the early adopters.
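As a back-of-envelope check on those numbers, here is a quick calculation. It is only a sketch using the post's own figures: treating the 1:1000000 error rate as a per-executed-instruction fault probability is a deliberate simplification, and the DES figure assumes a hypothetical search rate of a billion keys per second.

```python
# Back-of-envelope arithmetic for the claims above (illustrative only).

# "1:1000000 error rate ... over a million instructions a millisecond":
# treat the error rate as a per-executed-instruction fault probability
# (a crude simplification) and see what Murphy delivers.
error_rate = 1 / 1_000_000          # faults per instruction executed
instructions_per_sec = 1e9          # ~a million instructions per millisecond
faults_per_sec = error_rate * instructions_per_sec
print(faults_per_sec)               # 1000.0 -- a thousand faults per second

# "With enough computational power, can you break DES?" DES uses a 56-bit
# key; at an assumed 1e9 keys/second, exhaustive search takes years.
des_keys = 2 ** 56
search_seconds = des_keys / 1e9
print(search_seconds / (3600 * 24 * 365))  # roughly 2.3 years, worst case
```

Of course, real defect density is measured per line of code, not per instruction executed, so the first number overstates things; the point is only that at modern throughputs even tiny residual rates get exercised constantly.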
Optimization is just one of many criteria used in the development of ANY software, and again is a compromise based almost exclusively on the express purpose of the company - to generate profits.

> Optimization is fading from significance as hardware capabilities improve, and reliability needs increase.

That is true, but just think of the performance you would see if developers didn't take this approach! The beauty of an elegant algorithm is still there, but of all the tricks and tweaks one used to dance around the lumpy limitations of hardware, I doubt any remain relevant a decade later. Hand-crafted, bug-free Assembler code is like diamonds: essential in a narrow range of contexts, but far too costly to use as the sole material for buildings, oil tankers, aircraft, etc. That's why Win95 kept some 16-bit code for the UI; it was exactly this quality of highly optimized, reliable code that was well worth keeping.

The challenge is how to accept that code will always be buggy, and design reliable systems anyway. I think a problem we have is that devs are in denial on this; they still reckon that if they "try harder", they can use code as if it were reliable and trustworthy. This is like trying to divide by zero by Thinking Really Hard.

-- HP, aka Jerry
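The point about accepting that code will always be buggy and designing reliable systems anyway can be sketched in code: instead of trusting a component, wrap it in failure handling, result validation, and a safe fallback. This is a minimal illustration, not anything from the thread; the component name `flaky_parse` is hypothetical.

```python
# Sketch: designing for reliability around an untrusted component.

def flaky_parse(s):
    # Stand-in for an unreliable component: it happens to work here,
    # but the caller below is written as if it might fail or lie.
    return int(s)

def reliable_parse(s, retries=3, default=None):
    """Treat flaky_parse as untrustworthy: catch failures, validate
    the result, and fall back to a known-safe default."""
    for _ in range(retries):
        try:
            result = flaky_parse(s)
        except (ValueError, TypeError):
            continue                 # failure is expected; try again
        if isinstance(result, int):  # validate, don't trust
            return result
    return default                   # degrade gracefully, don't crash

print(reliable_parse("42"))    # 42
print(reliable_parse("oops"))  # None (fails safely)
```

The design choice is the one the post argues for: the system's reliability comes from the structure around the component, not from hoping the component is bug-free.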
#107
Does Vista Actually Want 18 Gigabytes Of Disc Space?
On Mon, 12 Mar 2007 12:10:34 -0000, "HEMI-Powered" wrote:
> Today, cquirke (MVP Windows shell/user) made these interesting comments ...

>> Nobody debugs until bug free because it can be shown mathematically to be impossible, never mind its cost.

> Yep, it's a Goedel thing. It's debatable whether there really is a schism between determinism and chaos

> In producing anything, hard or soft, the Law of Diminishing Returns is what determines how many bugs you will squash. You CAN rid the product of more, but it will take longer and cost more, both of which are denied to you as the developer by your manager.

Yep. Often the most solid inventors get left out of the loop, because they can't "sign off and ship"; Tesla comes to mind there.

> Exactly. Suppose, for example, that operating systems and apps were sold Sears-style: good, better, and best. You could then choose by price alone the ones that have the most bugs

That relies on "you get what you pay for", which is not a truism, but rather like the waterline on an iceberg. It's best-case; usually you get a lot less than what you pay for. And then there's the "Nike Effect", i.e. meaningless "brand tax"...

If you had the computational power to ascertain the true bug load of an item of s(oft)ware, you'd have written it ;-)

You could, however, pay a premium based on the degree of pain the vendor's contractually obliging themselves to undergo, should the product fall short of expectations.

>> Optimization is fading from significance as hardware capabilities improve, and reliability needs increase.

> That is true, but just think of the performance you would see if developers didn't take this approach!

Indeed... sometimes one might want a skeletal OS that does nothing more than hand over full control to a single encapsulated app; rather like what a thing called "DOS" used to do :-)

--------------- ---- --- -- - - - -
Saws are too hard to use. Be easier to use!
--------------- ---- --- -- - - - -
#108
Does Vista Actually Want 18 Gigabytes Of Disc Space?
Today, cquirke (MVP Windows shell/user) made these interesting comments ...

>> In producing anything, hard or soft, the Law of Diminishing Returns is what determines how many bugs you will squash. You CAN rid the product of more, but it will take longer and cost more, both of which are denied to you as the developer by your manager.

> Yep. Often the most solid inventors get left out of the loop, because they can't "sign off and ship"; Tesla comes to mind there.

Yep, he bolted when his boss wouldn't listen.

>> Exactly. Suppose, for example, that operating systems and apps were sold Sears-style: good, better, and best. You could then choose by price alone the ones that have the most bugs

> That relies on "you get what you pay for", which is not a truism, but rather like the waterline on an iceberg. It's best-case; usually you get a lot less than what you pay for.

Of course, I was being facetious here ...

> And then there's the "Nike Effect", i.e. meaningless "brand tax"... If you had the computational power to ascertain the true bug load of an item of s(oft)ware, you'd have written it ;-)

If I thought I could write an O/S maybe I would, but there are few companies large enough to do that successfully today, and it is impossible for individuals. Linux is a cooperative effort of many programmers using its open architecture, much as Unix was intended to be a generation ago, before it lost its way when the HW manufacturers wanted proprietary versions for marketing reasons.

> You could, however, pay a premium based on the degree of pain the vendor's contractually obliging themselves to undergo, should the product fall short of expectations.

That's a thought! See who can stand the most pain, the developer or the consumer! <grin>

>>> Optimization is fading from significance as hardware capabilities improve, and reliability needs increase.

>> That is true, but just think of the performance you would see if developers didn't take this approach!

> Indeed... sometimes one might want a skeletal OS that does nothing more than hand over full control to a single encapsulated app; rather like what a thing called "DOS" used to do :-)

I did a LOT of useful work on my old Apple //e with just 128KB memory and a suite called AppleWorks. And I did a LOT of useful work in the DOS days. I do a lot of useful work today, and it is much easier to do, thanks to standards for data interchange and standards that all apps and utilities adhere to, which make the learning curve as shallow as possible.

-- HP, aka Jerry