A Windows XP help forum. PCbanter



Microsoft Zero Day security holes being exploited



 
 
  #91  
Old October 5th 06, 12:27 PM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
Dan W.
Default Microsoft Zero Day security holes being exploited

cquirke (MVP Windows shell/user) wrote:
On Sat, 30 Sep 2006 06:55:35 -0600, "Dan W." wrote:
cquirke wrote:
On Fri, 29 Sep 2006 05:50:14 -0600, Dan wrote:


and what I really need besides your advice on domains is a good article
about domains that I can read when I get a chance
That info is out there; in fact, it's the main thrust of most formal
MS tech training etc. It's really powerful but very detailed stuff,
with a fair number of gotchas and complications. For example, what
happens to a system that has domain control over its settings, when it
isn't connected to the domain?


Thanks for the great replies as usual. I hope someone can answer your
question since I do not know.


AFAIK, what happens is that a copy of the domain's settings is kept
locally, and is used whenever the domain is unreachable. I guess
this copy would be updated whenever the domain is reachable.
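The cache-and-fall-back behaviour described above can be sketched abstractly. All the names below are illustrative, not a real Windows or Active Directory API:

```python
# Sketch of the pattern described above: use the domain's settings when
# the controller is reachable (refreshing a local copy), and fall back
# to the cached copy when it is not. Hypothetical names throughout.

class PolicyClient:
    def __init__(self, fetch_from_domain):
        self._fetch = fetch_from_domain   # callable; may raise ConnectionError
        self._cached = {}                 # last settings seen from the domain

    def get_settings(self):
        try:
            self._cached = self._fetch()  # domain reachable: refresh the cache
        except ConnectionError:
            pass                          # domain unreachable: serve stale copy
        return self._cached


def reachable():
    return {"screensaver_lock": True}

def unreachable():
    raise ConnectionError("domain controller not found")

client = PolicyClient(reachable)
print(client.get_settings())      # fresh copy from the "domain"
client._fetch = unreachable
print(client.get_settings())      # same settings, served from the cache
```

The point of the sketch is only the shape of the behaviour: the client keeps working offline with the last settings it saw, which is consistent with the guess above.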

There's also a lot of detail and granularity when different
permissions are combined. Whereas *NIX uses the same structure for
both directory location and permissions, the NT security model does
not - while files within a subtree start with permissions of the
parent (AFAIK), you can change this on a file-by-file basis.

There are easy ways to get really painted into a corner with this
stuff, and one of the common mistakes is to assign rights to
particular users, rather than to a group. It's better to create a
group, set the rights for that group, and then add your user(s) as
members of that group (yes, even if there's only one member). That
way, if you fire Fred and employ Brad, you just drop Fred from the
group and add Brad to it.

Often there will be contexts where different sets of permissions are
simultaneously applied. For example, there are machine permissions,
network permissions, user permissions, etc. so what really happens is
a resultant of these, prompting the question; what trumps what?

In many ways, a sysadmin's job is as much about managing users via
Active Directory as it is about managing network resources such as
domain servers. Most businesses large enough to be using AD and
domains will insist on certification (MCSE etc.) before anyone can
touch this stuff. So when this security model is dropped into
consumerland, it's tough... consumers understand physical security
very well, but have zero intuition on business and staff security.

And why should they?

I was working on a machine for a couple of hours that had been messed
big time. I removed some spyware such as cool web junk and wild tangent
junk. The antivirus scanner did not even work -- it had been messed


Yup. I use Bart for those... the learning curve (OK, small wall) is
tougher than one would like, but if you do a lot of this stuff, it's
effort well spent. I expect malware to assume control over the system
I'm trying to clean, and start "from orbit" with Bart, concentrating
on the heavies, before tip-toeing in via Safe Cmd etc.

Safe Cmd is to XP what DOS mode is to Win9x, but there's a far higher
risk of malware being active in Safe Cmd than there is in DOS mode.

Spybot -- Search and Destroy actually was the only scanner that
removed and detected the junk out of all of them I used but that might
have just been because of the order that I ran the scanners in.


Could be... I use 7 av scanners and the usual 2 anti-"spyware"
scanners, then HiJackThis, then I de-bulk the usual malware hangouts
(loose code in C:\, all TIF, Temp), then I drop tools in place and run
'em when I enter Safe Cmd. The av scans shoot to kill, but the
initial anti-"spyware" and HiJackThis are usually look-don't-touch.

Once in Safe Cmd, I re-run SysClean (as some tests don't run when in
Bart), AdAware and Spybot, and this time I let the anti-"spyware"
scanners kill what they find. Then I add Ewido 4 and run that, do a
HiJackThis again, and look for mismatches that suggest a rootkit.

Next is normal Windows, which means I can install tools that require
the Windows Installer, e.g. BitDefender 8 and MS Defender. I add
BitDefender 8 if there's been a lot of traffic and/or the resident av
can't be updated. If the resident av is broken, expired or missing, I
add AVG 7. Then I harden settings, set a clean baseline restore point,
and purge all older restore points (Disk Cleanup).

Then I check firewall, and go online to update the scanners and
non-scanning tools that need it (e.g. Spyware Blaster, Ewido,
BitDefender). Before going online, I'd have killed off old Java
versions and replaced them with the latest JRE; ditto Firefox, etc.

installed AVG and proceeded to do a complete scan for viruses in the
system. The system froze up once and I had to pull out the power cord
and reinsert to force a reset -- oh by the way this was an XP
Professional machine --- and guess what -- error at the BIOS level.


What sort of error?

Malware isn't the only thing that can bonk PCs; I didn't mention it,
but every Bart session starts with HD Tune to check physical HD, and
before that comes a few hours in MemTest86.



------------ ----- --- -- - - - -

Drugs are usually safe. Inject? (Y/n)
------------ ----- --- -- - - - -


Wow, I will need to check out Bart from your website to read up on it.

--
Dan W.

Computer User
  #92  
Old October 5th 06, 12:28 PM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security,microsoft.public.win98.gen_discussion
Dan W.

jez wrote:
solution::::: dont use explorer.



Okay, what is your alternative and why do you use it?

--
Dan W.

Computer User
  #93  
Old October 5th 06, 12:32 PM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
Dan W.

cquirke (MVP Windows shell/user) wrote:
On Sat, 30 Sep 2006 07:31:27 -0600, "Dan W."
Karl Levinson, mvp wrote:
cquirke wrote in


Everyone needs to know that all computers are somewhat vulnerable if
they are connected to the Internet, no matter what defense procedures
are used to safeguard the system(s) and the network(s).


Until someone runs something on the system that initiates traffic,
there's no reason why they should be, unless there's an exploitable
surface in whatever first receives raw TCP/IP packets.

The trouble is, NT is designed to treat the Internet as a network, in
the sense that if you wave the correct credentials, you'd be able to
log in or otherwise interact with the system from "outside". That
adds additional exploitable surfaces.

I can think of NO circumstances where I'd want any Internet entity
that I had not initiated interaction with, to log onto my PC,
access file shares, or make RPC calls - so why expose those services
at all? There's no "right" credentials to get in because I don't want
*anyone* to get in, so why even process such attempts?

It is indeed a good idea to have user accounts that have fewer
privileges than the admin accounts do.


I'd rather have zero possible access from the Internet, be it as admin
or as limited user. The per-user model just isn't that useful,
especially where there is only one user. Why should I pretend to be a
staff of different job descriptions just to use my own PC?

The really sad thing - sadder even than all those games and accounting
apps that won't run unless you're admin - is that end users have no
control over how new user accounts are born. For me, that absolutely
kills the usefulness of user accounts.

I don't feel at all safe when half the files on the system are hidden
from me, where I can't easily tell if I'm in C:\TEMP, C:\D&S...\Temp
or \\BossPC\Windows\Temp, and where I'm expected to "open" files
without any visible cue as to what they will do.

Yet that is the state I'm forced to live with on any newly-created
user account - frankly, I feel safer as admin and "open eyes".



------------ ----- --- -- - - - -

Drugs are usually safe. Inject? (Y/n)
------------ ----- --- -- - - - -


Exactly. If all users have admin rights anyway, then why not just set
up profiles with full user rights for everyone and install tracking
software to see which user did what? The company could then keep track
of the problematic user and not have to worry about computer issues in
regards to the user accounts - or are the hired employees so
untrustworthy that they will start formatting the drive(s)? You can
always keep sensitive material off the PC, and with the one or two
people that need this data, on an encrypted jump drive with at least
128-bit encryption, upgrading to 256+ bit encryption if the data is
sensitive or military in nature.

--
Dan W.

Computer User
  #94  
Old October 5th 06, 12:38 PM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
Dan W.

cquirke (MVP Windows shell/user) wrote:
On Fri, 29 Sep 2006 23:17:02 -0400, "Karl Levinson, mvp"
"cquirke (MVP Windows shell/user)" wrote in


All operating systems do that. They are designed to launch code at boot
time by reading registry values, text files, etc. Because those registry
values are protected from unauthorized access by permissions, someone
would have to already own your system to modify those values, wouldn't they?


The weakness here is that anything that runs during the user's session
is deemed to have been run with the user's intent, and gets the same
rights as the user. This is an inappropriate assumption when there
are so many by-design opportunities for code to run automatically,
whether the user intended to do so or not.

Sure, but the wrong entities come to own systems all the time.


My point is that this one example here doesn't seem to be a vulnerability if
it requires another vulnerability in order to use it.


Many vulnerabilities fall into that category, often because the extra
requirement was originally seen as sufficient mitigation.
Vulnerabilities don't have to facilitate primary entry to be
significant; they may escalate access after entry, or allow the active
malware state to persist across Windows sessions, etc.

This isn't a case of combining two vulnerabilities to compromise a
system; it's a case of one unnamed vulnerability being used to
compromise a system, and then the attacker performs some other
action, specifically changing registry values.


If this is a vulnerability, then the ability of Administrators to create new
user accounts, change passwords etc. would also be a vulnerability.


OK, now I'm with you, and I agree with you up to a point. I dunno
where the earlier poster got the notion that Winlogon was there to act
as his "ace in the hole" for controlling malware, as was implied.

Defense in depth means planning for how you get your system back; you
don't just faint in shock and horror that you're owned, and destroy
the whole system as the only way to kill the invader.


That's a different issue than the one we were discussing. The statement
was, winlogon using registry values to execute code at boot time is a
vulnerability. I'm arguing that it is not.


I agree with you that it is not - the problem is the difficulty that
the user faces when trying to regain control over malware that is
using Winlogon and similar integration points.

The safety defect is that:
- these integration points are also effective in Safe Mode
- there is no maintenance OS from which they can be managed

We're told we don't need a HD-independent mOS because we have Safe
Mode, ignoring the possibility that Safe Mode's core code may itself
be infected. Playing along with that assertion, we'd expect Safe Mode
to disable any 3rd-party integration, and to provide a UI through
which these integration points can be managed.

But this is not the case - the safety defect is that once software is
permitted to run on the system, the user lacks the tools to regain
control from that software. Couple that with the Windows propensity
to auto-run material either by design or via defects, and you have
what is one of the most common PC management crises around.

Besides, it's a relatively accepted truism that once an attacker has root,
system or administrator privileges on any OS, it is fairly futile to try to
restrict what actions s/he can perform. Anything a good administrator can
do, a bad administrator can undo.


That's a safety flaw right there.

You're prolly thinking from the pro-IT perspective, where users are
literally wage-slaves - the PC is owned by someone else, the time the
user spends on the PC is owned by someone else, and that someone else
expects to override user control over the system.

So we have the notion of "administrators" vs. "users". Then you'd
need a single administrator to be able to manage multiple PCs without
having to actually waddle over to all those keyboards - so you design
in backdoors to facilitate administration via the network.

Which is fine - in the un-free world of mass business computing.

But the home user owns their PCs, and there is no-one else who should
have the right to usurp that control. Creditors and police do not
have the right to break in, search, or seize within the user's home.

So what happens when an OS designed for wage-slavery is dropped into
free homes as-is? Who is the notional "administrator"? Why is the
Internet treated as if it were a closed and professionally-secured
network? There's no "good administratrors" and "bad administrators"
here; just the person at the keyboard who should have full control
over the system, and other nebulous entities on the Internet who
should have zero control over the system.

Whatever some automated process or network visitation has done to a
system, the home user at the keyboard should be able to undo.

Windows XP Home is simply not designed for free users to assert their
rights of ownership, and that's a problem deeper than bits and bytes.



------------------ ----- ---- --- -- - - - -

The rights you save may be your own
------------------ ----- ---- --- -- - - - -


Exactly, Chris, and again well said - and a great reason for the
Windows Classic Edition, which I have started to work on because I can
no longer depend only on Microsoft. I actually plan to try to get a
somewhat decent copy with tri-mode (9x, NT (New Technology) and open
source) solutions to present to Redmond, Washington, when Microsoft
decides -- hmm -- this might be good after all, and realizes its folly
in trying to fully eliminate the awesome 9x source code in favour of
the fundamentally flawed NT source code. The issue is this: if you
examine both source codes at the core level, with all features and
functionality stripped away to as raw a code as possible, then you
will see the inherent weakness of NT - it is like the foolish man who
built his house upon sand in the Bible, while the 9x source code, at
its most raw and basic form, is like the wise man who built his house
upon the rock. Forgive me for bringing religion into it, but this was
the best analogy that I could think up, and I appreciate your
understanding.

--
Dan W.

Computer User
  #95  
Old October 5th 06, 01:32 PM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
Paul Adare

In article , in the
microsoft.public.security news group, Dan W. says...

Wow, I will need to check out Bart from your website to read up on it.


Please do us a favour: if all you're going to do is post a single-line
reply, then don't quote the entire message thread to which you're
responding. Remove all but what is really relevant and necessary.

--
Paul Adare - MVP Virtual Machines
Waiting for a bus is about as thrilling as fishing,
with the similar tantalisation that something,
sometime, somehow, will turn up. George Courtauld

  #96  
Old October 5th 06, 01:39 PM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
Paul Adare

In article , in the
microsoft.public.security news group, Dan W. says...

I actually plan to try and get a somewhat
decent copy with tri-mode (9x, NT (New Technology) and open source)
solutions to present to Redmond, Washington when Microsoft decides --
hmm --- this might be good after all and realizes their folly in trying
to fully eliminate the awesome 9x source code in face of the
fundamentally flawed NT source code to start with. The issue is this
and if you examine both source codes at the core levels with all
features and functionality stripped away to as raw a code as possible
then you will see the inherent weakness of NT


You have access to the source code that allows you to make the above
statement?
Let me know when you get that meeting with Microsoft scheduled, will
you? I'd like to be present when you attempt to explain how Windows 9x
is more secure than is NT. I always appreciate a good laugh.

--
Paul Adare - MVP Virtual Machines
Waiting for a bus is about as thrilling as fishing,
with the similar tantalisation that something,
sometime, somehow, will turn up. George Courtauld

  #97  
Old October 7th 06, 12:23 PM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
cquirke (MVP Windows shell/user)

On Sun, 01 Oct 2006 20:45:23 -0600, "Dan W." wrote:
karl levinson, mvp wrote:
"Dan W." wrote in message


Fewer vulnerabilities are being reported for Windows 98 because Windows 98
is old and less commonly used, and vulns found for it get you less fame


More to the point is that vulnerable surfaces are less-often exposed
to clickless attack - that's really what makes Win9x safer.

You can use an email app that displays only message text, without any
inline content such as graphics etc. so that JPG and WMF exploit
surfaces are less exposed. Couple that with an OS that doesn't wave
RPC, LSASS etc. at the 'net and doesn't grope material underfoot
(indexing) or when folders are viewed ("View As Web Page" and other
metadata handlers) and you're getting somewhere.

For those who cannot subscribe to the "keep getting those patches,
folks!" model, the above makes a lot of sense.

Didn't XP expand on and improve the system restore feature to a level not
currently in 98 or ME?


There's no SR in Win98, tho that was prolly when the first 3rd-party
SR-like utilities started to appear. I remember two of these that
seemed to inform WinME-era SR design.

No-one seemed that interested in adding these utilities, yet when the
same functionality was built into WinME, it was touted as reason to
switch to 'ME, and when this functionality fell over, users were often
advised to "just" re-install to regain it. I doubt if we'd have
advised users to "just" re-install the OS so that some 3rd-party
add-on could work again.

XP's SR certainly is massively improved over WinME - and there's so
little in common between them that it's rare one can offer SR
management or troubleshooting advice that applies to both OSs equally.

I use SR in XP, and kill it at birth in WinME - that's the size of the
difference, though a one-lunger (one big doomed C: installation) may
find the downsides of WinME's SR to be less of an issue.

about Microsoft and its early days to present time. The early Microsoft
software engineers nicknamed it the Not There code since it did not have
the type of maintenance operating system that Chris Quirke, MVP fondly
talks about in regards to 98 Second Edition.


If the MOS being discussed for Win 98 is the system boot disk floppy, that
was a very basic MOS and it still works on Windows XP just as well as it
ever did on Windows 98. [Sure, you either have to format your disk as FAT,
or use a third party DOS NTFS driver.]


That was true, until we crossed the 137G limit (where DOS mode is no
longer safe). It's a major reason why I still avoid NTFS... Bart
works so well as a mOS for malware management that I seldom use DOS
mode for that in XP systems, but data recovery and manual file system
maintenance remain seriously limited for NTFS.

I think Chris really wants not that kind of MOS but a much bigger and
better one that has never existed.


Well, ever onward and all that ;-)

Bart is a bigger and better mOS, though it depends on how you build it
(and yes, the effort of building it is larger than for DOS mode
solutions). You can build a mOS from Bart that breaks various mOS
safety rules (e.g. falls through to boot HD on unattended reset,
automatically writes to HD, uses Explorer as shell and thus opens the
risk of malware exploiting its surfaces, etc.).

I'm hoping MS WinPE 2.0, or the subset of this that is built into the
Vista installation DVD, will match what Bart offers. Initial testing
suggests it has the potential, though some mOS safety rules have been
broken (e.g. fall-through to HD boot, requires visible Vista
installation to work, etc.).

The RAM testing component is nice but breaks so many mOS safety rules
so badly that I consider it unfit for use:
- spontaneous reset will reboot the HD
- HD is examined for Vista installation before you reach the test
- a large amount of UI code required to reach the test
- test drops the RAM tester on HD for next boot (!!)
- test logs results to the HD (!!)
- you have to boot full Vista off HD to see the results (!!!)

What this screams to me, is that MS still doesn't "get" what a mOS is,
or how it should be designed. I can understand this, as MS WinPE was
originally intended purely for setting up brand-new, presumed-good
hardware with a fresh (destructive) OS installation.

By default, the RAM test does only one or a few passes; it takes under
an hour or so - and thus is only going to detect pretty grossly-bad
RAM. Grossly bad RAM is unlikely to run an entire GUI reliably, and
can bit-flip any address to the wrong one, or any "read HD" call to a
"write HD" call. The more code you run, the higher the risk of data
corruption, and NO writes to HD should ever be done while the RAM is
suspected to be bad (which is, after all, why we are testing it).
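The idea that a short test only catches grossly bad RAM can be illustrated with a toy walking-ones tester. This is a simulation over a plain bytearray, not physical memory - a real tester like MemTest86 boots from its own media and addresses RAM directly - and the fault model here is purely hypothetical:

```python
# A toy walking-ones test in the spirit of dedicated RAM testers, run
# here over a bytearray rather than physical RAM. Each pass writes a
# single-bit pattern everywhere, then reads it back; a stuck or flipped
# bit shows up as a mismatch. More passes and patterns catch subtler
# faults, which is why a one-pass "quick test" finds only gross damage.

def walking_ones_test(mem, passes=1):
    errors = []
    for _ in range(passes):
        for bit in range(8):                    # patterns 0x01, 0x02, ... 0x80
            pattern = 1 << bit
            for i in range(len(mem)):
                mem[i] = pattern
            for i in range(len(mem)):           # read back and verify
                if mem[i] != pattern:
                    errors.append((i, pattern, mem[i]))
    return errors

good = bytearray(1024)
print(walking_ones_test(good))                  # [] - no faults

# Simulate a stuck-at-zero bit 0 at one address of a faulty "chip":
class FaultyMem(bytearray):
    def __setitem__(self, i, value):
        super().__setitem__(i, value & 0xFE if i == 42 else value)

bad = FaultyMem(1024)
print(walking_ones_test(bad)[:1])               # [(42, 1, 0)] - bit 0 stuck
```

Note that the tester only ever writes to the memory under test - the analogue of the rule above that nothing should be written to the HD while the RAM is suspect.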

A mOS boot should never automatically chain to HD boot after a time
out, because the reason you'd be using a mOS in the first place is
because you daren't boot the HD. So when the mOS disk boots, the only
safe thing to do is quickly reach a menu via a minimum of code, and
stop there, with no-time-out fall-through.

It's tempting to fall-through to the RAM test as the only safe option,
but that can undermine unattended RAM testing - if the system
spontaneously resets during such testing, you need to know that, and
it's not obvious if the reboot restarts the RAM test again.

Until RAM, physical HD and logical file system are known to be safe,
and it's known that deleted material does not need to be recovered, it
is not safe to write to any HD. That means no page file, no swap, and
no "drop and reboot" methods of restarting particular tests.

Until the HD's contents are known to be malware-free, it is unsafe to
run any code off the HD. This goes beyond not booting the HD, or
looking for drivers on the HD; it also means not automatically groping
material there (e.g. when listing files in a folder) as doing so opens
up internal surfaces of the mOS to exploitation risks.


Karl's right, tho... I'm already thinking beyond regaining what we
lost when hardware (>137G, USB, etc.) and NTFS broke the ability to
use DOS mode as a mOS, to what a purpose-built mOS could offer.

For example, it could contain a generic file and redirected-registry
scanning engine into which av vendor's scanning modules could be
plugged. It could offer a single UI to manage these (i.e. "scan all
files", "don't automatically clean" etc.) and could collate the
results into a single log. It could improve efficiency by applying
each engine in turn to material that is read once, rather than the
norm of having each av scanner pull up the material to scan.
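The read-once, multi-engine design described above can be sketched as follows. The "engines" here are trivial signature matchers and every name is hypothetical - this is only the shape of the idea, not any vendor's API:

```python
# Minimal sketch of the proposed mOS scanning design: the host reads
# each file a single time and hands the bytes to every plugged-in
# detection engine, collating all results into a single log.

def make_signature_engine(name, signature):
    """Build a trivial engine: report `name` if `signature` is found."""
    def scan(data):
        return name if signature in data else None
    return scan

engines = [
    make_signature_engine("EngineA", b"EICAR"),
    make_signature_engine("EngineB", b"badcode"),
]

def scan_files(files):
    """files: dict of path -> bytes. Returns one collated log."""
    log = []
    for path, data in files.items():      # each file is read once...
        for engine in engines:            # ...then passed to every engine
            hit = engine(data)
            if hit:
                log.append((path, hit))
    return log

files = {
    "C:/Temp/dropper.exe": b"xxEICARxx",
    "C:/notes.txt":        b"hello",
}
print(scan_files(files))    # [('C:/Temp/dropper.exe', 'EngineA')]
```

Because the host owns the read loop and the log, each vendor need only supply the `scan` callable - which is the low-cost plug-in opportunity the paragraph above describes.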

MS could be accused of foreclosing opportunities to av vendors
(blocking kernel access, competing One Care and Defender products),
but this sort of mOS design could open up new opportunities.

Normally, the av market is "dead man's shoes"; a system can have only
one resident scanner, so the race is on to be that scanner (e.g. OEM
bundling deals that reduce per-license revenue). Once users have an
av, it becomes very difficult to get them to switch - they can't try
out an alternate av without uninstalling what they have, and no-one
wants to do that. It's only when feeware av "dies" at the end of a
subscription period, that the user will consider a switch.

But a multi-av mOS allows av vendors to have their engines compared,
at a fairly low development cost. They don't have to create any UI at
all, because the mOS does that; all they have to do is provide a pure
detection and cleaning engine, which is their core competency anyway.

Chances are, some av vendors would prefer to avoid that challenge :-)

XP also comes with a number of restore features such as Recovery
Console and the Install CD Repair features.


They are good few-trick ponies, but they do not constitute a mOS.
They can't run arbitrary apps, so they aren't an OS, and if they
aren't an OS, then by definition they aren't a mOS either.

As it is, RC is crippled as a "recovery" environment, because it can't
access anything other than C: and can't write to anywhere else. Even
before you realise you'd have to copy files off one at a time (no
wildcards, no subtree copy), this kills any data recovery prospects.

At best, RC and OS installation options can be considered "vendor
support obligation" tools, i.e. they assist MS in getting MS's
products working again. Your data is completely irrelevant.

It gets worse; MS accepts crippled OEM OS licensing as being "Genuine"
(i.e. MS got paid) even if they provide NONE of that functionality.

The driver's not even in the car, let alone asleep at the wheel :-(

I never use those or find them very useful for security, but they're
way more functional and closer to an MOS than the Win98 recovery
floppy or anything Win98 ever had. 98 never had a registry
editor or a way to modify services like the XP Recovery Console.


They do different things.

RC and installation options can regain bootability and OS
functionality, and if you have enabled Set commands before the crisis
you are trying to manage, you can copy off files one at a time. They
are limited to that, as no additional programs can be run.

In contrast, a Win98EBD is an OS, and can run other programs from
diskette, RAM disk or CDR. Such programs include Regedit
(non-interactive, i.e. import/export .REG only), Scandisk (interactive
file system repair, which NTFS still lacks), Odi's LFN tools (copy off
files in bulk, preserving LFNs), Disk Edit (manually repair or
re-create file system structure) and run a number of av.

So while XP's tools are bound to getting XP running again, Win98EBD
functionality encompasses data recovery, malware cleanup, and hardware
diagnostics. It's a no-brainer as to which I'd want (both!)

that at the bare bones level the source code of 9x is more secure


It depends on what you consider security.


That's the point I keep trying to make - what Dan refers to is what
I'd call "safety", whereas what Karl's referring to is what I'd call
"security". Security rests on safety, because the benefit of
restricting access to the right users is undermined if what happens is
not limited to what these users intended to happen.

Win98 was always crashing and unstable,


Er... no, not really. That hasn't been my mileage with any Win9x,
compared to Win3.yuk - and as usual, YMMV based on what your hardware
standards are, and how you set up the system. I do find XP more
stable, as I'd expect, given NT's greater protection for hardware.

because there was no protection of memory space from bad apps or
bad attackers.


Mmmh... AFAIK, that sort of protection has been there since Win3.1 at
least (specifically, the "386 Enhanced" mode of Win3.x). Even DOS
used different memory segments for code and data, though it didn't use
386 design to police this separation.

IOW, the promise that "an app can crash, and all that happens is that
app is terminated, the rest of the OS keeps running!" has been made
for every version of Windows since Win3.x - it's just that the reality
always falls short of the promise. It still does, though it gets a
little closer every time.

If anything, there seems to be a back-track on the concept of data vs.
code separation, and this may be a consequence of the
Object-Oriented model. Before, you'd load some monolithic program
into its code segment, which would then load data into a separate data
segment. Now you have multiple objects, each of which can contain
their own variables (properties) and code (methods).

We're running after the horse by band-aiding CPU-based No-Execute
trapping, so that when (not if) our current software design allows
"data" to spew over into code space, we can catch it.

Microsoft's security problems have largely been because of backwards
compatibility with Windows 9x, DOS and Windows NT 4.0. They feel, and I
agree, that Microsoft security would be a lot better if they could abandon
that backwards compatibility with very old niche software, as they have been
doing gradually.


The real millstone was Win3.yuk (think heaps, co-operative
multitasking). Ironically, DOS apps multitask better than Win16 ones,
as each DOS app lives in its own VM and is pre-emptively multi-tasked.

64-bit is the opportunity to make new rules, as Vista is doing (e.g.
no intrusions into kernel allowed). I'm hoping that this will be as
beneficial as hardware virtualization was for NT.

Win9x apps don't cast as much of a shadow, as after all, Win9x's
native application code was to be the same as NT's. What is a
challenge is getting vendors to conform to reduced user rights, as up
until XP, they could simply ignore this.

There's also the burden of legacy integration points, from
Autoexec.bat through Win.ini through the various fads and fashions of
Win9x and NT and beyond. There's something seriously wrong if MS is
unable to enumerate every single integration point, and provide a
super-MSConfig to manage them all from a single UI.

Classic Edition could be completely compatible with the older software
such as Windows 3.1 programs and DOS programs. Heck, Microsoft
could do this in a heartbeat without too much trouble.


Think about that. Who sits in exactly the same job for 12 years?

All the coders who actually made Win95, aren't front-line coders at MS
anymore. They've either left, or they've climbed the ladder into
other types of job, such as division managers, software architects
etc. To the folks who are currently front-line coders, making Vista
etc., Win9x is as alien as (say) Linux or OS/2.

To build a new Win9x, MS would have to re-train a number of new
coders, which would take ages, and then they'd have to keep this
skills pool alive as long as the new Win9x were in use. I don't see
them wanting to do that, especially as they had such a battle to
sunset Win9x and move everyone over to NT (XP) in the first place.

Also, think about what you want from Win9x - you may find that what
you really want is a set of attributes that are not inherently unique
to Win9x at all, and which may be present in (say) embedded XP.


If you really do need the ability to run DOS and Win3.yuk apps, then
you'd be better served by an emulator for these OSs.

This not only protects the rest of the system from the oddball
activities of these platforms, but can also virtualize incompatible
hardware and mimic the expected slower clock speeds more smoothly than
direct execution could offer. This is important, as unexpected speed
and disparity between instruction timings are as much a reason for old
software to fail on new systems as changes within Windows itself.

I will do what it takes to see this come to reality.


Stick around on this, even if there's no further Win9x as such. As we
can see from MS's first mOS since Win98 and WinME EBDs, there's more
to doing this than the ability to write working code - there has to be
an understanding of what the code should do in the "real world".


-- Risk Management is the clue that asks:

"Why do I keep open buckets of petrol next to all the
ashtrays in the lounge, when I don't even have a car?"
----------------------- ------ ---- --- -- - - - -

  #98  
Old October 11th 06, 07:41 AM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
imhotep
external usenet poster
 
Posts: 155
Default Microsoft Zero Day security holes being exploited

Paul Adare wrote:

In article , in the
microsoft.public.security news group, Dan W. says...

I actually plan to try and get a somewhat
decent copy with tri-mode (9x, NT (New Technology) and open source)
solutions to present to Redmond, Washington when Microsoft decides --
hmm --- this might be good after all and realizes their folly in trying
to fully eliminate the awesome 9x source code in face of the
fundamentally flawed NT source code to start with. The issue is this:
if you examine both source codes at the core level, with all
features and functionality stripped away to as raw a code as possible,
then you will see the inherent weakness of NT.


You have access to the source code that allows you to make the above
statement?
Let me know when you get that meeting with Microsoft scheduled will
you? I'd like to be present when you attempt to explain how Windows 9x
is more secure than NT. I always appreciate a good laugh.


No reason to be sarcastic....

Im
  #99  
Old October 11th 06, 10:27 PM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
Stephen Howe
external usenet poster
 
Posts: 8
Default Microsoft Zero Day security holes being exploited


Exactly, Chris and again well said and a great reason for the Windows
Classic Edition which I have started to work on because I can no longer
depend only on Microsoft. I actually plan to try and get a somewhat
decent copy with tri-mode (9x, NT (New Technology) and open source)
solutions to present to Redmond, Washington when Microsoft decides --
hmm --- this might be good after all and realizes their folly in trying
to fully eliminate the awesome 9x source code in face of the
fundamentally flawed NT source code to start with.


Oh give over.
You reveal by this that you are not a programmer.

Windows 9x is based on Windows 3.11.
It is not fully 32-bit, it is a hybrid OS with parts 16-bit, parts 32-bit.
You can see that by examining the size of the 3 main Windows files:
32-bit versions:
GDI32.DLL
USER32.DLL
KERNEL32.DLL
versus the 16-bit versions
GDI.EXE
USER.EXE
KRNL386.EXE
The direction of the thunking can be seen by comparing like-with-like, and
you can see at once that Microsoft's conversion to 32 bits was incomplete (I
think IBM's success selling OS/2 2.0 forced Microsoft's hand).
Memory was provided by a DPMI host which in turn was supplied by HIMEM.SYS.
Processes were not properly insulated from each other.

In contrast, Windows NT is fully 32-bit, written from the ground up with
Dave Cutler in control.
He designed the VMS OS for DEC. Yes, there might be bugs in NT 3.1, NT 3.5,
NT 4.0 and 2000, but the underlying architecture is sound, certainly much
better than the Windows 9x line.

See
http://en.wikipedia.org/wiki/Microso...rating_systems

Stephen Howe


  #100  
Old October 12th 06, 06:29 AM posted to microsoft.public.security,microsoft.public.windowsxp.security_admin,microsoft.public.security.homeusers,microsoft.public.internetexplorer.security
imhotep
external usenet poster
 
Posts: 155
Default Microsoft Zero Day security holes being exploited

"Stephen Howe" stephenPOINThoweATtns-globalPOINTcom wrote:


Exactly, Chris and again well said and a great reason for the Windows
Classic Edition which I have started to work on because I can no longer
depend only on Microsoft. I actually plan to try and get a somewhat
decent copy with tri-mode (9x, NT (New Technology) and open source)
solutions to present to Redmond, Washington when Microsoft decides --
hmm --- this might be good after all and realizes their folly in trying
to fully eliminate the awesome 9x source code in face of the
fundamentally flawed NT source code to start with.


Oh give over.
You reveal by this that you are not a programmer.

Windows 9x is based on Windows 3.11.
It is not fully 32-bit, it is a hybrid OS with parts 16-bit, parts 32-bit.
You can see that by examining the size of the 3 main Windows files:
32-bit versions:
GDI32.DLL
USER32.DLL
KERNEL32.DLL
versus the 16-bit versions
GDI.EXE
USER.EXE
KRNL386.EXE
The direction of the thunking can be seen by comparing like-with-like, and
you can see at once that Microsoft's conversion to 32 bits was incomplete (I
think IBM's success selling OS/2 2.0 forced Microsoft's hand).
Memory was provided by a DPMI host which in turn was supplied by
HIMEM.SYS. Processes were not properly insulated from each other.



Yes, Microsoft at the time got caught in a lie, as they were saying that
Windows 98 was 32-bit. Microsoft lying? Oh my, what is the damn world
coming to!!!


In contrast, Windows NT is fully 32-bit, written from the ground up with
Dave Cutler in control.
He designed the VMS OS for DEC. Yes, there might be bugs in NT 3.1, NT 3.5,
NT 4.0 and 2000, but the underlying architecture is sound, certainly much
better than the Windows 9x line.


Well, Dave did, but VMS was always a better OS... still is.

See

http://en.wikipedia.org/wiki/Microso...rating_systems

Stephen Howe


  #101  
Old October 12th 06, 06:36 AM posted to microsoft.public.security,microsoft.public.security.homeusers,microsoft.public.windowsxp.security_admin
imhotep
external usenet poster
 
Posts: 155
Default Microsoft Zero Day security holes being exploited

cquirke (MVP Windows shell/user) wrote:

On Sun, 24 Sep 2006 16:06:41 -0400, imhotep
Karl Levinson, mvp wrote:
"imhotep" wrote in message


It really makes my blood boil knowing that they patched the DRM security
hole in a couple of days, yet I am sure by the time this patch comes
out a crap load of people will get infected...

I assure you, a crap load of people will NOT be infected by this or any
other IE vuln in the future. IE vulns just don't do that.


cough

See http://cquirke.mvps.org/9x/mimehole.htm, Google( BadTrans B )

That's a very old bug, long fixed unless you "just" re-installed any
OS predating XP, as every such OS uses an IE that is both exploitable
and too "old" to be patched other than by upgrading the whole of IE.

If you read up on that bug, you'd see how the nature of exploits and code
bugs has changed.

The MIME hole was a design safety failure, not a code defect - IOW, it
"worked as designed" but the design was brain-dead.

There are still design failures in Windows, and until these are proven
to be exploitable, they won't be patched, because "it's working the way
we expected it to". Most exploits that are being patched today are
genuine code defects, and may be harder to exploit.

Then again, the modern malware industry is optimised to overcome any
"an attacker would have to..." mitigations. Once an exploit shape is
found, the source code becomes rapidly available, and malware coders
then drop it straight into attack vehicles that are ready to roll;
either full-fledged multi-function bots, or simple stubs that can pull
down the "real" malware. If these malware haven't been released
before, av won't "know" them at the signature level.


Malware can always out-turn patching. The attacks are smaller than
the patches and can drown out the patching process by sheer volume,
even before you consider DDoSing the fewer number of patching sources
or poisoning the patch pool via fake patching sources.

The other reason malware will always win the race is that the required
software standards are far lower. A malware has to work on some PCs,
and it doesn't matter if it trashes others. But a patch has to work
on all systems, and not cause new problems on any of them.

If you insist on butting heads with malware on a level playing field,
you will always lose. Better to tilt the playing field so that the
user at the keyboard has the ability to trump all code and remote
access - but MS's fixation on centrally-managed IT and DRM undermines
this and rots the top of the Trust Stack.

See http://cquirke.blogspot.com/2006/08/trust-stack.html

Then how do you explain the record-breaking time to patch Microsoft's DRM
hole? Three days to patch? Please explain (no propaganda necessary).


Well, it could be that the nature of the hole was trivial to fix -
e.g. simply changing some static "secret" info that was harvested
and used by the attackers. I suspect this is the case, given how
quickly the fix has been circumvented by the attackers.

We have a very small sample from which to draw conclusions. Sure, we
have a lot of defects that allow user interests to be attacked, and we
have a smaller number where users were left hanging out to dry while
patching caught up with ITW exploits. But we have a sample of 1
prompt DRM fix, and it may just happen to have been an easy one to
fix; maybe the next one (or even the continuation of this one) will
take a far longer time to fix. If so, don't expect to read about it!

So... are you saying that all fixes should be held back the same
amount of time, even if they are ready earlier, so that MS can be seen
to act more promptly on the issues we'd most like to see fixed first?


BTW, the post I'm replying to has a bug of its own; the sole
newsgroup set for replies is not found to exist on my news server.
Maybe it exists on other news servers, who knows? Here, it's 100%
broken and buggy. Should I wave this around as "proof" that the
poster I'm replying to is trying to hide refutations to his post?


No. As the OP, I set the Followup-To to
microsoft.public.internetexplorer.security, which is a valid newsgroup.
Although, I should have set it to this newsgroup,
microsoft.public.security...

Im




------------ ----- --- -- - - - -

Drugs are usually safe. Inject? (Y/n)
------------ ----- --- -- - - - -




 



