A Windows XP help forum. PCbanter



Defrag



 
 
  #31  
Old August 21st 17, 06:16 PM posted to alt.comp.os.windows-10
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default Defrag

Stef wrote:

But defragging is an inconvenience at the very least. But with the
transition to SSDs I guess fragmentation no longer matters and
defragging is no longer necessary.


I thought this was the case too, but there's an explanation
that says an SSD may be defragmented for performance reasons
unrelated to "zero seek time". Someone discovered the
SSD can be slow during copy-on-write, like during VSS snapshots.
(That's where the filesystem is frozen so you can make a
backup, and new writes to the filesystem are handled by
VSS so they don't affect the snapshot.)
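A toy sketch of that copy-on-write behaviour (purely illustrative; all the names here are invented, and the real volsnap driver does this at the volume-block level inside the kernel): the first write to a block after a snapshot copies the old contents aside, which is the extra work that can make a fragmented volume slow here.

```python
# Toy model of a VSS-style copy-on-write snapshot (illustrative only;
# names and structure are made up, not the real volsnap internals).

class Volume:
    def __init__(self, blocks):
        self.blocks = list(blocks)   # live data
        self.snapshots = []

    def snapshot(self):
        snap = {}                    # diff area: block index -> frozen contents
        self.snapshots.append(snap)
        return len(self.snapshots) - 1

    def write(self, index, data):
        # Copy-on-write: before overwriting, save the old contents into
        # every snapshot's diff area that hasn't captured this block yet.
        for snap in self.snapshots:
            if index not in snap:
                snap[index] = self.blocks[index]   # the extra, "slow" copy
        self.blocks[index] = data

    def read_snapshot(self, snap_id, index):
        snap = self.snapshots[snap_id]
        # Frozen view: the diff area wins, otherwise the block is unchanged.
        return snap.get(index, self.blocks[index])

vol = Volume(["a", "b", "c"])
sid = vol.snapshot()
vol.write(1, "B")                    # triggers the copy-on-write
print(vol.blocks[1])                 # live view sees "B"
print(vol.read_snapshot(sid, 1))     # snapshot still sees "b"
```

The point of the sketch is the extra copy in write(): every post-snapshot write can pay that penalty, so heavy fragmentation multiplies the bookkeeping while a snapshot exists.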

https://www.hanselman.com/blog/TheRe...YourSSD.aspx

"Windows does sometimes defragment SSDs, yes"

"necessary due to slow volsnap copy-on-write"

"If an SSD gets too fragmented you can hit maximum
file fragmentation (when the metadata can’t represent any more
file fragments) which will result in errors when you try to
write/extend a file."

I wasn't aware the $MFT extension mechanism has limits.
But that article implies it does. When a single $MFT entry cannot
hold any more LBA entries for the fragments in a file,
a new entry in the $MFT (with the *same* filename) is
created, and more LBA references are written in there. I've
found a file on C: which was extended 30 times. And I
don't know what controls that, or whether a record somewhere
else references all the extensions (all 30 or more of them).
Searching the entire $MFT linearly, looking for filenames
doesn't make sense, so maybe they have another way of
referencing the extensions made.
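That overflow mechanism can be modelled roughly as follows (a toy sketch with made-up capacities and names, not the real on-disk format; in actual NTFS an $ATTRIBUTE_LIST attribute in the base record points at the extension records, which is presumably how a linear $MFT scan is avoided):

```python
# Toy model of an MFT-style record that can only hold a few extent
# (LBA run) entries, overflowing into extension records. The capacity
# and the class/field names are invented for illustration.

RECORD_CAPACITY = 4          # pretend each record holds 4 extent entries

class FileRecord:
    def __init__(self, name):
        self.name = name
        self.base_extents = []       # extents stored in the base record
        self.extension_records = []  # list of extent lists (extensions)

    def add_extent(self, extent):
        if len(self.base_extents) < RECORD_CAPACITY:
            self.base_extents.append(extent)
            return
        # Base record is full: spill into the last extension record,
        # allocating a new one (same file name) when that fills up too.
        if (not self.extension_records
                or len(self.extension_records[-1]) == RECORD_CAPACITY):
            self.extension_records.append([])
        self.extension_records[-1].append(extent)

    def all_extents(self):
        # The base record knows about its extensions directly, so the
        # full extent list is recovered without scanning anything else.
        result = list(self.base_extents)
        for rec in self.extension_records:
            result.extend(rec)
        return result

# A file fragmented into 19 extents needs the base record plus 4 extensions.
f = FileRecord("pagefile.sys")
for i in range(19):
    f.add_extent((i * 100, 8))       # (starting LBA, run length) pairs
print(len(f.extension_records))      # 4 extension records
print(len(f.all_extents()))          # 19
```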

Paul
Ads
  #32  
Old August 21st 17, 09:10 PM posted to alt.comp.os.windows-10,alt.computer.workshop
David B.[_5_]
Default Defrag

On 21/08/2017 18:13, Char Jackson claims ....

Fragmentation is not an inconvenience for me. The last time I manually
defragged one of my own systems was at least 15 years ago. I don't even
own any of those systems anymore. None of my current PCs have ever been
manually defragged. If I check the fragmentation percentage, it's always
around 2-3%, which I don't think anyone would consider to be a problem.

Two things contribute to that, I think. Windows does a rudimentary
defrag on a scheduled basis, and I don't use anything smaller than a 2TB
drive (spinner) or a 512GB drive (SSD). With plenty of open drive space,
there's very little opportunity for files to be fragmented.


I'm guessing that you are *NOT* an 'everyday' computer user!

Is that right?

--
“Men occasionally stumble over the truth, but most of them pick
themselves up and hurry off as if nothing ever happened.” (Winston S.
Churchill)
  #33  
Old August 22nd 17, 06:29 PM posted to alt.comp.os.windows-10
Stef
Default Defrag

On 21/8/2017 10:16, Paul wrote:

Stef wrote:

But defragging is an inconvenience at the very least. But with the
transition to SSDs I guess fragmentation no longer matters and
defragging is no longer necessary.


I thought this was the case too, but there's an explanation
that says an SSD may be defragmented for performance reasons
unrelated to "zero seek time". Someone discovered the
SSD can be slow during copy-on-write, like during VSS snapshots.
(That's where the filesystem is frozen so you can make a
backup, and new writes to the filesystem are handled by
VSS so they don't affect the snapshot.)

https://www.hanselman.com/blog/TheRe...YourSSD.aspx

"Windows does sometimes defragment SSDs, yes"

"necessary due to slow volsnap copy-on-write"

"If an SSD gets too fragmented you can hit maximum
file fragmentation (when the metadata can't represent any more
file fragments) which will result in errors when you try to
write/extend a file."

I wasn't aware the $MFT extension mechanism has limits.
But that article implies it does. When a single $MFT entry cannot
hold any more LBA entries for the fragments in a file,
a new entry in the $MFT (with the *same* filename) is
created, and more LBA references are written in there. I've
found a file on C: which was extended 30 times. And I
don't know what controls that, or whether a record somewhere
else references all the extensions (all 30 or more of them).
Searching the entire $MFT linearly, looking for filenames
doesn't make sense, so maybe they have another way of
referencing the extensions made.


All the more reason for Windows to get a file system that doesn't
require defragging.

Thanks for that link. I'll read the article later.

Stef

  #34  
Old August 22nd 17, 08:01 PM posted to alt.comp.os.windows-10
Stef
Default Defrag

On 21/8/2017 10:13, Char Jackson wrote:

On Mon, 21 Aug 2017 15:25:44 +0000 (UTC), Stef
wrote:

But defragging is an inconvenience at the very least.


Fragmentation is not an inconvenience for me. The last time I manually
defragged one of my own systems was at least 15 years ago. I don't even
own any of those systems anymore. None of my current PCs have ever been
manually defragged. If I check the fragmentation percentage, it's always
around 2-3%, which I don't think anyone would consider to be a problem.


Then you're one of the lucky ones.

And whether done manually or automatically, defragging still uses system
resources, and if you happen to be using the system at the same time
it will probably be sluggish. And if you try to access the volume that is
being defragged, you may not be able to, or it will be slow if you
can, and there is the chance of a fault where you lose files or data, or
the partition gets corrupted. That's why if a defrag is in progress, I
leave the system alone until it's done. Just to be safe.

FWIW: I have a very leaned-out XP SP3 install running in VirtualBox on a
Linux host. It is used on a regular basis and is only connected to the
Internet once or twice a year for maintenance. The 20GB virtual HD
(11GB free) has never been defragged, manually or otherwise. Volume
fragmentation as of Monday the 21st is 5%; file fragmentation, 10%. But
the type of work I do on it doesn't normally fragment files.

Two things contribute to that, I think. Windows does a rudimentary
defrag on a scheduled basis, and I don't use anything smaller than a 2TB
drive (spinner) or a 512GB drive (SSD). With plenty of open drive space,
there's very little opportunity for files to be fragmented.


Yes, ample free space on the hard drive is an advantage. It also makes
your volume fragmentation percentage lower.
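That effect is easy to demonstrate with a small simulation (hypothetical sizes and a naive first-fit allocator, purely illustrative): churn a volume full of small files, then count how many extents a newly written file needs at high versus low occupancy.

```python
import random

# Toy demonstration that a fuller disk fragments new files more.
# Model: a disk of fixed-size blocks, initially packed with 4-block
# files; a random subset is deleted to reach a target occupancy. A new
# 40-block file is then allocated first-fit, and we count the extents
# (contiguous free runs) it lands in. All numbers are made up.

DISK_BLOCKS = 10_000
FILE_BLOCKS = 4
NEW_FILE_BLOCKS = 40

def extents_for_new_file(occupancy, seed=0):
    rng = random.Random(seed)
    free = [False] * DISK_BLOCKS
    for start in range(0, DISK_BLOCKS, FILE_BLOCKS):
        if rng.random() > occupancy:        # delete this file -> free its blocks
            for b in range(start, start + FILE_BLOCKS):
                free[b] = True
    # First-fit: walk the free runs, taking blocks until the file fits.
    extents, needed, i = 0, NEW_FILE_BLOCKS, 0
    while needed > 0 and i < DISK_BLOCKS:
        if not free[i]:
            i += 1
            continue
        extents += 1                        # one contiguous run = one extent
        while i < DISK_BLOCKS and free[i] and needed > 0:
            needed -= 1
            i += 1
    return extents

nearly_full = extents_for_new_file(0.90)    # 90% of blocks in use
mostly_empty = extents_for_new_file(0.30)   # 30% of blocks in use
print(nearly_full, mostly_empty)
```

In this model the nearly full disk scatters the same file across noticeably more extents than the mostly empty one, because the free runs left after deletions are short and scattered.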

But with the
transition to SSDs I guess fragmentation no longer matters and
defragging no longer necessary.


Manual defragging hasn't been necessary for well over a decade, but the
background housekeeping that Windows does may apply to SSDs, after all.
I think Paul recently quoted from an article that talked about that.


But on Windows defragging is still necessary, whether it's automatic or
manual. And the whole point of this thread is that NTFS is OLD (25
years!) technology that is past due for replacement. MS did release
ReFS to address NTFS' shortcomings, but ReFS still needs defragging,
just not as frequently -- as far as I can determine.

But there are still millions of
spinning drives out there, and will be for a long, long time. NTFS
lacks the improvements of today's modern file systems, particularly
speed, reliability, and file integrity. Its basic design just does not
permit those features to be added to it. Hence ReFS as a stop-gap
"fix." It's time NTFS was replaced.

Just consider the time wasted by HAVING to defrag to keep HD performance
optimum and the real possibility of files being lost or damaged.
Wouldn't it be better not to have to?


I don't think we've had to for about 15 years.


You mean manually? Just because you yourself aren't defragging doesn't
mean it isn't being done, and necessarily so. So your statement is
misleading. And if your system is turned off -- most people don't leave
their machines running 24/7 -- when the defrag is scheduled, it
doesn't get done.

Until you've used a file system that doesn't need defragging,
etc., you'll never realize the convenience.


I feel like I'm at least 15 years ahead of you. ;-)


How so? You're using a file system that's 25 years old and it shows.
I guess if all you've ever driven is a '52 Buick, you don't know how
much better a contemporary one is. ;-)

I have what you want, I've had it for a very long time, and I wonder why
you don't also have it.


You're deluded. Amazingly so, I'm sorry to say.

Stef

  #35  
Old August 22nd 17, 08:19 PM posted to alt.comp.os.windows-10
Rene Lamontagne
Default Defrag

On 8/20/2017 1:45 PM, Stef wrote:
On 18/8/2017 09:20, Paul wrote:

Stef wrote:
On 18/8/2017 01:25, Davidm wrote:

On Fri, 18 Aug 2017 07:13:26 -0000 (UTC), wg_2002
wrote:

On Fri, 18 Aug 2017 07:38:55 +0100, Martin Edwards wrote:

My defragmenter played up yesterday, but it worked this morning. I have
the icons for Disc Cleanup and Defragmenter on my desktop as I started
with W7. I now have W10 which does not let you get at the programs. How
will I get them if I get a new computer?
Open file explorer and right click on your c: drive then click on
properties at the bottom. Under the tools tab click optimize and that
will bring up the defragmenter/trim gui.
Doesn't W10 defrag in the background anyway, no need to run it
manually?

That's the default, but if your computer is turned off when defrag is
scheduled to run . . .

Isn't it about time MS released a filesystem that doesn't need
defragging? Other OSes have them. NTFS is a quarter of a century old for
god's sake! Even with all those patches installed over the years, its
foundation is still ancient technology. Time to replace it with
something better. Or enable Windows so it can use other filesystems.

Stef


I think the current design of Windows 10 handles
the situation with intelligence. Regularly scheduled
defrags keep the "percentage" number low.

They have also added something (special handling) for certain
problem files. And this seems to have been added on the later
versions of Win10.

But I have ways of creating a mess. My "100% fragmented"
volume, a small volume by the way, was the result of doing
a chromium build. And just for fun, I had the defrag.exe take
a crack at that. Well, it needed a little help, because
I wasn't meeting the minimum white space requirement. There
was no place to move the files.

In average usage, stuff like this doesn't happen. If I'd done
the software build on a regular hard drive, with lots of slack, there
would have been no issue at all. But I did the build on a RAMdisk
which was barely big enough for the job. The fragmentation makes
no difference to a RAMDisk, but I thought it would be fun to
see whether the defragmenter utility could handle a pretty severe mess.
Once I'd helped it out, it worked fine. Before that, it couldn't get
past the first bit; its heuristic wasn't good enough.

Microsoft has another file system. This is newer than NTFS.

https://en.wikipedia.org/wiki/Refs

"In addition, Windows cannot be booted from a ReFS volume."

So apparently, that's not what it is for. And I cannot tell from
that article, whether fragmentation remains an issue or not.


Ah! Yes. The Ol' Resilient File System. Read about it. Seems designed
more for servers with RAID, multiple hard drives and partitions than
single-drive/single-partition consumer machines. Rumored to be a
future replacement for NTFS, but that was in 2012 when it was
released with MS Server 2012, and as you said, you can't boot the OS from
it. So you still need NTFS. So, NOT a replacement, but an addendum.

And as far as defragging: in all the reading I've done, nowhere is it
explicitly stated that it doesn't need it. So I'll surmise it does, just
like NTFS. (Except on SSDs, of course.) Because if it didn't, Microsoft
would be shouting that fact from the highest rooftops all over the
world.

So, don't expect a full NTFS, no-defragging-needed, replacement file
system any time soon. When or if it happens, it will be a day late and a
dollar short. Such is the way of Microsoft. For now, you'll just have to
be satisfied humming along in that utilitarian '52 Buick with the holes
in the sides and Hydra-Matic transmission. ;-)


Stef

Hi Stef, just to further your automotive knowledge: the 1952 Buick ran a
DynaFlow transmission. It was so slow on acceleration that we called it
a DipaFlow. The side holes were originally engine compartment vents but
were later plugged.
From an old mechanic of the '50s and '60s, now a retired Power
Engineer. :-)

Regards, Rene

  #36  
Old August 22nd 17, 09:14 PM posted to alt.comp.os.windows-10
Paul[_32_]
Default Defrag

Stef wrote:

there is the chance of a fault and you lose files, data or
the partition gets corrupted. That's why if a defrag is in progress, I
leave the system alone until it's done. Just to be safe.


Actually, no. Microsoft provides an API specifically for
defragmentation. It's also used in some cases for block
moves by various tools. The design is, in fact, intended
to be "safe". Microsoft didn't write it originally;
it was inherited from the contracting-out of the defrag
program. (The defrag in WinXP was written by a third party.)
Now, that particular API is available to *all* third parties
wishing to defrag safely. The author of the JKDefrag program,
for example, can say "it's safe" because he calls only
the MS API to make changes to your disk.
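The real mechanism is the defragmentation ioctl family (FSCTL_GET_RETRIEVAL_POINTERS to read a file's extents, FSCTL_MOVE_FILE to relocate clusters, issued through DeviceIoControl). The sketch below is only a toy illustration, with invented names, of why such API-mediated moves are safe: data is copied first, and the file's cluster mapping is switched in a single final commit, so an interruption mid-move leaves the old, valid mapping in place.

```python
# Toy model of a "safe" defrag move: data is copied to the destination
# clusters first, and the file's mapping is switched as one final step.
# If the process dies before that step, the old mapping (and old data)
# remain valid. All names are invented; the real Windows mechanism is
# the FSCTL_MOVE_FILE ioctl handled inside the filesystem driver.

class ToyVolume:
    def __init__(self, clusters):
        self.clusters = clusters          # cluster index -> data
        self.mapping = {}                 # file name -> list of cluster indexes

    def read_file(self, name):
        return "".join(self.clusters[c] for c in self.mapping[name])

    def move_file(self, name, new_clusters, crash_before_commit=False):
        old = self.mapping[name]
        # Step 1: copy the data into the destination clusters.
        for src, dst in zip(old, new_clusters):
            self.clusters[dst] = self.clusters[src]
        if crash_before_commit:
            return                        # simulated crash: mapping untouched
        # Step 2: the commit -- switch the mapping in a single assignment.
        self.mapping[name] = list(new_clusters)

vol = ToyVolume({0: "he", 5: "llo", 1: "", 2: ""})
vol.mapping["greeting.txt"] = [0, 5]      # fragmented: clusters 0 and 5
vol.move_file("greeting.txt", [1, 2], crash_before_commit=True)
print(vol.read_file("greeting.txt"))      # still "hello": old mapping survives
vol.move_file("greeting.txt", [1, 2])
print(vol.mapping["greeting.txt"])        # now contiguous: [1, 2]
```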

The only way you might screw things up is if defragmenting
a *damaged* file system.

An example of a damaged file system, would be one on WinXP,
where regular CHKDSK is not done. It's up to the user to
verify partitions there. Some users will never run CHKDSK,
and latent faults could accumulate in the file system.
Typically this happens, just before the disk is about
to die :-)

The latest Windows OSes have a continuous CHKDSK process
they run. This reduces the odds of a file system being
damaged. So when you fire off a defrag, you can be
fairly confident there aren't any latent faults
in the file system being defragmented. This is better
than running fsck every hundred days :-)

And you can see this when you run CHKDSK from the GUI.
(Right-click the partition in Win10, do Properties, then
Tools, then Error Checking.) I've not seen a really detailed
article describing what the background consistency checks
are, but Microsoft is confident enough to be using the
language in this dialog box.

https://s29.postimg.org/j5gutzi9j/br..._interface.gif

The interface will put up a dialog to the effect of "we haven't
seen any problems, but you can scan this anyway", and then
it allows the CHKDSK scan to proceed. The reason for the
snotty dialog is to point out that a background maintenance
thing is running on that partition, to ensure there won't
be surprises.

If you do that on WinXP, there won't be that dialog, because
they *haven't* been running CHKDSK on your behalf. It's up
to you to CHKDSK/fsck every once in a while, and especially
if you did something like power-off mid-session. While the
Dirty Bit could be set per partition, it isn't always, so I
don't take chances after power events; I scan everything.

The defragmenter can safely run at the same time as the partition
is doing heavy I/O. The user would be an idiot to attempt
that, though, as the defragmenter cannot make much progress
while thrashing (mixing) with other I/O present. File integrity
is still there, but no benefit would be gained from mixing
the two activities. Defrag is scheduled to run at night, but
nothing prevents Microsoft's ham-fisted maintenance activities
from causing high I/O while the defrag runs (there could be
a three-hour Search Indexer operation in flight, and I don't think
Windows Update really "waits" for any other process to stop).
I see all sorts of edge cases in Win10 where a better "order"
to maintenance would have worked. I only see this stuff
because of the annoyance it causes.

Paul
  #37  
Old August 22nd 17, 09:40 PM posted to alt.comp.os.windows-10
Paul[_32_]
Default Defrag

Rene Lamontagne wrote:
Hi Stef, just to further your automotive knowledge: the 1952 Buick ran
a DynaFlow transmission. It was so slow on acceleration that we called
it a DipaFlow. The side holes were originally engine compartment vents
but were later plugged.
From an old mechanic of the '50s and '60s, now a retired Power
Engineer. :-)

Regards, Rene


They were a lot simpler back then.

https://www.hemmings.com/magazine/hc...r/3695971.html

The new transmissions come with 3D pictures. That's
how much we've progressed. Just imagine what it would cost
to have to replace a thing like this. $5K ? $7K ?

http://media.caranddriver.com/images...s-1280x782.jpg

Paul
  #38  
Old August 22nd 17, 11:41 PM posted to alt.comp.os.windows-10
Char Jackson
Default Defrag

On Tue, 22 Aug 2017 19:01:30 +0000 (UTC), Stef
wrote:

On 21/8/2017 10:13, Char Jackson wrote:

On Mon, 21 Aug 2017 15:25:44 +0000 (UTC), Stef
wrote:

But defragging is an inconvenience at the very least.


Fragmentation is not an inconvenience for me. The last time I manually
defragged one of my own systems was at least 15 years ago. I don't even
own any of those systems anymore. None of my current PCs have ever been
manually defragged. If I check the fragmentation percentage, it's always
around 2-3%, which I don't think anyone would consider to be a problem.


Then you're one of the lucky ones.

And whether done manually or automatically, defragging still uses system
resources, and if you happen to be using the system at the same time
it will probably be sluggish. And if you try to access the volume that is
being defragged, you may not be able to, or it will be slow if you
can, and there is the chance of a fault where you lose files or data, or
the partition gets corrupted. That's why if a defrag is in progress, I
leave the system alone until it's done. Just to be safe.


Yes, of course, defragging uses system resources, just like anything
else, but it's not a reason to be afraid of poor system response or even
data loss. If you're using a defragger that has EVER caused you data
loss, or has EVER denied you access while it's running, I'd say two
things: Why do you feel you need to use a defragger, and why on earth
would you continue to use that one? Dump it ASAP. No respectable
defragger does any of those things.

The real point I was trying to make, though, is that you don't need to
worry about fragmentation. You apparently disagree, which is fine.

But on Windows defragging is still necessary whether it's automatic or
manual. And the whole point of this thread is that NTFS is OLD (25
years!) technology that is past due for replacement. MS did release
ReFS to address NTFS' shortcomings, but ReFS still needs defragging just
not as frequently -- as far as I can determine.


Agreed, NTFS is old, but what problem(s) are you having with it? If your
biggest, or only, complaint is fragmentation, then I'd say there really
isn't an issue. In the next section you mention some perceived
shortcomings, but fragmentation didn't make the list.

But there are still millions of
spinning drives out there, and will be for a long, long time. NTFS
lacks the improvements of today's modern file systems particularly
speed, reliability, and file integrity. Its basic design just does not
permit those features to be added to it. Hence ReFS as a stop-gap
"fix." It's time NTFS was replaced.

Just consider the time wasted by HAVING to defrag to keep HD performance
optimum and the real possibility of files being lost or damaged.
Wouldn't it be better not to have to?


I don't think we've had to for about 15 years.


You mean manually? Just because you yourself aren't defragging doesn't
mean it isn't being done and necessarily so. So, your statement is
misleading. And if your system is turned off -- most people don't leave
their machines running 24/7 -- when the defrag is scheduled, it
doesn't get done.


Your whole complaint has been that you don't think you should have to
defrag your hard disks. Well, you don't, and you haven't had to for a
very long time now. Yes, Windows takes care of it behind the scenes, but
Windows does a lot of things behind the scenes. If you routinely don't
have your PC on when a scheduled task is set to run, you might want to
reconfigure that task to run at a different time. Or every now and then,
leave the PC on overnight when a task is scheduled to run, or just
run the task manually every so often.

Bottom line, in the vast majority of cases there's no reason to worry
about fragmentation anymore. Unless you just want to, of course. Some
people like the fragmentation graphic to be pretty. I get that.


  #39  
Old August 22nd 17, 11:54 PM posted to alt.comp.os.windows-10
Char Jackson
Default Defrag

On Tue, 22 Aug 2017 16:14:35 -0400, Paul wrote:

Microsoft provides an API specifically for
defragmentation. It's also used in some cases, for block
moves by various tools. The design is in fact, intended
to be "safe". Microsoft didn't write that originally,
and it was inherited from the contracting of the defrag
program. (The defrag in WinXP was written by a third-party.)


I seem to remember the built-in defrag being a stripped version of
Diskeeper. Does that sound right?


  #40  
Old August 22nd 17, 11:54 PM posted to alt.comp.os.windows-10
Rene Lamontagne
Default Defrag

On 8/22/2017 3:40 PM, Paul wrote:
They were a lot simpler back then.

https://www.hemmings.com/magazine/hc...r/3695971.html

The new transmissions come with 3D pictures. That's
how much we've progressed. Just imagine what it would cost
to have to replace a thing like this. $5K ? $7K ?

http://media.caranddriver.com/images...s-1280x782.jpg


Paul


At least $9K. I see some with 9 speeds!!! What a monster to overhaul!
I did Powerglides, Hydra-Matics, and the Chrysler 727, which was the
nicest one to overhaul. It was light, sturdy, and could handle 450 HP
way back in the '60s.

Rene


  #41  
Old August 23rd 17, 01:00 AM posted to alt.comp.os.windows-10
Paul[_32_]
Default Defrag

Char Jackson wrote:
On Tue, 22 Aug 2017 16:14:35 -0400, Paul wrote:

Microsoft provides an API specifically for
defragmentation. It's also used in some cases, for block
moves by various tools. The design is in fact, intended
to be "safe". Microsoft didn't write that originally,
and it was inherited from the contracting of the defrag
program. (The defrag in WinXP was written by a third-party.)


I seem to remember the built-in defrag being a stripped version of
Diskeeper. Does that sound right?



That appears to be the one.

https://en.wikipedia.org/wiki/Diskeeper

"The defragmentation program which is included with
Windows 2000, Windows XP and Windows Server 2003 is
based on a basic version of Diskeeper.
"

Paul
  #42  
Old August 23rd 17, 07:49 AM posted to alt.comp.os.windows-10
Martin Edwards
Default Defrag

On 8/22/2017 9:14 PM, Paul wrote:
Stef wrote:

there is the chance of a fault and you loose files, data or
the partition gets corrupted. That's why if a defrag is in progress, I
leave the system alone until it's done. Just to be safe.


Actually, no. Microsoft provides an API specifically for
defragmentation. It's also used in some cases, for block
moves by various tools. The design is in fact, intended
to be "safe". Microsoft didn't write that originally,
and it was inherited from the contracting of the defrag
program. (The defrag in WinXP was written by a third-party.)
Now, that particular API is available to *all* third parties
wishing to defrag safely. The JKDefrag program for example,
the author can say "it's safe", because he calls only
the MS API to make changes to your disk.
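Purely as an illustration of the job that API does (this is not the
Microsoft API itself; the function names below are made up for the
sketch), a toy model of defragmentation is just "gather each file's
scattered blocks into one contiguous run":

```python
# Toy illustration of what a defragmenter does: gather each file's
# scattered blocks into one contiguous run. Purely illustrative --
# the real Windows mechanism moves clusters through the OS-provided
# defrag interface while the volume stays mounted, so the file
# system is never left inconsistent.

def fragments(layout, name):
    """Count contiguous runs of `name` in the block layout."""
    runs = 0
    prev = None
    for block in layout:
        if block == name and prev != name:
            runs += 1
        prev = block
    return runs

def defragment(layout):
    """Rewrite the layout so each file's blocks are contiguous,
    keeping files in the order they first appear."""
    order = []
    counts = {}
    for block in layout:
        if block not in counts:
            order.append(block)
            counts[block] = 0
        counts[block] += 1
    result = []
    for name in order:
        result.extend([name] * counts[name])
    return result

disk = ["A", "B", "A", "free", "B", "A"]
clean = defragment(disk)
print(clean)                                              # ['A', 'A', 'A', 'B', 'B', 'free']
print(fragments(disk, "A"), "->", fragments(clean, "A"))  # 3 -> 1
```

File "A" goes from three fragments to one; the point of routing all
moves through the one safe API is that every intermediate state is
still a valid file system.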

The only way you might screw things up is if defragmenting
a *damaged* file system.

An example of a damaged file system, would be one on WinXP,
where regular CHKDSK is not done. It's up to the user to
verify partitions there. Some users will never run CHKDSK,
and latent faults could accumulate in the file system.
Typically this happens, just before the disk is about
to die :-)

The latest Windows OSes have a continuous CHKDSK process
they run. This reduces the odds of a file system being
damaged. So when you fire off a defrag, you can be
fairly confident there aren't any latent faults
in the file system being defragmented. This is better
than running fsck every hundred days :-)

And you can see this, when you run CHKDSK from the GUI.
(Right-click the partition in Win10, do Properties. Then
Tools, then Error Checking.) I've not seen a really detailed
article describing what the background consistency checks
are, but Microsoft is confident enough to be using the
language in this dialog box.

https://s29.postimg.org/j5gutzi9j/br..._interface.gif

The interface will put up a dialog to the effect "we haven't
seen any problems, but you can scan this anyway", and then
it allows the CHKDSK scan to proceed. The reason for the
snotty dialog, is to point out that a background maintenance
thing is running on that partition, to ensure there won't
be surprises.

If you do that on WinXP, there won't be that dialog, because
they *haven't* been running CHKDSK on your behalf. It's up
to you to CHKDSK/fsck every once in a while. And especially
if you did something like power off in mid-session. While the
Dirty Bit can be set per partition, it isn't always, so I don't
take chances after power events; I scan everything.
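For reference, the per-partition Dirty Bit Paul mentions can be
inspected from an (administrator) command prompt; these commands exist
on XP and later, though the exact output wording varies by version:

```shell
rem Query the NTFS Dirty Bit on C: (clean vs. "is Dirty" = needs checking)
fsutil dirty query C:

rem Read-only consistency check of C: (no /f flag, so nothing is repaired)
chkdsk C:
```

Running chkdsk without /f is safe on a mounted volume because it only
reports problems; repairs with /f require exclusive access or a reboot.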

The defragmenter can safely run at the same time as the partition
is doing heavy I/O. The user would be an idiot to attempt that,
though, as the defragmenter cannot make much progress when
thrashing (mixing) with other I/O present. File integrity
is still there, but no benefit would be gained from mixing
the two activities. Defrag is scheduled to run at night, but
nothing prevents Microsoft's ham-fisted maintenance activities,
from causing high I/O while the defrag runs (there could be
a three-hour Search Indexer operation in-flight, and I don't think
Windows Update really "waits" for any other process to stop).
I see all sorts of edge cases in Win10, where a better "order"
to maintenance would have worked. I only see this stuff,
because of the annoyance it causes.

Paul


What is an API?

--
Myth, after all, is what we believe naturally. History is what we must
painfully learn and struggle to remember. -Albert Goldman
  #43  
Old August 23rd 17, 07:52 AM posted to alt.comp.os.windows-10
Martin Edwards
external usenet poster
 
Posts: 181
Default Defrag

On 8/22/2017 11:41 PM, Char Jackson wrote:
On Tue, 22 Aug 2017 19:01:30 +0000 (UTC), Stef
wrote:

On 21/8/2017 10:13, Char Jackson wrote:

On Mon, 21 Aug 2017 15:25:44 +0000 (UTC), Stef
wrote:

But defragging is an inconvenience at the very least.

Fragmentation is not an inconvenience for me. The last time I manually
defragged one of my own systems was at least 15 years ago. I don't even
own any of those systems anymore. None of my current PCs have ever been
manually defragged. If I check the fragmentation percentage, it's always
around 2-3%, which I don't think anyone would consider to be a problem.


Then you're one of the lucky ones.

And whether done manually or automatically, defragging still uses system
resources, and if you happen to be using the system at the same time
it will probably be sluggish. And if accessing the volume that is
being defragged, you may not be able to, or it will be slow if you
can, and there is the chance of a fault and you lose files, data, or
the partition gets corrupted. That's why if a defrag is in progress, I
leave the system alone until it's done. Just to be safe.


Yes, of course, defragging uses system resources, just like anything
else, but it's not a reason to be afraid of poor system response or even
data loss. If you're using a defragger that has EVER caused you data
loss, or has EVER denied you access while it's running, I'd say two
things: Why do you feel you need to use a defragger, and why on earth
would you continue to use that one? Dump it ASAP. No respectable
defragger does any of those things.

The real point I was trying to make, though, is that you don't need to
worry about fragmentation. You apparently disagree, which is fine.

But on Windows defragging is still necessary whether it's automatic or
manual. And the whole point of this thread is that NTFS is OLD (25
years!) technology that is past due for replacement. MS did release
ReFS to address NTFS' shortcomings, but ReFS still needs defragging,
just not as frequently -- as far as I can determine.


Agreed, NTFS is old, but what problem(s) are you having with it? If your
biggest, or only, complaint is fragmentation, then I'd say there really
isn't an issue. In the next section you mention some perceived
shortcomings, but fragmentation didn't make the list.

But there are still millions of
spinning drives out there, and will be for a long, long time. NTFS
lacks the improvements of today's modern file systems particularly
speed, reliability, and file integrity. Its basic design just does not
permit those features to be added to it. Hence ReFS as a stop-gap
"fix." It's time NTFS was replaced.

Just consider the time wasted by HAVING to defrag to keep HD performance
optimum and the real possibility of files being lost or damaged.
Wouldn't it be better not to have to?

I don't think we've had to for about 15 years.


You mean manually? Just because you yourself aren't defragging doesn't
mean it isn't being done and necessarily so. So, your statement is
misleading. And if your system is turned off -- most people don't leave
their machines running 24/7 -- when the defrag is scheduled, it
doesn't get done.


Your whole complaint has been that you don't think you should have to
defrag your hard disks. Well, you don't, and you haven't had to for a
very long time now. Yes, Windows takes care of it behind the scenes, but
Windows does a lot of things behind the scenes. If you routinely don't
have your PC on when a scheduled task is set to run, you might want to
reconfigure that task to run at a different time. Or every now and then,
leave the PC on overnight when a task is scheduled to run, or just
run the task manually every so often.
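On Win10 the built-in defrag schedule is an ordinary scheduled task;
assuming the stock task path, it can be inspected, retimed, or run by
hand from an elevated command prompt:

```shell
rem Show when the built-in defrag task last ran and will run next
schtasks /Query /TN "Microsoft\Windows\Defrag\ScheduledDefrag" /V

rem Move it to 7 PM, when the PC is more likely to be switched on
schtasks /Change /TN "Microsoft\Windows\Defrag\ScheduledDefrag" /ST 19:00

rem Or just run the optimizer manually on C: right now
defrag C: /O
```

The /O switch performs the optimization appropriate to the media type
(defrag for spinning disks, retrim for SSDs), which is why the same
scheduled task is harmless on both.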

Bottom line, in the vast majority of cases there's no reason to worry
about fragmentation anymore. Unless you just want to, of course. Some
people like the fragmentation graphic to be pretty. I get that.


My computer is often only on for an hour a day. I still have the icons
inherited from W7. I had one part of my question answered by the first
reply, and worked the other part out myself, so I know what to do if I
change my computer.

--
Myth, after all, is what we believe naturally. History is what we must
painfully learn and struggle to remember. -Albert Goldman
  #44  
Old August 23rd 17, 09:05 AM posted to alt.comp.os.windows-10
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default Defrag

Martin Edwards wrote:

What is an API?


https://en.wikipedia.org/wiki/Applic...ming_interface

"A good API makes it easier to develop a computer program by
providing all the building blocks, which are then put
together by the programmer."

Having a collection of "building blocks", saves a *lot* of labor.
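For example, in Python the standard library's json module is an API:
one call each way replaces the parser and serializer you would
otherwise have to write yourself.

```python
# An API in action: the json module provides ready-made building
# blocks for turning data into JSON text and back.
import json

settings = {"defrag": "weekly", "volumes": ["C:", "D:"]}
text = json.dumps(settings)      # building block: serialize to text
restored = json.loads(text)      # building block: parse back to data
print(restored["defrag"])        # prints: weekly
```

The defrag API Paul describes is the same idea at the file-system
level: programs call the OS-provided building blocks instead of
poking at on-disk structures themselves.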

Paul
  #45  
Old August 23rd 17, 05:21 PM posted to alt.comp.os.windows-10
Stef
external usenet poster
 
Posts: 364
Default Defrag

On 22/8/2017 12:19, Rene Lamontagne wrote:

On 8/20/2017 1:45 PM, Stef wrote:
On 18/8/2017 09:20, Paul wrote:

[big, big snip]


Stef

Hi Stef, just to further your automotive knowledge: the 1952 Buick ran a
DynaFlow transmission. It was so slow on acceleration that we called it
a DipaFlow. The side holes were originally engine-compartment vents but
were later plugged.
From an old mechanic of the '50s and '60s, now a retired Power
Engineer. :-)


Yes, I knew Hydra-Matic wasn't in the Buicks. I just couldn't remember
the Dyna-Flow name. Thanks for reminding me. Know about the holes,
too. Later they were just a nonfunctional, artsy-fartsy design element,
like the fins that would infect car design later.

Stef
 



