Thread: C:\ Full
  #33  
Old July 8th 18, 12:52 AM posted to alt.windows7.general,alt.comp.os.windows-10
Paul[_32_]
Default C:\ Full

Char Jackson wrote:
On Sat, 7 Jul 2018 14:25:06 -0400, Keith Nuttle
wrote:

On 7/7/2018 12:54 PM, Char Jackson wrote:
On Sat, 7 Jul 2018 10:19:55 -0400, Keith Nuttle
wrote:

I am the one who suggested deleting files in SoftwareDistribution, and I
can confirm that this folder can take up a lot of space. I never
checked the exact number of bytes, but I know I have spent hours
deleting the files.
Hi Keith,
I'm curious to hear why it took hours to delete those files, when I
think it should have taken less than a minute. Do you have any thoughts
on what was going on?

Yes!
First, there were many folders, subfolders, and files. I selected all
the files in a folder and it showed about 6,000 files being deleted.

Second, it was on my tablet, which has limited memory and a slow processor.

I found an article mentioned earlier in this thread, and was shocked at
the number of files I found in this folder. When you are deleting that
many files, it takes some time even on a faster computer.


The tablet is a touch tablet, which makes it a pain to make the
inputs, and I have never found how to do the mouse operation
Click, Shift+Click to select multiple files on the touch screen.
Even if you do a select-all and delete, with thousands of files you
never know whether the system is hung or still working. (I can be a bit
impatient.) This is how I learned about setting the immediate-delete
option in the Recycle Bin.
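A way to sidestep both the Explorer overhead and the Recycle Bin
entirely is to delete from a script. This is a minimal Python sketch,
not the exact procedure anyone in the thread used; on a real
SoftwareDistribution cleanup you would also stop the Windows Update
service first, which this sketch does not do:

```python
import os
import shutil
import tempfile

def purge_folder(path):
    """Delete every file and subfolder under path, bypassing the
    Recycle Bin. shutil.rmtree removes each subtree in one pass,
    with no per-file UI updates to slow things down."""
    for entry in os.scandir(path):
        if entry.is_dir(follow_symlinks=False):
            shutil.rmtree(entry.path)
        else:
            os.remove(entry.path)

# Demo on a throwaway directory, NOT a real system folder:
demo = tempfile.mkdtemp()
for i in range(1000):
    open(os.path.join(demo, f"file{i}.tmp"), "w").close()
purge_folder(demo)
print(len(os.listdir(demo)))  # the folder is now empty
```

Because nothing here touches the shell, there is no progress dialog and
no Recycle Bin copy; the trade-off is that the deletion is immediate and
unrecoverable.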


Thanks, Keith, and thanks also to Paul for contributing his experience,
as well. I guess I've seen something similar when I needed to delete
over 100,000 jpg images and it took much longer than I expected. Not
hours, but several minutes, as I recall. On XP, that same operation used
to take about 10-20 seconds.


There is a stark difference between how Explorer.exe
works when dealing with files and how a shell-level
FindNext works.

I can do "dir" in a Command Prompt window and easily list
8 million files in the window. They're sorted in alphabetical
order in three columns. That means the command knew, pretty
close to the beginning of the output, what it had to do.
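Part of the trick is likely that NTFS stores directory entries in a
B-tree index sorted by filename, so FindFirstFile/FindNextFile can hand
names back already in order without sorting anything. And even if a sort
were needed, sorting a few million names in memory is cheap on modern
hardware, as this quick Python sketch (synthetic names, not a real MFT)
suggests:

```python
import random
import string
import time

# Generate a million synthetic file names, then sort them in memory.
names = ["file_" + "".join(random.choices(string.ascii_lowercase, k=12))
         for _ in range(1_000_000)]

t0 = time.perf_counter()
names.sort()
elapsed = time.perf_counter() - t0
print(f"sorted {len(names):,} names in {elapsed:.2f} s")

# Sanity check: the list really is in alphabetical order.
assert all(names[i] <= names[i + 1] for i in range(len(names) - 1))
```

So whatever "dir" is doing, the alphabetical output itself is not the
expensive part.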

Explorer.exe on the other hand, fails in the simplest
and least taxing of situations. It should be a
"case study for CS class" for somebody.

What I can't figure out is how "dir" and FindNext
are able to sort a 40GB $MFT in no time at all and
start outputting file names... in alphabetical order.
It's fast enough that "there's got to be a trick".

Whereas with File Explorer, around 1 million files is
a practical upper limit for expecting the window
to ever paint. And you can find or create failure
test cases for Explorer with as few as 60,000 files
in a single folder (the window will show a "busy" icon
forever).

Paul