PCbanter - a Windows XP help forum



Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops



 
 
  #76  
Old September 28th 18, 12:33 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops

Char Jackson wrote:
On Thu, 27 Sep 2018 17:05:06 -0400, Paul wrote:

I have used FAT32 on my sneakernet USB3 flash sticks,
and then just use 7-Zip in segmented mode to get around
the 4GB limitation. To transfer a 20GB file, 7-Zip writes
5 or 6 files to represent the whole thing, and these
can be unpacked as part of the copy operation at the
other end. It's not "highly convenient", but... it works.
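The split-and-rejoin trick isn't 7-Zip-specific; here is a minimal Python sketch of the same idea (the `split_file`/`join_files` helpers are hypothetical names, with the part size set just under FAT32's 4 GiB file-size ceiling):

```python
FAT32_MAX = 2**32 - 1  # FAT32 caps a single file at 4 GiB minus one byte

def split_file(src, part_size=FAT32_MAX):
    """Write src as src.001, src.002, ... with each part under the limit."""
    parts = []
    index = 1
    with open(src, "rb") as f:
        while True:
            chunk = f.read(part_size)
            if not chunk:
                break
            name = "%s.%03d" % (src, index)
            with open(name, "wb") as out:
                out.write(chunk)
            parts.append(name)
            index += 1
    return parts

def join_files(parts, dest):
    """Reassemble the parts at the far end of the sneakernet."""
    with open(dest, "wb") as out:
        for name in parts:
            with open(name, "rb") as f:
                out.write(f.read())
```

7-Zip's own -v (volume) switch does the same job, with optional compression on top; a real tool would also stream each part in small buffers instead of reading a whole part into memory.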


There's this thing called an Ethernet cable... ;-)


I could write a long reply, but I won't :-)

What we do in life, depends on the situation.

Paul
  #77  
Old September 28th 18, 12:53 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default NTFS

Char Jackson wrote:
On Thu, 27 Sep 2018 16:56:02 -0400, Paul wrote:

NY wrote:

Does anyone know why the Windows command can't go bigger than 32 GB

Plain and simple foolishness.

It's like the pretense of holding your little
finger in the air when drinking a cup of tea.


One of the conspiracy theories floating around out there is that
Microsoft imposed that limit (only in their format command, not anywhere
in the filesystem itself) to encourage folks to migrate to NTFS.


Sure, that's the first theory that comes to mind.

Paul

  #78  
Old September 28th 18, 01:45 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general,alt.comp.freeware
nospam
external usenet poster
 
Posts: 4,718
Default Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops

In article , NY
wrote:


What is the best filesystem to use which can handle large files over 4 GB,
and which is safe for both Windows and non-Windows to write to if a disc is
shared between the two (e.g. used as a means of copying files between the
two)? Is exFAT any safer than NTFS?


exfat and yes for that purpose.

or use a nas.
  #79  
Old September 28th 18, 02:42 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
Char Jackson
external usenet poster
 
Posts: 10,449
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

On Thu, 27 Sep 2018 16:26:32 -0700, Ken Blake
wrote:

On Thu, 27 Sep 2018 17:42:43 -0500, Char Jackson
wrote:

On Thu, 27 Sep 2018 21:10:59 +0200, "R.Wieser"
wrote:

You might be on to something, though. Rudy might have been
limiting his thinking to formatting a 1 TB drive via the Windows
format command,

Yes, I have. Funny how you, or at least I, tend to use the OS's own tools
to manage it. No idea why that would be. Do you have any?


In many cases, Windows tools are far from the best available, including
formatting tools, partition management, system/data backups,
antivirus/antimalware, and so on.



Not to mention browsers, e-mail programs, word processors, graphics
editors, etc. Yes, these aren't strictly tools, but the same thing
applies to such programs.


If you're limiting yourself to what
Windows provides, you're probably coming up short.




Yes. Unfortunately many (probably most) people do that, either because
they think that what comes with Windows is the best or because they
don't realize they have other choices.



Thanks, Ken. We're in full agreement. As you've said before, MS does
some things well and other things not so well. For the latter, we're
fortunate to have options.

  #80  
Old September 28th 18, 08:30 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
R.Wieser
external usenet poster
 
Posts: 1,302
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

Char,

Rudy, if that's what you were referring to, my apologies for not
picking up on it sooner.


Not picking up on someone, rather obviously, referring to the standard
installed-and-available tools. Hmm....

Well, I'm not someone who can hold a grudge well (costs me too much energy),
so I guess I'll accept it. Forget it, it never happened (just don't do it
again. :-) )

There's still a question about why you'd suggest doing it that way


You're mistaken, I didn't. I gave it as a *problem* when using FAT32 (with
the standard tools).

I followed it up with:
Quote:
what filesystem would you [Diesel] advise to use on Windows if-and-when
someone would not like to be bound to the OS?
So, forget FAT32 - I would like to be able to save big files, like images of
certain DVDs, on my 'puter thankyouverymuch - and please suggest something
that could replace or extend the capabilities NTFS has. And I do like its
hardlink feature (even though XP, above kernel level, hasn't got any support
for it). It allows me to make sets of full, browsable data backups without
a gazillion of duplicate files.

So, any ideas ?
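Rudy's hardlinked backup-set scheme can be sketched in a few lines of Python (the function name and the staleness check are mine, not his; `os.link` maps to CreateHardLink on Windows/NTFS):

```python
import os
import shutil

def snapshot(source, prev_snap, new_snap):
    """Build a full, browsable backup tree under new_snap; files that
    are unchanged since prev_snap are hardlinked, not duplicated."""
    for root, _dirs, files in os.walk(source):
        rel = os.path.relpath(root, source)
        os.makedirs(os.path.join(new_snap, rel), exist_ok=True)
        for name in files:
            src = os.path.join(root, name)
            new = os.path.join(new_snap, rel, name)
            old = os.path.join(prev_snap, rel, name) if prev_snap else None
            if (old and os.path.exists(old)
                    and os.path.getsize(old) == os.path.getsize(src)
                    and os.path.getmtime(old) >= os.path.getmtime(src)):
                os.link(old, new)       # unchanged: share the data blocks
            else:
                shutil.copy2(src, new)  # new or modified: real copy
```

Each snapshot then browses like a complete copy, but an unchanged file costs only a directory entry.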

Even if I'm rather hesitant to use such third-party stuff, I'm not dumb
enough to outright reject it. It's how I came to use FF instead of IE -
even though I still use OE for email and reading newsgroups.

Regards,
Rudy Wieser


  #81  
Old September 28th 18, 09:34 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
R.Wieser
external usenet poster
 
Posts: 1,302
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

Char,

If you can boot an XP install CD, you can presumably boot
another CD before that, containing utilities to manage your
drive and its partitions.


That's possible, but definitely non-standard, and it needs you to be (very?)
aware of what you're doing - and why. Besides being able to infer, and
accept, the possible risks of it, of course.

That's what I remember doing back in the day.


Consider me conservative, but I'd rather let an installer run its due course
(even if I often go for the "advanced" mode :-) ); there's less chance of
running into problems (at that moment, or somewhere in the future) that way.

You're right, in hindsight "no one" was overly harsh. I used it
as a figure of speech,


I know it's almost always used as a (bad) "figure of speech". But as I felt
you were yanking my chain, I felt no qualms in dotting some "i"s.

but perhaps not a particularly good one.


Seeing that it's most often used in an adversarial way (claiming "I have
the world backing me up, and you are alone"), I would dare to say that it's
almost never a good idea to use.

Then again, my guard went up, and at such times I have a hard time just
letting "probably not meant as such" stuff slide. Mea culpa.

Thanks.


You're welcome. Though I'm not quite sure what for: calling you an ass or
for having provided the FAT32 info. Maybe even both. :-)

Regards,
Rudy Wieser


  #82  
Old September 28th 18, 11:33 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default NTFS

Paul wrote:
NY wrote:

Does anyone know why the Windows command can't go bigger than 32 GB


Plain and simple foolishness.

It's like the pretense of holding your little
finger in the air when drinking a cup of tea.

Sure, it might not be a very efficient choice (making a 2TB drive into
a single FAT32). But this is a choice the user should get to make
for themselves.

I don't think I have any long-term volumes here over about 100GB
for FAT32, so I don't make a habit of testing the limit.

At one time, a WinXP machine might come with 512MB of RAM. The FAT
table for a 2TB drive might not work very well with only 512MB of RAM.
If I had a table of FAT size versus volume size, I might have an easy
reference to it. A second-best way is to use fat32formatter, as it tells
you how many writes it did when preparing a drive (creating an empty
FAT). If you have a spare drive, you can make your own table of values
in five minutes' work (keep bumping the size of the partition, run the
tool, get the FAT size number).

Paul


This is the result of some size testing.

The first command uses "defaults" in fat32format.exe .
It wrote two 256MB FATs for the ~2TB volume.
The cluster size was 32768 bytes.

The second command uses "cluster override".
The result is two 1GB FATs for the ~2TB volume.
The cluster size was 8192 bytes. I would
likely be refused if I tried setting the
cluster size one notch lower, at 4096 bytes.

*******
C:\ fat32format h:

Warning ALL data on drive 'h' will be lost irretrievably, are you sure
(y/n) :y
FSCTL_ALLOW_EXTENDED_DASD_IO OK
IOCTL_DISK_GET_PARTITION_INFO failed, trying IOCTL_DISK_GET_PARTITION_INFO_EX
IOCTL_DISK_GET_PARTITION_INFO_EX ok, GPTMode=1
Size : 2097GB 4096000000 sectors
512 Bytes Per Sector, Cluster size 32768 bytes
Volume ID is 1f1d:217d
32 Reserved Sectors, 499878 Sectors per FAT, 2 fats
63984378 Total clusters === limits total files
63984377 Free Clusters
Formatting drive h:...
Clearing out 999852 sectors for Reserved sectors, fats and root cluster...
Wrote 511924224 bytes in 2.55 seconds, 191.12 Megabytes/sec
Initialising reserved sectors and FATs...
Done

C:\ fat32format -c16 h:

Warning ALL data on drive 'h' will be lost irretrievably, are you sure
(y/n) :y
FSCTL_ALLOW_EXTENDED_DASD_IO OK
IOCTL_DISK_GET_PARTITION_INFO failed, trying IOCTL_DISK_GET_PARTITION_INFO_EX
IOCTL_DISK_GET_PARTITION_INFO_EX ok, GPTMode=1
Size : 2097GB 4096000000 sectors
512 Bytes Per Sector, Cluster size 8192 bytes
Volume ID is 8e3:444d
32 Reserved Sectors, 1998049 Sectors per FAT, 2 fats
255750241 Total clusters === room for max files
255750240 Free Clusters
Formatting drive h:...
Clearing out 3996146 sectors for Reserved sectors, fats and root cluster...
Wrote 2046026752 bytes in 10.38 seconds, 187.90 Megabytes/sec
Initialising reserved sectors and FATs...
Done

*******
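The FAT sizes in the transcript can be cross-checked with simple arithmetic (a sketch using the transcript's numbers; FAT32 spends 4 bytes per cluster entry, and this back-of-envelope version ignores the small slice the FATs themselves occupy, which is why the real cluster counts above are slightly lower):

```python
SECTORS = 4_096_000_000        # from the fat32format transcript above
BYTES = SECTORS * 512          # the ~2TB volume

def fat_bytes(cluster_size):
    """Approximate size of one FAT: 4 bytes per cluster entry."""
    return (BYTES // cluster_size) * 4

print(fat_bytes(32768))   # ~256MB per FAT, matching the default run
print(fat_bytes(8192))    # ~1GB per FAT, matching the -c16 run

# FAT32 tops out near 0x0FFFFFF5 (~268 million) clusters; 4096-byte
# clusters would need ~512 million, hence the expected refusal.
print(BYTES // 4096 > 0x0FFFFFF5)
```

This also explains the "refused one notch lower" guess: the cluster-count ceiling, not the FAT size itself, is the binding limit at 4096 bytes.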

I don't know the algorithm for FAT handling, whether the
entire thing is held in memory, or (like a lot of stuff),
the FAT is memory mapped and only loaded as needed.

Paul
  #83  
Old September 28th 18, 06:36 PM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
Char Jackson
external usenet poster
 
Posts: 10,449
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

On Fri, 28 Sep 2018 09:30:12 +0200, "R.Wieser"
wrote:

So, forget FAT32 - I would like to be able to save big files, like images of
certain DVDs, on my 'puter thankyouverymuch - and please suggest something
that could replace or extend the capabilities NTFS has. And I do like its
hardlink feature (even though XP, above kernel level, hasn't got any support
for it). It allows me to make sets of full, browsable data backups without
a gazillion of duplicate files.

So, any ideas ?


No, I'm afraid not. NTFS is still serving me quite well, so I haven't
considered what else is out there.

Are you looking for (only) Windows support, or must it be
cross-platform? Can it be Linux only, such as Reiser4? Max file size is
8 TiB.
https://en.wikipedia.org/wiki/Reiser4

You've probably been through this chart...
https://en.wikipedia.org/wiki/Compar...f_file_systems


  #84  
Old September 28th 18, 06:54 PM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default NTFS

Paul wrote:
This is the result of some size testing.

The first command uses "defaults" in fat32format.exe .
It wrote two 256MB FATs for the ~2TB volume.
The cluster size was 32768 bytes.

The second command uses "cluster override".
The result is two 1GB FATs for the ~2TB volume.
The cluster size was 8192 bytes. I would
likely be refused if I tried setting the
cluster size one notch lower, at 4096 bytes.

*******
C:\ fat32format h:


Capacity testing: Pass

Accepts ~500 files of ~4GB each.

https://i.postimg.cc/KY9j5y7K/defaul...FAT32_2_TB.gif

C:\ fat32format -c16 h:


Capacity testing: Fail

Accepts ~243 files of ~4GB each.

https://i.postimg.cc/g2gLgRmk/small_...FAT32_2_TB.gif

The conclusion is, it *is* a good idea to test your
setup before having it absolutely chock-full of files.

The "max file test" will likely take too long to
complete. The above test cases take maybe 5-7 minutes
to run (using a virtual disk attached to Disk Management).
I didn't wait two hours for a real disk to run those.

Paul
  #85  
Old September 28th 18, 08:08 PM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
R.Wieser
external usenet poster
 
Posts: 1,302
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

Char,

NTFS is still serving me quite well, so I haven't
considered what else is out there.


It does still serve me well too.

But I have a simple partition-backup idea in my head, for which I need to
be able to absolutely trust that I can extract a used-sector map from the
raw filesystem. As NTFS is proprietary and seems to change without notice
...

Are you looking for (only) Windows support, or must it
be cross-platform?


As long as it supports/runs under Windows it's good enough for me. But
given the choice, it would get bonus points if it could be read/written by
Linux too.

Can it be Linux only, such as Reiser4?


No. It's supposed to run on a Windows XP SP3 machine, as a replacement for
NTFS.

You've probably been through this chart...


I think I've seen it, but I have too little/no knowledge about most of
what's on that list (apart from a few FAT versions and a bit of ISO
9660:1988) to make even a halfway informed choice. Also, "die Qual der
Wahl" (too much to choose from). Which is why I asked for some help/hints.

Regards,
Rudy Wieser


  #86  
Old September 29th 18, 05:04 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default NTFS

Paul wrote:
This is the result of some size testing.

The first command uses "defaults" in fat32format.exe .
It wrote two 256MB FATs for the ~2TB volume.
The cluster size was 32768 bytes.

The second command uses "cluster override".
The result is two 1GB FATs for the ~2TB volume.
The cluster size was 8192 bytes. I would
likely be refused if I tried setting the
cluster size one notch lower, at 4096 bytes.

*******
C:\ fat32format h:


Capacity testing: Pass

Accepts ~500 of ~4GB files.

https://i.postimg.cc/KY9j5y7K/defaul...FAT32_2_TB.gif

C:\ fat32format -c16 h:


Capacity testing: Fail

Accepts ~243 of ~4GB files.

https://i.postimg.cc/g2gLgRmk/small_...FAT32_2_TB.gif

The conclusion is, it *is* a good idea to test your
setup before having it absolutely chock-full of files.

The "max file test" will likely take too long to
complete. The above test cases take maybe 5-7 minutes
to run (using a virtual disk attached to Disk Management).
I didn't wait two hours for a real disk to run those.

Paul


Well, you don't have to worry about the Wikipedia claim
that FAT32 supports ~250 million files. I started testing
that on Windows 10. It got to around 3 million files
and the machine crashed. Sweet.

The thing is, I was watching all the resource indicators
I could put on the screen. And it ran out of *something*,
but I don't know what. There was still memory available.

And the crash includes "turning off the video card" as
a symptom. The monitor indicates "No signal".

In previous tests of this type, I used to dump all the
files into one folder. But Explorer doesn't like seeing
more than around 50-60K files in a folder, before the
spinning wheel of death appears. So I figured this time,
I'd be kind to Explorer by dumping the files in a tree.

The test program created folders from

outdir\a\a\a\a\a to outdir\j\j\j\j\j

such that there are roughly 10^5 folders at the bottom level.
Then, as each bottom level folder is created, a thousand
files are written into the folder. This allows Explorer to
retain bladder control if you traverse the tree and look
at the files.
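Paul's tree layout can be approximated like this (a scaled-down sketch; `make_tree` and its parameters are invented for illustration, not his actual test program):

```python
import itertools
import os

def make_tree(outdir, letters="abcdefghij", depth=5, files_per_leaf=1000):
    """Create len(letters)**depth leaf folders (outdir/a/a/a/a/a through
    outdir/j/j/j/j/j by default), each holding files_per_leaf small
    files, so no single folder overwhelms Explorer."""
    total = 0
    for combo in itertools.product(letters, repeat=depth):
        leaf = os.path.join(outdir, *combo)
        os.makedirs(leaf, exist_ok=True)
        for i in range(files_per_leaf):
            with open(os.path.join(leaf, "f%04d.dat" % i), "wb") as f:
                f.write(b"x")
            total += 1
    return total
```

With the defaults that is 10^5 leaf folders and 10^8 files, far beyond the ~3 million at which the machine died; shrink letters/depth to experiment safely.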

But it still didn't help, and somewhere around 3 million files,
the machine crashed.

Paul
  #87  
Old September 29th 18, 08:47 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
R.Wieser
external usenet poster
 
Posts: 1,302
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

Char,

I woke up this morning realizing I was probably barking up the wrong tree.

The only partition that (I assume) needs a full (sector-by-sector) backup
would be the OS partition. And as that one, on my machine, occupies only
just above 6 GByte, it's easily small enough to fit on even an MS-limited
FAT32 partition. And FAT32 seems to be well documented.

The only thing I need to do is move the temp directory (as used by the
browser for temporary storage while downloading) to an NTFS partition (so I
can download large files like DVD images). And IIRC I already encountered
some information on how to do that (move the whole "documents and
settings\{username}" subtree to another spot).

In other words, my idea of replacing the filesystem (with a third-party
solution) is leaving through the side door as far as I'm concerned.

Thanks for your willingness to help though.

Regards,
Rudy Wieser


  #88  
Old September 29th 18, 09:41 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
J. P. Gilliver (John)[_4_]
external usenet poster
 
Posts: 2,679
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

In message , R.Wieser
writes:
Char,

I woke up this morning realizing I was probably barking up the wrong tree.

The only partition that (I assume) needs a full (sector-by-sector) backup
would be the OS partition. And as that one, on my machine, occupies only
just above 6 GByte, it's easily small enough to fit on even an MS-limited
FAT32 partition. And FAT32 seems to be well documented.


For XP (and I think 7), it doesn't even need sector-by-sector, if by
backup, you mean something that, when restored from, will restore a
working system. I have used Macrium (version 5 seems fine for XP and 7;
version 5 - and I think 6 and 7 - will even fit on a mini-CD). [Other
imaging software is available, Acronis being the main other one talked
about.] I have used Macrium to make an image file (on an external USB
disc - a bare system booted from the Macrium CD can see such a disc)
containing just C: and the hidden partition; when restored from that
image file to a new, blank disc, the system then booted fine from that
drive. I have done this with both XP and 7 (32 bit) - the first time to
replace a failing drive, the second just to fit a bigger one. The image
file is about the size of the _used_ parts of C-plus-hidden, so
obviously wasn't a sector-by-sector copy. (Well, it can actually be
smaller - Macrium offers compression, which is set to on [at light] by
default. I usually turn it off.)

Obviously, not doing sector-by-sector means I'd be unable to "restore"
deleted files on the new drive, as they'd not be there.

The only thing I need to do is move the temp directory (as used by the
browser for temporary storage while downloading) to an NTFS partition (so I
can download large files like DVD images). And IIRC I already encountered
some information on how to do that (move the whole "documents and
settings\{username}" subtree to another spot).


(IIRR, it's under Properties, though hard to find.)

In other words, my idea of replacing the filesystem (with a 3d-party
solution) is leaving thru the side door as far as I'm concerned.


If the _purpose_ - rather than idea - is to have a "backup" so you can
restore a working system, imaging is quicker and uses less space. (If
the backup drive happens to be FAT32, Macrium will transparently split
the image into 4G chunks, and reassemble from them on restoration. I
discovered this recently when imaging a friend's system without
realising her backup drive was FAT - I'd actually done a restoration
from it before I found out, so that process is definitely transparent!)

Thanks for your willingnes to help though.

Regards,
Rudy Wieser


--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

I have never liked children, even when I was one.
- Miriam Margolyes (RT 2014/11/1-7)
  #89  
Old September 29th 18, 11:00 AM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
R.Wieser
external usenet poster
 
Posts: 1,302
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

John,

For XP (and I think 7), it doesn't even need sector-by-sector, if by
backup, you mean something that, when restored from, will restore a
working system.


Yes, that's what I am aiming for: a way to *easily* restore to a working
system (as in: select the backup, confirm restoration, wait until done).
As for the sector-by-sector assumption, I guess that's my DOS-era knowledge
interfering. I'm still assuming that files with the "s" attribute are not
to be moved in any way - which a file restore would most likely cause.

Then again, I've seen several backup utilities show the filenames while
backing up (which even happens when using MS's own "rescue disk" method) ...
backupping (which even happens when using MSes own "rescue disk" method) ...

I have used Macrium


I'm not sure if it was Macrium I tried last, but I was truly appalled.

I had to work my way through a gazillion of options, with no real info about
what I *should* be choosing, to make a backup. And I must assume that I
have to go through a similar gazillion of choices to restore it. That's just
begging for trouble (possibly incompatible backup and restore choices).

Even if I can allow/understand that some questions need to be asked while
backing up, the answers to them should, as far as I'm concerned, be stored
together with the backup, and *at least* be offered as the default choice
when restoring.

Just consider the possibility that the person doing the restore is not the
one who made the backup, and might not be as "in the know".

Obviously, not doing sector-by-sector means I'd be unable to "restore"
deleted files on the new drive, as they'd not be there.


True. But if I were bothered by that, I would not use shift-delete for
almost all of my removal actions (but would just allow the files to be
placed into the "waste basket" instead).

Besides, I use my OS partition for (as much as possible) exactly that. Data
gets stored on another partition (on which I can, and do, make file backups).
Even my Firefox profile is located on a data partition.

(IIRR, it's under Properties, though hard to find.)


Peeking into a document I saved back in '14, it's also (of course) in the
registry, under

HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders\Local
Settings

... though I will probably re-search for the MS method, as just changing the
path does not move the already-present data (which a built-in method
probably will).

If the _purpose_ - rather than idea - is to have a "backup" so you can
restore a working system, imaging is quicker and uses less space


I'm not so sure about that. Not in speed, nor in size. For some reason,
backing up only the files (as in: without slack space) looks to me to be a
good candidate for "as small as possible".

Of course, when using a partition image you do not need to actually know
anything about the involved filesystem to be able to do a restore. Which
means that while the backup process could need a big OS (capable of
understanding the filesystem, to be able to figure out which sectors to
back up and which ones to ignore), the restoring OS could be comparatively
simple, and thus small.
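That asymmetry can be shown with a toy sparse-image format (entirely hypothetical names; a real tool must parse NTFS/FAT to build the used-sector map): the backup side needs filesystem knowledge to pick the used blocks, while the restore side just writes (offset, data) pairs back.

```python
BLOCK = 512  # sector size for the toy format

def make_image(disk, used_blocks):
    """Backup side: needs the filesystem's used-block map to know
    which sectors matter; stores only those."""
    return [(i * BLOCK, disk[i * BLOCK:(i + 1) * BLOCK])
            for i in sorted(used_blocks)]

def restore_image(image, disk_size):
    """Restore side: no filesystem knowledge at all; just writes the
    saved sectors back at their offsets."""
    disk = bytearray(disk_size)
    for offset, data in image:
        disk[offset:offset + len(data)] = data
    return bytes(disk)
```

The restore function never inspects the data it writes, which is the point: the restoring OS can be dumb and small.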

Hmm ... brainfart: I have to see if Windows "shadow copy" can also work at
sector level. That way I could use Windows itself to make the sparse disk
image.

If the backup drive happens to be FAT32, Macrium will transparently split
the image into 4G chunks, and reassemble from them on restoration


Yeah, that problem crossed my mind too. But to be honest, it looks to be
the easiest one to solve.

Regards,
Rudy Wieser


  #90  
Old September 29th 18, 12:20 PM posted to alt.comp.os.windows-10,microsoft.public.windowsxp.general,alt.windows7.general
J. P. Gilliver (John)[_4_]
external usenet poster
 
Posts: 2,679
Default NTFS (was: Quick assessment of 3 Windows tools to read/write Linux filesystems on dual-boot desktops)

In message , R.Wieser
writes:
John,

For XP (and I think 7), it doesn't even need sector-by-sector, if by
backup, you mean something that, when restored from, will restore a
working system.


Yes, thats what I am aiming for: a way to *easily* restore to a working
system (as in: select the backup, confirm restoration, wait until done).


Yup, that's how it worked for me. (There is a "shutdown when complete"
option; I usually leave that off, as I like to come back and _see_ the
"restore completed successfully" message.)

As for the sector-by-sector assumption, I guess that's my DOS-era knowledge
interfering. I'm still assuming that files with the "s" attribute are not
to be moved in any way - which a file restore would most likely cause.


(I don't _think_ "s" meant that; I'm not sure what it _does_ mean,
actually. I know it stands for "system", but not what that _means_.
_Possibly_ it's from two-floppy days, and when you did a "format /sys
b:" - or was it just "sys b:", I forget - those were the files that got
copied to the newly-formatted floppy?) I haven't come across files that
had to be in a specific place on the disc for decades - not since some
software used that as an anti-piracy measure; having said that, maybe
there _are_ files in XP/7 that have to be in a specific place - if there
are, then Macrium's image/restore process puts them there.

Then again, I've seen several backup utilities show the filenames while
backing up (which even happens when using MS's own "rescue disk" method) ...

I have used Macrium


I'm not sure if it was Macrium I tried last, but I was truly appalled.

I had to work my way through a gazillion of options, with no real info about
what I *should* be choosing, to make a backup. And I must assume that I
have to go through a similar gazillion of choices to restore it. That's just
begging for trouble (possibly incompatible backup and restore choices).


Hmm. Well, I found it not too difficult to follow. I always do my
imaging by booting the machine from the mini-CD I made (apart from
anything else, that reassures me that the CD has not "gone bad": I
_don't_ want to discover that when I actually come to need it!). Most
people here [not sure where "here" is - I think I mean XP and 7 'groups]
seem to use its ability to run from inside a running Windows; I don't
feel secure doing that. Making the CD is one of the options from Macrium
running inside Windows; I think that's the only time I've ever used the
from-inside-Windows version of Macrium.

When you boot from the CD (takes a few minutes), it defaults to assuming
you want to do a restore, so you have to select the imaging tab instead.
Then it defaults to assuming you want to image every partition in sight,
which maybe is a good default; for people like you and me, who keep data
and OS/software as separate as we can, the next step is then to *un*tick
all but the OS partition and any hidden partitions it spots; then just
choose which drive to put the image on and what to call it (it defaults
to the current date as the name IIRR). OK, it probably gives the
external drive (if that's what you're using) a different letter to what
you're used to, but you should be able to identify it by its size.

Even if I can allow/understand that some questions need to be asked while
backupping, the answers to them should, as far as I'm concerned, be stored
together with the backup, and *at least* be offered as the default choice
when restoring.


When I've restored from an image to a new drive, it has done it: it
created the C: and hidden partitions, and whatever needed doing to the
boot sector, master file table, and so on, so that the system would boot
from the new disc. (I don't know much about boot sector, MFT, etc.: I am
glad not to have to!) It recreated them the same size as on the disc
they were imaged from, and I used a partition manager to enlarge C:, and
create D:, afterwards; I think Macrium could have enlarged C: while
restoring, but I liked to do things one step at a time.

Just consider the possibility that the person doing the restore is not the
one who made the backup, and might not be as "in the know".


OK, you'd have to give _some_ instructions: it doesn't make a bootable
image that restores itself. (It might be _possible_ to, but I don't know
how. Probably not if the image is on an external drive - you'd need to
boot something that could then _access_ the external drive.)

Obviously, not doing sector-by-sector means I'd be unable to "restore"
deleted files on the new drive, as they'd not be there.


True. But if I were bothered by that, I would not use shift-delete for
almost all of my removal actions (but would just allow the files to be
placed into the "waste basket" instead).

Me too (-:

Besides, I use my OS partition for (as much as possible) exactly that. Data
gets stored on another partition (of which I can, and do, make file backups).


Yes, I basically just _copy_ my D: from time to time. (OK, I use Synctoy
- cycling between two or three destination folders - which speeds the
process considerably, after the first time.)

Even my FireFox profile is located on a data partition.

(IIRR, it's under Properties, though hard to find.)


Peeking into a document I saved back in '14, it's also (of course) in the
registry, under

HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders\Local
Settings

... though I will probably search again for the MS method, as just changing the
path does not move the already-present data (which a built-in method
probably will).


It does, once you've found it. IIRR, it actually offers to do so, rather
than doing it automatically; it's a while since I did it. I'm pretty
sure it's just under Properties of the relevant folder or pseudo-folder:
it's just a matter of selecting the right one to look at the
Properties of.

If the _purpose_ - rather than idea - is to have a "backup" so you can
restore a working system, imaging is quicker and uses less space


I'm not so sure about that, in either speed or size. For some reason,
backing up only the files (as in: without slack space) looks to me to be a
good candidate for "as small as possible".


Imaging - at the default setting in Macrium, anyway - does make an image
of only the used space. It does enough more than just copying the files,
though, that the restore is bootable.

Of course, when using a partition image you do not need to actually know
anything about the filesystem involved to be able to do a restore. Which
means that while the backup process could need a big OS (capable of
understanding the filesystem, to be able to figure out which sectors to
back up and which ones to ignore), the restoring OS could be comparatively
simple and thus small.
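That asymmetry can be shown with a toy "image" format (entirely made up for illustration, nothing like Macrium's real format): making the image needs filesystem knowledge, here reduced to an allocation bitmap saying which blocks are in use, while restoring just writes blocks back at their recorded offsets and needs to understand nothing:

```python
# Toy partition image: store only the blocks marked used in an
# allocation bitmap.  Creating the image needs the "filesystem"
# knowledge (the bitmap); restoring just writes blocks back at
# their recorded positions, so the restore side can stay simple.

BLOCK = 4  # toy block size in bytes, tiny so the example is readable

def make_image(disk, used):
    """Return (block_number, data) pairs for the used blocks only."""
    return [(i, disk[i * BLOCK:(i + 1) * BLOCK])
            for i, in_use in enumerate(used) if in_use]

def restore_image(image, total_blocks):
    """Rebuild a full-size disk; unallocated blocks come back zeroed."""
    disk = bytearray(total_blocks * BLOCK)
    for i, data in image:
        disk[i * BLOCK:i * BLOCK + len(data)] = data
    return bytes(disk)
```

The image holds only the used blocks (so it is smaller than the raw partition), yet `restore_image` contains no filesystem logic at all.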


Well, as I said, the Macrium CD I made fits on a mini-CD, so is
about 200M. IIRR, you get two options when making the CD - one that
looks a lot like Windows (I think it actually says "loading Windows"
while booting), and one that's more spartan; I chose the Windows one.
Whatever it makes, it knows enough about filesystems to restore from the
image file (one image file for multiple partitions) and make something
bootable. It also knows about USB-connected drives (NTFS or FAT) for
storing/retrieving the image file; I've always used a USB-connected hard
drive, but I suspect it knows about memory sticks too. (I just don't
like the idea of using those for backup.) I think it can use networked
drives too: when booting, it pauses and says do you want to load extra
drivers for those, and I've always said no at that point, since I'm
using a USB one.

Hmm ... brainfart: I have to see if Windows "shadow copy" can also work at
sector level. That way I could use Windows itself to make the sparse disk
image.

If the backup drive happens to be FAT32, Macrium will transparently split
the image into 4G chunks, and reassemble from them on restoration


Yeah, that problem crossed my mind too. But to be honest, it looks to be
the easiest one to solve.


Well, as I said, Macrium made that transparent - I didn't even know
about the splitting; it was only when I happened to subsequently look at
the drive, and saw lots of 4G files instead of a single file, that I
realised it had done it.
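The splitting itself is simple: FAT32 caps a single file at 4 GiB minus one byte, so any tool writing to such a drive has to cut the image into chunks no larger than that and concatenate them on restore. A minimal sketch of the idea (the function names and the `.001`/`.002` naming are my own, not Macrium's; the chunk size is a parameter so the example can be tried on small files):

```python
import os

CHUNK = 4 * 1024**3 - 1  # FAT32's per-file ceiling: 4 GiB minus one byte

def split_file(path, chunk=CHUNK):
    """Write path out as path.001, path.002, ... each at most `chunk` bytes.
    Returns the list of part filenames, in order."""
    parts = []
    with open(path, "rb") as f:
        n = 1
        while True:
            data = f.read(chunk)
            if not data:
                break
            part = "%s.%03d" % (path, n)
            with open(part, "wb") as out:
                out.write(data)
            parts.append(part)
            n += 1
    return parts

def join_files(parts, out_path):
    """Reassemble the chunks, in order, back into one file."""
    with open(out_path, "wb") as out:
        for part in parts:
            with open(part, "rb") as f:
                out.write(f.read())
```

7-Zip's segmented mode, mentioned earlier in the thread, does essentially the same thing (with compression on top), which is why a 20 GB file comes out as five or six pieces on a FAT32 stick.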

Regards,
Rudy Wieser


John
--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Listen, three-eyes, don't you try to out-wierd me, I get stranger things than
you free with my breakfast cereal. (Zaphod Beeblebrox in the link episode)
 




Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 PCbanter.
The comments are property of their posters.