A Windows XP help forum. PCbanter


Once again, Google proves it's bought out.



 
 
  #16  
Old October 7th 17, 05:19 AM posted to alt.windows7.general
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default Once again, Google proves it's bought out.

wrote:
On Fri, 06 Oct 2017 21:49:40 -0400, Paul wrote:

wrote:
On Fri, 06 Oct 2017 12:55:40 -0400, Paul wrote:

wrote:
My failed HDD is a Maxtor, so I went to Google and searched for "Maxtor
hard drive repair software". Almost half of the search results were for
EaseUS data recovery software. (Even claiming to be the official
software for Maxtor drives). EaseUs may or may not be worthwhile, but
it's very costly, and their so-called "free" data recovery software is
limited to 2GB of recovery. It's becoming more and more apparent that
Google has sold out to the companies who pay them the most, and EaseUS
must be one of those companies.

I guess it's time to find another search engine and dump google into the
nearest **** hole.

That's because Google knows that by the time you locate such
software, the Maxtor drive will have died and
no longer be accessible :-(

It's the ddrescue ("gddrescue") package now... or forget it.
Kiss that data goodbye. I have the dead 40GB Maxtor drive(s)
here to prove it.

*******

I also still have Maxtor drives (in the old computers) that
still work. I even have a Quantum Fireball in one of the
machines. I always liked the marketing name for that thing,
as the description neatly covers anything bad that can
ever happen ("you didn't know it was a Fireball?").

Marketing is an art.

(PowMax power supply)

https://www.newegg.com/Product/Produ...82E16817163018

https://images10.newegg.com/NeweggIm...163-018-06.JPG

Now, if one of those catches fire, the company can say
"didn't you see our promotional material? It's supposed
to work that way".

Paul
It's not gonna die any more than it is now. It's not plugged in. My
other HDD does all the booting. In fact I already have my new HDD
installed. I backed up the two good partitions on that bad drive (H: and
I:). I backed them up twice. Later tonight, I plan to copy my H: and I:
partition data to the new drive.

I don't plan to plug in that bad drive anymore. If I have to rely on
Linux, I would rather pay a professional to save my data, before having
to buy a DVD burner and all of that and then finding out I can't get
Linux to work. I did do one thing: I booted my XP machine with a
bootable flash drive that I have with an old version of PC-Linux on it.
That is the only Linux that has ever sort of worked for me and made
sense as well, since it looks like Windows. Anyhow, I had that bad drive
plugged in, and Linux showed all the folders in gibberish, and clicking
on them gave me an error. XP and Win2000 both refuse to access that
partition at all and just tell me to format it. Win98 still sees it, but
over half the folders are missing.

Fortunately, I moved a large chunk of the data that was on the bad
partition (G:) to my I: partition a few months ago. I forgot that I did
that. I did it because G: was getting too full. Comparing to my old
backups, I am probably only lacking about 6GB. I don't know how much they
charge to save data per GB, but that really reduces what needs to be
saved. Mostly just the one large folder with many subfolders. But there
are 2 other folders that have some new data. Everything else on that
partition is on my old backups.

Now, to find a reliable data recovery service......

You don't need a DVD burner for Linux.

The Live image can be copied to a USB stick. The latest
versions (the last several years) are called "hybrid": they
boot on both legacy BIOS and UEFI, and the image is designed
so that if you copy it, sector by sector, onto a USB stick,
you can plug the stick in and it boots.
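That sector-by-sector copy is literally what dd does. A minimal sketch, assuming a downloaded image named linux.iso and a stick at /dev/sdX (both hypothetical names; check which device is your stick first, because everything on it gets overwritten):

```shell
# Hypothetical names: linux.iso is the downloaded hybrid image,
# /dev/sdX is the whole USB stick (NOT a partition like /dev/sdX1).
# Everything currently on the stick is destroyed.
sudo dd if=linux.iso of=/dev/sdX bs=4M conv=fsync status=progress
sync   # make sure all blocks hit the stick before unplugging
```

Rufus (mentioned below) does the same job from a Windows GUI, which is the easier route if Linux isn't handy.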

Of the seven USB flash sticks sitting in front of me,
two of them contain a Linux ISO and nothing else. And when
Windows makes a file hard to delete, I just pop one of those
into the back of the machine, and in a couple of minutes,
job is done.

Machines before around 2005 or so, don't have USB boot,
and in that case, a CD would work. I even have a machine
only a couple years older than that, where the BIOS doesn't
know how to boot from a DVD drive, and can only boot
from a CD drive (it won't even flash the LED on the
DVD drive). And then the Linux images can only
be 700MB (CD size limit). And they haven't been that
small for some time. Currently mainstream Linux ISOs
are 1.6GB or so. I'd have to use Puppy Linux to get one
on a CD small enough.

This is an example of the Puppy family, only with perhaps
a newer kernel on it. For example, actual Puppy discs have
no drivers at all for my most modern PC (Puppy might
use a 2.6 kernel or so, when the current kernel is
around 4.4). The Fatdog one would be a little bit more modern.
This one boots on 64-bit processors.

http://distro.ibiblio.org/fatdog/iso/

Fatdog64-710.iso 2016-Dec-03 10:04:55 360MB

But you would only mess with that one, if absolutely
desperate.

There is little profit in really old computers now. You
gotta know a lot to beat some function out of them. And
if you've never done stuff like this on a six year old
machine, using a twenty year old machine is not the place
to start. There are just too many gotchas. I *like*
old computers, but nobody else in the industry does.
That's why they put SSE2 requirements in all the
software, the dumbasses. To get that old machine to
boot, I had to *install* Linux on a hard drive on a
modern machine, then move the drive to the old
computer. And then it would boot. I never expected
to have that much trouble, but that's what happens
when you dig some old gear out of the junk pile.

*******

Now, because I didn't really help you at all there,
here's a link to get you started. This will help you
get that downloaded ISO onto the USB stick, with
little to no sweat.

https://www.techspot.com/downloads/6062-rufus.html

Bon appetit,

Paul


Just for the heck of it, I plugged that drive into my newest XP machine
as a slave, and am running Recuva. I won't let it write to the drive,
but I want to see if I can pull any data off. So far I've waited over an
hour for it to do anything, and it is still trying.

I still have an urge to let Norton Utilities (Disk Doctor) do what it
wants to do to the FAT, but it seems risky. Yet that software was made
for that era of system and OS.

I have two bootable USB sticks. One is Puppy, the other is an older
version of PC-Linux. Puppy has gotten me out of a bind several times
when files get "stuck". Better yet, when I had a motherboard die and was
unable to transplant the HDD to another machine (XP), I was able to get
my data off the drive. (Win98 drives can be transplanted quite easily).

I do like PC-Linux better though. The layout is almost just like
Windows. I could learn to use that OS almost on a full-time basis. (But
only that older version; they don't even make a 32-bit version anymore.)
That's too bad too, because they had a decent product.

Yeah, I don't understand why all the new Linux distros need to be so huge.

Some years ago, I used some sort of program to make those bootable flash
drives. UNetbootin or something like that. I should still have that
software, but aside from those two sticks, I quit trying Linux. All the
big ones would never boot. However, even PC-Linux won't boot on the
default setting; I have to use VESA mode, so I assume the problem is
video related. I recall trying Mint, which froze the whole computer.

I did download ddrescue in a zip. It installed on XP but doesn't seem to
do anything except draw a graph on the screen. I thought it was the ISO
file, so I am not sure where to get that. Plus I can't do it on dial-up
anyhow. I know some Linux distros you can buy on a CD or DVD, but not
that one, from what I saw on DistroWatch. Then again, I don't think my
drive is for DVDs anyhow, just CDs, and it's only a reader.

Can ddrescue be made bootable on a USB stick? Can I use that UNetbootin?
(Of course, first I will need to get the proper ISO.)

In many ways, I think I should just get a pro to do this for me. Losing
that data will really screw up my home business, and before I spend a
bunch of money on software or DVD machines, or whatever, I'm willing to
pay some company knowing I have a better chance of getting my data back.

All I do know is, from now on, I will back up more than twice a year. The
problem is that with my old computer, I only have USB 1.1, and that takes
up to 3 days to back up all 7 partitions. I do it overnight, but it's
still real slow. However, plugging this bad drive into my much newer XP
machine, I was able to back up H: and I: in less than half an hour. My E:
partition is 70GB, and that one takes around 16 hours to back up using
USB 1.1. That's the worst one.

Thanks for trying to help! I'm no geek, so a lot of this stuff goes
right over my head....



If your older machine has a PCI card slot, you can
get a USB2 card for around $10. Depending on the
whims of the sellers.

You can also get a USB3 card for PCI, but they are
quite expensive. Could be as much as $100 for one of those.
The way those work, is the card has two chips (big deal, right).

One chip converts PCI bus to PCI Express. The PCI bus is
limited to around 100MB/sec of practical transfer rate.
Next, a PCI Express USB3 chip is connected. The card
looks like this.

--- PCI to PCI Express ------- PCI Express to USB3 --- (two or four connectors)

The parts cost for that might be $5 for the first chip
and maybe $5 for the second chip. But because they
don't make very many of the cards, the cards end up
with quite high prices. Really too expensive to be
slapping into old computers (which is the intention
when building cards like that).

The normal high volume cards are like this.

--- PCI Express to USB3 --- (two or four connectors)
for x1 slot

and might cost $20-$25 or so. But if your old
computer doesn't have that x1 slot, and is too old to
have had PCI Express, things get a *lot* more expensive.

Your options:

USB 1.1 - 1MB/sec practical transfer rate (my Mac G4 does this!)

USB 2.0 - 30MB/sec practical transfer rate
- readily available as a PCI card, and cheap

USB 3.0 - 100MB/sec as a PCI card
- 400MB/sec+ as a PCI Express x1 Rev2 card
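Put numbers on those rates: moving a 70GB partition (the size mentioned upthread) at each practical speed works out roughly as follows. A quick awk sketch:

```shell
# Time to move a 70 GB partition at each practical rate above.
# 70 GB ~= 70,000 MB; hours = MB / (MB/s) / 3600.
for rate in 1 30 100; do
  awk -v mbps="$rate" 'BEGIN {
    printf "%3d MB/s: %5.1f hours\n", mbps, 70000 / mbps / 3600
  }'
done
```

About 19 hours at USB 1.1 speed, which lines up with the "around 16 hours" reported upthread; USB2 cuts the same job to well under an hour.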

Examples:

1) The USB3 for PCI (100MB/sec) card. This one has a floppy
power connector, for an older computer. The aux power is a good
idea if using the ports for charging, or known large electrical
loads. I don't have the aux power connected on mine. The aux
power on mine, is a 15 pin SATA power connector (older power
supplies don't have that connector, and you need a molex to
sata cable to handle that case). Don't panic, if you select
a card that only has the 15 pin power, as you can buy an adapter
cable to drive it. Since this card also has the floppy connector,
your PSU may have a floppy connector to use instead.

PCI to USB3 four port $62
https://www.newegg.com/Product/Produ...82E16815158361

*Don't* buy one of those, without driver support. The
driver support list is:

Windows XP(32/64-bit)
Server 2003(32/64-bit)
Server 2008 R2
Vista(32/64-bit)
Win7(32/64-bit) --- OS needs driver, no native USB3 support
Win8(32/64-bit) --- OS should have driver already

2) PCI to USB2 four port $13
https://www.newegg.com/Product/Produ...9SIA1JM2F46982

Windows XP/Vista/7/8/8.1, Linux, Mac OS

It's possible the OS has the driver already, if you have
at least WinXP SP3. I suspect there's no driver CD in the box,
as it's not in the picture.

When you add cards like that, you cannot boot from them.

As for your USB 1.1 ports: I'm not aware of any USB 1.1 ports
that booted. Booting from USB became a feature in the
USB2 era. If a machine has USB2 ports in the I/O plate
area, the odds are a lot better that those will boot
a USB key.

HTH,
Paul
  #17  
Old October 7th 17, 06:11 PM posted to alt.windows7.general
No_Name

On Sat, 07 Oct 2017 00:19:04 -0400, Paul wrote:

[]


Years ago, I installed a USB2 card in this computer. It lasted several
years, then it just quit working. Two years ago, I bought another one.
That one lasted about a year, then it quit working too. I have just
given up on them since. USB2 is a lot faster than 1.1. In fact, I
don't see any reason to go to USB3. I think USB2 is fine.

That second PCI card is still in the computer, but it does nothing now.
I always have to boot into Win2000 to use any USB capabilities. I
have a couple of 2GB flash drives that Win98 recognizes, that's all. I
use those flash drives to transfer small files.

Half of a miracle occurred. I plugged my bad drive into my XP machine,
using the USB-to-hard-drive adaptor that I bought. I ran Recuva on
it. It took that software nearly 2 hours to go through that partition.
When it finished, I was very disappointed. Out of about 23GB of data on
the drive, Recuva found 63 files, of which only 50-some were
recoverable. All of them were small JPGs or text files.

That was a waste of time.....

But then I plugged that bad drive back into my Win98 machine, and ALL
the folders came back. Every folder that was on that partition is now
back, and I can go from folder to subfolder and see all the files. That
allowed me to go through everything and determine which folders are on
my old backup and which ones are newer. How that folder structure
returned is beyond me, but that was good news. But it's not all good. I
began copying everything that's not backed up to my C: partition. I can
only save about 60% of the files, and have to copy one file at a time
while manually making folders. For SMALL files, I can save about 4 out
of 5, but for large files it's less than 50%.

I've spent hours copying files, and have many more hours to go, but I
will save what I can. After that, I have to decide if I want to get a
pro data recovery business, or do something more drastic on my own.

How the folders came back from the dead is beyond me. I can only think
that Recuva brought them back when it scanned the drive, or else it
has something to do with my removal of all the files on the H: and I:
partitions after backing them up twice. Oddly enough, XP sees that bad
G: partition as unformatted. Windows 2000 does the same, but Win98 is
seeing it as a valid partition with a lot of data. I just can't copy
much of the data.



  #18  
Old October 7th 17, 08:12 PM posted to alt.windows7.general
Paul[_32_]

wrote:
On Sat, 07 Oct 2017 00:19:04 -0400, Paul wrote:

[]


On the *Destination* drive, you can gather up a collection
of freshly copied folders and compress them. Use 7Z Ultra for
example. That's how you can make storage space go further.
Note that the "technician computer" you're using must be
"healthy" and not known for RAM errors, crashing, or
instability. You need a perfectly stable computer to trust it
to compress files. *Don't* compress your only good copy of
a file on a known-to-be-flaky computer.

https://en.wikipedia.org/wiki/7zip

Downloads

http://7-zip.org/

For example, on some of my backup drives, I can
save 500GB of space, simply by compressing the
Macrium Reflect MRIMG files. I only do that in cases where
I know I won't be referring to those files for some
time. So my "regular" "comprehensive" backups
get compressed. The "spur of the moment" backups
are left uncompressed, or they're deleted when
I need the space. If I delete a file by accident,
I look to the "spur of the moment" backups first,
to see if the file is there.

To really work with compressed content, you need
at least one good-sized spare (extra) hard drive, to give
you space to work. Obviously, if you compress the ****
out of groups of folders, a group at a time, they're
not going to fit on the drive if you decompress them
all at the same time.

And it takes a *long* time to do Ultra compression.
You can use GZIP compression or WinZIP compression
as a good space/time tradeoff. Ultra is for when you're
squeezing the last drop out of your drive. WinRAR
is about as capable as 7-Zip, except it's commercial,
and I don't know what it costs. WinRAR decompression
is likely "free", so archives are never trapped in
the format. But compression is likely to cost a
few bucks.

To compress an entire hard drive, on my multi-core
computer, takes all day (24 hours), just to give you
an idea how long you could wait when doing really
really large folders of stuff. The Test Machine
does 7Z at 18MB/sec, whereas the machine I'm
typing on can only manage 2-3MB/sec. Just to give
some ballpark numbers for the slowest methods.

*******

If you enable NTFS compression on the target drive,
the OS can compress files when you write them on the
target drive. But the degree of compression is poor.
Even WinZIP or GZIP achieves better compression than
NTFS compression. Nevertheless, I've used it. When I was doing
a Google Chrome build on a small drive here, I enabled
NTFS compression just so it wouldn't run out of space.
And when the build finished, there was only 3GB of space
left, so the compression paid off. The build would have
bombed with "out of space", if I hadn't enabled NTFS
compression in that case.

NTFS compression (equivalent to LZ4 maybe, fast, not efficient)

GZIP/WinZIP (better compression, a bit slower)

7ZUltra or WinRAR (best compression, really slow)

The one you use depends on your degree of
desperation :-) I would not recommend extensive
compression runs on a lowly P4, as the wait could
drive you nuts. Even on my current machine, which only
manages 2-3MB/sec, it's pretty hard to take.

A ThreadRipper for a thousand bucks, that's what you
want for compression. But hardly anybody can afford
stuff like that.

And if you want multithreaded GZIP, there is a
compressor called "PIGZ". But the fit and finish
isn't perfect on it - the header info needs a
slight tweak, and the port never had that fixed on
it. It's possible the Linux version of PIGZ has
the header fixed. The GZIP compression option in
the 7ZIP package, is only single-threaded. If you have
a single core P4 without Hyperthreading, then these
distinctions don't matter, and PIGZ and GZIP run
at the same relative speed.
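The single- vs. multi-threaded distinction in practice, as a quick sketch (gzip is everywhere; pigz usually needs installing, so it's shown commented out; "bigfile" is a hypothetical file name):

```shell
# Same .gz output format either way; pigz just spreads the deflate
# work across cores. -k keeps the original, -9 is max compression.
gzip -k -9 bigfile
# pigz -k -9 -p4 bigfile   # multithreaded equivalent, if installed
gzip -t bigfile.gz          # integrity check of the result
```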

*******

So how do you do compression, anyway? A good question.

Source folders (select and "Add to Archive")

Source --- 7ZIP --- blah.7z output

When the compression step is complete, you delete
the Source folder.

xxx --- 7ZIP --- blah.7z output

Now you have room to copy some more stuff off the
other drive, onto the recovery drive. When you have
another batch ready, you can compress those.

Source2 --- 7ZIP --- blah2.7z output
xxx --- 7ZIP --- blah2.7z output

By inching along that way, you may be able to squeeze
the whole archive onto your only output drive. When
a set of folders is compressed, you delete the source
folder. Note that if you right-click a .7z file,
7-Zip has a "Verify" or "Test" option, which verifies
the checksums and can tell you if something untoward
happened. It cannot detect all hardware failure
conditions, but it can sometimes help if you
suspect trouble. *Don't* delete the Source until
you're feeling good about "blah.7z" and its
integrity.
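One iteration of that inch-along loop, sketched with stock tools; tar/gzip stands in for 7-Zip here for portability (swap in "7z a -mx=9" plus "7z t" if 7-Zip is installed). "batch1" is a hypothetical folder of freshly copied files:

```shell
# One iteration of the inch-along workflow:
tar czf blah1.tar.gz batch1/   # compress one batch of rescued folders
gzip -t blah1.tar.gz           # verify the archive's integrity first...
rm -rf batch1/                 # ...and only then delete the source
# Now there is room to copy the next batch off the bad drive,
# compress it as blah2.tar.gz, and so on.
```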

*******

A better answer is having a good drive to use
in the first place. Doing recovery onto a too-small
drive isn't that much fun. And I'm sure you've taken
that into consideration when buying drives for your
setup. Not every computer is going to handle the
latest and greatest stuff, and even I have budget
limits. I haven't bought one of those 50TB SSD
drives yet :-) They probably cost as much as my
house.

Paul
  #19  
Old October 7th 17, 10:02 PM posted to alt.windows7.general
J. P. Gilliver (John)[_4_]

In message , Paul
writes:
wrote:

[]
the folders came back. Every folder that was on that partition is now
back and I can go from folder to sub folder and see all the files. That
allowed me to go thru everything and determine which folders are on my
old backup, and which ones are newer. How that folder structure
returned, is beyond me, but that was good news. But it's not all good. I
began copying everything that's not backed up, to my C: partition. I can
only save about 60% of the files, and have to copy one file at a time,
while manually making folders. For SMALL files, I can save about 4 out
of 5, but for large files it's less than 50%. I've spent hours
copying files, and have many more hours to go, but I
will save what I can. After that, I have to decide if I want to get a
pro data recovery business, or do something more drastic on my own.
How the folders came back from the dead is beyond me. I can only

[]
G: partition as unformatted. Windows 2000 does the same, but Win98 is
seeing it as a valid partition with a lot of data. I just cant copy much
of the data.


On the *Destination* drive, you can gather up a collection
of freshly copied folders and compress them. Use 7Z Ultra for
example. That's how you can make storage space go further.


I don't think shortage of space for rescued files is anonymous's main
problem at the moment.
[]
For example, on some of my backup drives, I can
save 500GB of space, simply by compressing the
Macrium Reflect MRIMG files. I only do that in cases where


Out of curiosity, what compression do you let Macrium itself do? From
what I remember, Macrium 5 offers a choice of something like no
compression, moderate compression (recommended), and high compression. I
tend to use none. I suspect later versions of Macrium haven't changed
much in this area.
[]
Anonymous: how are you remembering which files have been saved
successfully (either now, or because they were backed up previously)?
I'd be tempted to delete them (and any folders that are then empty) from
the flaky drive, so that only the ones still to be rescued are still
visible; however, that would involve writing to the flaky drive
(deleting is just modifying folder data), which is generally a Bad Idea.
[It's what I did when I had a flaky drive, though - well, I did move
rather than copy, which of course does a copy then a delete if the copy
was successful.]

For the hard-to-read files, I'd be tempted to seek - or write -
something which opens them, then copies them byte by byte (to a file on
a good drive) until a read error occurs; that way you'd have at least
part of the file, which may or may not be usable. (Ideally, something
which then carries on after the bad patch, maybe writing blanks to the
copy for the unreadable bytes - so that the copy at least is the same
size and has the tail, which for some filetypes - I think .zip is one -
is where important information about the contents is.)

Paul (or anyone else) - do you know of any such utility? [Preferably not
involving command lines, either in Windows or Linux (-:!] I did (a long
time ago - I think in BBC BASIC!) write one that did the first part
(copy byte by byte until error), but not beyond. [IIRR, BBC BASIC closed
the input file when there was an error reading.]
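(Plain dd, for what it's worth, approximates this: conv=noerror,sync keeps reading past errors and pads each unreadable block with zeros, so the copy keeps its size and its tail. A sketch, with hypothetical paths:)

```shell
# Keep reading past errors; pad each unreadable block with zeros so
# the copy keeps its size (and its tail). A small bs keeps the zeroed
# holes small at the cost of speed. Paths are hypothetical.
dd if=/path/to/flaky.dat of=rescued.dat bs=512 conv=noerror,sync
```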
--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Radio 4 is one of the reasons being British is good. It's not a subset of
Britain - it's almost as if Britain is a subset of Radio 4. - Stephen Fry, in
Radio Times, 7-13 June, 2003.
  #20  
Old October 7th 17, 10:24 PM posted to alt.windows7.general
J. P. Gilliver (John)[_4_]

In message , "J. P. Gilliver
(John)" writes:
[]
[]

Paul (or anyone else) - do you know of any such utility? [Preferably
not involving command lines, either in Windows or Linux (-:!] I did (a
long time ago - I think in BBC BASIC!) write one that did the first
part (copy byte by byte until error), but not beyond. [IIRR, BBC BASIC
closed the input file when there was an error reading.]


Actually, I've just remembered: IrfanView will do the first part - read
up to error - and then let you save; however, it may only be for certain
types of file (I think it does for JPEG images, for example; it fills
from the failure to the end with grey).
--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

The smallest minority on earth is the individual. Those who deny individual
rights cannot claim to be defenders of minorities. - Ayn Rand, quoted by Deb
Shinder 2012-3-30
  #21  
Old October 7th 17, 11:09 PM posted to alt.windows7.general
Paul[_32_]

J. P. Gilliver (John) wrote:


[]

Paul (or anyone else) - do you know of any such utility? [Preferably not
involving command lines, either in Windows or Linux (-:!] I did (a long
time ago - I think in BBC BASIC!) write one that did the first part
(copy byte by byte until error), but not beyond. [IIRR, BBC BASIC closed
the input file when there was an error reading.]


That's ddrescue, made available as "gddrescue" package on Linux.
Near the bottom of this page. It's used to clone entire, damaged
hard drives. You can "clone to an image file" or "clone to the
physical disk". If cloning to physical disks, the destination
has to be the same size, or be a larger disk, so the end of
the source disk is not chopped off by accident.

http://www.cgsecurity.org/wiki/Damaged_Hard_Disk

sudo apt install gddrescue # Get the package

which ddrescue # Check it is present

# first, grab most of the error-free areas in a hurry:
# Make sure ~ (your home directory), has enough storage space for logging.
# The "rescued.log" keeps track of which sectors are read and copied OK.

sudo ddrescue -n /dev/sda /dev/sdb ~/rescued.log

# then try to recover as much of the dicey areas as possible:

sudo ddrescue -r 1 /dev/sda /dev/sdb ~/rescued.log

The /dev/sda is the old disk.

The /dev/sdb is the new disk (same size or larger).

You can use

sudo gparted

and use that to view some information about the disks,
and get some idea whether yours is sda or sdb or whatever.
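A lighter-weight alternative for that identification step is lsblk, which is standard on modern distros; disk sizes and model strings are the easiest tell:

```shell
# List whole disks with size and model; the old 40GB Maxtor and the
# new, larger destination disk are easy to tell apart by SIZE/MODEL.
lsblk -d -o NAME,SIZE,MODEL,TYPE
```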

I'd provide a link to a Windows one, but I'm not convinced
one exists. The dude who wrote this one knows exactly how
to port (or combine) the source files, and he would
be the perfect person to write one for us. But
his program is not equipped to keep track of what was done,
or to do retries like ddrescue does. At least he
has figured out the namespace for low-level
storage access (whatever it is).

http://www.chrysocome.net/dd

The chrysocome program only has one bug I know of. If
cloning a USB stick to something else, the program does
not reliably detect "the end" of the device. It's advised
to use a blocksize parameter and count, to tell the program
exactly when to stop copying. If your source USB is 1GB,
then you set a blocksize and count that transfers that
exact amount.
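The byte-by-byte utility asked about at the top of the thread, one that
copies until a read error, blanks the unreadable span, and carries on so
the copy keeps its original size and tail, can be sketched in Python.
This is a hypothetical example, not an existing tool; the chunk size and
error handling are simplified:

```python
# Sketch of a per-file rescue copy: read the source in chunks,
# and where a read fails, write zeros of the same length so the
# copy keeps the original size (and its tail, e.g. a .zip directory).
import os

CHUNK = 4096  # small chunks localize the damage

def rescue_copy(src_path, dst_path):
    size = os.path.getsize(src_path)
    bad = 0
    with open(src_path, "rb", buffering=0) as src, open(dst_path, "wb") as dst:
        offset = 0
        while offset < size:
            want = min(CHUNK, size - offset)
            try:
                src.seek(offset)
                data = src.read(want)
                if not data:              # unexpected EOF: pad the rest
                    data = b"\x00" * want
                    bad += want
            except OSError:               # unreadable region: substitute zeros
                data = b"\x00" * want
                bad += want
            dst.write(data)
            offset += len(data)
    return bad  # bytes that had to be blanked

# bad = rescue_copy("damaged.pdf", "rescued.pdf")
```

The returned count gives a rough idea how much of the file is junk;
whether the result is usable depends on the format, as noted above.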

*******

The ddrescue program also has the ability to copy using a dynamic
block size. In essence, it knows it has to transfer the
whole disk, and it will vary the size of the read command
until it's snagged the whole source disk. That's unlike
how the chrysocome (Windows) one works.

This is an example from my notes. This is a backup to
an image file, rather than a low-level disk-to-disk clone.
The "sdb.raw" file, stored somewhere, will be the same
size as the source disk. If the sdb.raw file has any
512 byte blocks filled with zeros, they will take no
storage space on the destination file system. That's
the "sparse" notation, the "-S". But such a technique is
reserved for situations where the disk owner sweeps
the white space on the drive with sdelete or similar,
causing blocks full of zeros to appear in any area where
no files are stored. If you fill the white space with
zeros, it allows archival copies made this way to take
less space. "Sparse" is a feature of the NTFS file system,
and there's even a utility for interacting with sparse files.
The reason this is in my notes file is that I actually ran this.

ddrescue -S -b8M /dev/sdb /mount/external/backup/sdb.raw /mount/external/backup/sdb.log

The -b8M says "you can use block transfers as large as 8MB
if you feel like it". The actual program will not issue
a size that big, as the hard drive will tell it to get
lost if it tries. Disks have a limit on the size of
individual commands. And ddrescue has a way to "probe"
the supported size. In a way, my using that value is
a way of saying "unlimited... use the biggest one that
works" :-) Since my disk was not damaged, the contents
of the sdb.log file later were rather small.
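The sparse behaviour described above is easy to demonstrate on a file
system that supports holes (ext4, NTFS). This small Python sketch writes
only the last 4 bytes of a nominal 100 MB file; the sizes are arbitrary,
and st_blocks is a Unix-only field:

```python
# Create a 100 MB file that is mostly a "hole": only the last few
# bytes are actually written, so the apparent size (st_size) is large
# while the allocated size (st_blocks * 512) stays tiny.
import os

path = "sparse_demo.img"
with open(path, "wb") as f:
    f.seek(100 * 1024 * 1024 - 4)  # skip ~100 MB without writing anything
    f.write(b"end!")               # the only real data in the file

st = os.stat(path)
apparent = st.st_size
allocated = st.st_blocks * 512     # blocks actually backed by storage
print(f"apparent {apparent} bytes, allocated {allocated} bytes")
```

Copying such a file with a sparse-aware tool (like ddrescue -S)
preserves the holes; a naive copy expands them to real zeros on disk.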

Paul
  #22  
Old October 7th 17, 11:30 PM posted to alt.windows7.general
No_Name
external usenet poster
 
Posts: 30
Default Once again, Google proves it's bought out.

On Sat, 7 Oct 2017 22:02:40 +0100, "J. P. Gilliver (John)"
wrote:

Anonymous: how are you remembering which files have been saved
successfully (either now, or because they were backed up previously)?
I'd be tempted to delete them (and any folders that are then empty) from


I wrote down all root folders, compared them to my backup, and marked on
the paper which ones I don't have to mess with, because they are backed up.

There are two main folders that need to be saved. One just has a series
of recent folders to save. I wrote down which ones I have to save, then
tried to copy them. If one copied cleanly, I marked "OK"; if only a partial
copy, I wrote "Part". A few copied nothing; on those I wrote "Bad".

the flaky drive, so that only the ones still to be rescued are still
visible; however, that would involve writing to the flaky drive
(deleting is just modifying folder data), which is generally a Bad Idea.
[It's what I did when I had a flaky drive, though - well, I did move
rather than copy, which of course does a copy then a delete if the copy
was successful.]


By accident I wrote to that bad partition twice now. It never fails:
when I copy stuff, I accidentally hit the wrong key and it makes a file
in the same folder called "Copy of filename.xxx". I just deleted them
with no effect. I thought about deleting all the stuff that is saved,
but I don't think that is a good idea. I did delete everything from the
other two good partitions on the drive though. Then I formatted them
too. I was thinking of seeing if I could copy the bad partition to a
good one, but I think it's better to copy to another HDD entirely.


For the hard-to-read files, I'd be tempted to seek - or write -
something which opens them, then copies them byte by byte (to a file on
a good drive) until a read error occurs; that way you'd have at least
part of the file, which may or may not be usable. (Ideally, something
which then carries on after the bad patch, maybe writing blanks to the
copy for the unreadable bytes - so that the copy at least is the same
size and has the tail, which for some filetypes - I think .zip is one -
is where important information about the contents is.)


Most of the large files that I can't save are .PDF manuals and
electronic schematics. Part of a PDF is worthless. On the other hand,
part of an MP4 will play and part of a JPG can usually be viewed.


Paul (or anyone else) - do you know of any such utility? [Preferably not
involving command lines, either in Windows or Linux (-:!] I did (a long
time ago - I think in BBC BASIC!) write one that did the first part
(copy byte by byte until error), but not beyond. [IIRR, BBC BASIC closed
the input file when there was an error reading.]
--


Yeah, I wish there was something easier to use. I read a bunch of
websites and it appears there are several different versions of ddrescue
and gddrescue. I can't even find out where to download it and know it's
the right one. Of course when it comes to Linux, it seems there are
always too many versions and everyone claims theirs is the best. (One
reason I don't like Linux.)

Some sites show gddrescue as a graphical front end to ddrescue, but
others claim the "g" only means "GNU". A URL for the place to download
the good one would be appreciated, and hopefully the one with the GUI
front end.

J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf


In all honesty, I don't really think this drive is dying. Yes, it has a
bad sector, and unfortunately it appears that bad sector is in the FAT
tables. I will replace the drive though, but I don't think it's dying by
the minute. I really think Scandisk caused this mess. A power outage
caused the computer to shut off while I was using it, and many of my
partitions required running Scandisk because they were showing the wrong
size. But that happens fairly often and has never been a problem. I
should mention that I defrag at least twice a week, so the data should
not be fragmented. If somehow I can get the FAT tables repaired, I may
have a usable partition again.

I'm still tempted to let Norton Disk Doctor do what it's suggesting,
which is to repair the boot record and copy the backup version of the
FAT, but before I do that, I want to save as much data as I can, and
hopefully clone the partition too. It's now obvious that all my data is
there, but the FAT is not showing or accessing it properly, thus
the whole problem is in the FAT, not the data itself.

I may take the drive to a local computer repair shop and see if they
have some software to fix it, or should I get brave and let Norton Disk
Doctor do its thing??? This sure is a gamble!!!!


  #23  
Old October 7th 17, 11:35 PM posted to alt.windows7.general
No_Name
external usenet poster
 
Posts: 30
Default Once again, Google proves it's bought out.

On Sat, 7 Oct 2017 22:02:40 +0100, "J. P. Gilliver (John)"
wrote:

I don't think shortage of space for rescued files is anonymous's main
problem at the moment.
[]


Disk space is not a problem. I have two 120 GB drives and each one is
about half full. I copy all my movies, videos, and music to external
plug-in drives. The only reason my G: partition WAS getting full is
because I have, in PDF format, nearly every electronics magazine from the
1940s through 2000. That was using a lot of space, so I moved it to my I:
partition (and was planning to move it to my portable drive soon). I'm
really happy I moved that to I:, or I would have lost all of that too.

  #24  
Old October 8th 17, 01:03 AM posted to alt.windows7.general
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default Once again, Google proves it's bought out.

wrote:

A power outage caused the computer to shut off while I was using it


FAT32 is not protected against power outages.

NTFS has a metadata journal (the $LogFile; the USN change
journal is a separate feature) that allows replay during
boot-time repair, and file fragments can be tossed at
that point in time (your open Word document). The result
on NTFS is that the rest of the file system is safe, and
relatively bulletproof.

Some of Microsoft's later OSes also do background checks
on NTFS, which reduces the odds of latent faults
being stored permanently.

But FAT32 has no real protection as such. If you catch
it in the middle of updating the FAT, what's it going
to do ? There's no journal to help it later... FAT32
plus power failure = accident_waiting_to_happen.
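A toy write-ahead log makes the difference concrete: the intended change
is forced to disk before the data itself is touched, so recovery can
replay anything half-done. This is a conceptual Python sketch only; the
file names, record format, and function names are invented, and real
NTFS logging is far more elaborate:

```python
# Toy write-ahead journal: record the intended update, flush it,
# then apply it. After a crash, replaying the journal re-applies
# any update that was logged but possibly not yet applied.
import json
import os

JOURNAL = "journal.log"

def journaled_set(table_path, key, value):
    with open(JOURNAL, "a") as j:            # 1. log the intent...
        j.write(json.dumps({"k": key, "v": value}) + "\n")
        j.flush()
        os.fsync(j.fileno())                 # ...and force it to disk
    table = json.load(open(table_path)) if os.path.exists(table_path) else {}
    table[key] = value                       # 2. only now touch the data
    with open(table_path, "w") as f:
        json.dump(table, f)

def replay(table_path):
    """Crash recovery: re-apply every logged update (idempotent)."""
    if not os.path.exists(JOURNAL):
        return
    table = json.load(open(table_path)) if os.path.exists(table_path) else {}
    for line in open(JOURNAL):
        rec = json.loads(line)
        table[rec["k"]] = rec["v"]
    with open(table_path, "w") as f:
        json.dump(table, f)
```

FAT32 has no equivalent of step 1, so a power cut in the middle of
step 2 leaves no record from which a repair tool can recover.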

Have you ever considered adding a UPS to the computer room ?
Mine doesn't stay up very long, but I get a chance to do a
shutdown, before the battery flakes out on it.

The machine with the Win10 on it, you never know how long
it's going to take to finish shutdown, and I've had a close
shave or two with it (the battery lasted long enough for
it to finish spinning the Juggler Balls at me). I was very
tempted to just hit the power button on the Win10 machine
instead.

Paul
  #25  
Old October 8th 17, 09:17 AM posted to alt.windows7.general
J. P. Gilliver (John)[_4_]
external usenet poster
 
Posts: 2,679
Default Once again, Google proves it's bought out.

In message ,
writes:
On Sat, 7 Oct 2017 22:02:40 +0100, "J. P. Gilliver (John)"
wrote:

Anonymous: how are you remembering which files have been saved
successfully (either now, or because they were backed up previously)?
I'd be tempted to delete them (and any folders that are then empty) from


I wrote down all root folders, compared them to my backup, and marked on
the paper which ones I don't have to mess with, because they are backed up.

There are two main folders that need to be saved. One just has a series
of recent folders to save. I wrote down which ones I have to save, then
tried to copy them. If one copied cleanly, I marked "OK"; if only a partial
copy, I wrote "Part". A few copied nothing; on those I wrote "Bad".


Ah, if it's a small enough number to do that, then that sounds good.
[]
By accident I wrote to that bad partition twice now. It never fails:
when I copy stuff, I accidentally hit the wrong key and it makes a file
in the same folder called "Copy of filename.xxx". I just deleted them
with no effect. I thought about deleting all the stuff that is saved,


(So you've actually written to it more times - a deletion is a write, as
it rewrites a folder's information.)

Personally, from the way you've been describing how things are going, I
_would_ be tempted (once you've made _sure_ the copies are good) to
delete all that you've rescued successfully (and any then-empty
folders), so that it's immediately obvious what still needs to be
recovered; however, Paul and others would probably advise against it.

Any remaining ones you think might read OK, again I'd be tempted to
"move" rather than "copy", for the same reason (and others will say -
correctly - that's adding to the danger). But you've probably got all
the ones that are going to come easily by now - is that correct?

but I don't think that is a good idea. I did delete everything from the
other two good partitions on the drive though. Then I formatted them
too. I was thinking of seeing if I could copy the bad partition to a
good one, but I think it's better to copy to another HDD entirely.

[Oliver Hardy] It certainly is. [/Oliver]
[]
Most of the large files that I can't save are .PDF manuals and
electronic schematics. Part of a PDF is worthless. On the other hand,
part of an MP4 will play and part of a JPG can usually be viewed.

Are there any "repair broken PDF" utilities out there? (I vaguely
remember ones that repair bad .zip files, or try to. Some years since,
and they may or may not handle some modern .zip formats.)
[]
Yeah, I wish there was something easier to use. I read a bunch of
websites and it appears there are several different versions of ddrescue
and gddrescue. I can't even find out where to download it and know it's


I'm sure Paul has - I think even in this thread, certainly in others -
mentioned this.
[]
In all honesty, I don't really think this drive is dying. Yes, it has a
bad sector, and unfortunately it appears that bad sector is in the FAT


That does _sound_ to be the case.

tables. I will replace the drive though, but I don't think it's dying by
the minute. I really think Scandisk caused this mess. A power outage


Scandisk _can_ certainly make things worse. It was really (I think)
intended to do two things: clean up after a bad write, sorting out
pointers and that sort of thing, and scan for (and if possible reroute
around) bad sectors on really old drives, of the sort that modern
drives (even your 120 GB ones) conceal with on-board firmware
anyway. I think there's a switch (OK, the dreaded command line,
but a relatively simple use thereof) that tells it to scan
but just report, not change anything, and that's probably useful.

caused the computer to shut off while I was using it, and many of my
partitions required running Scandisk because they were showing the wrong
size. But that happens fairly often and has never been a problem. I


Hmm, I'm a bit concerned that it happens frequently - especially if it
affects partitions other than the one you're using at the time.

should mention that I defrag at least twice a week, and the data should
not be fragmented. If somehow I can get the FAT tables repaired, I may
have a usable partition again.


Yes, getting at the backup FAT (which you've asked about, but I don't
think anyone has answered - sorry, I can't) does sound tempting. Though
even if you get the partition back in that way, I'd at least: delete
(after copying) everything on it, then run scandisk again (with a switch
to lower its tolerance threshold, if there is one), to make _sure_ the
dodgy sector(s) is/are definitely avoided. Most would say don't use at
least that partition - probably disc - any more, except perhaps for
temporary storage (e. g. like a USB stick, for transferring between
computers).

I'm still tempted to let Norton Disk Doctor do what it's suggesting,
which is to repair the boot record and copy the backup version of the
FAT, but before I do that, I want to save as much data as I can, and
hopefully clone the partition too. It's now obvious that all my data is
there, but the FAT record is not showing or accessing it properly, thus
the whole problem is in the FAT, not the data itself.


Why do you say that? I tend to suspect you're right, but I'm curious as
to why you think the data is there intact.

I may take the drive to a local computer repair shop and see if they
have some software to fix it, or should I get brave and let Norton Disk
Doctor do its thing??? This sure is a gamble!!!!

Indeed. What _are_ these - mostly schematics in .pdf form, from what
you've said - that are so valuable/irreplaceable? (And why are they
almost impossible to replace: where have you been getting them from?)

--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Never rely on somebody else for your happiness.
- Bette Davis, quoted by Celia Imrie, RT 2014/3/12-18
  #26  
Old October 8th 17, 09:27 AM posted to alt.windows7.general
J. P. Gilliver (John)[_4_]
external usenet poster
 
Posts: 2,679
Default Once again, Google proves it's bought out. [now NTFS journal file]

In message , Paul
writes:
wrote:

A power outage caused the computer to shut off while I was using it


FAT32 is not protected against power outages.

NTFS has a USN Journal, that allows playback during

[US Navy (-:?]
boot-time repair, and file fragments can be tossed at
that point in time (your open Word document). The result
on NTFS is, the rest of the file system is safe, and
relatively bulletproof.


I suppose I must concede to this, in that this machine (with the NTFS it
came with) has rarely given me any trouble (apart from when the discs
stopped going round! And even then I was able to rescue ~95% when I'd
unstuck it). But I really would like an explanation - perhaps a
Paul-style one - of _what_ this mysterious "journal" is about. So often,
I see "NTFS is better because it has a journal file" or something like
that, with no further explanation.
[]
Have you ever considered adding a UPS to the computer room ?
Mine doesn't stay up very long, but I get a chance to do a
shutdown, before the battery flakes out on it.


One of the reasons I mostly use laptops/netbooks now; they are in effect
a poor man's UPS, in that the battery - even if in poor condition -
usually has at least enough go to allow a controlled shutdown, often
even allowing the completion of what you were doing first, if it isn't
too complex. (At least, a save of a document you were editing or
whatever.)

The machine with the Win10 on it, you never know how long
it's going to take to finish shutdown, and I've had a close
shave or two with it (the battery lasted long enough for
it to finish spinning the Juggler Balls at me). I was very
tempted to just hit the power button on the Win10 machine
instead.


(Doesn't 10 hibernate rather than shut down anyway, by default, when you
click through shut down or hit the power button?)

Paul


--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Never rely on somebody else for your happiness.
- Bette Davis, quoted by Celia Imrie, RT 2014/3/12-18
  #27  
Old October 8th 17, 09:35 AM posted to alt.windows7.general
J. P. Gilliver (John)[_4_]
external usenet poster
 
Posts: 2,679
Default Once again, Google proves it's bought out.

In message ,
writes:
On Sat, 7 Oct 2017 22:02:40 +0100, "J. P. Gilliver (John)"
wrote:

I don't think shortage of space for rescued files is anonymous's main
problem at the moment.
[


Disk space is not a problem. I have two 120 GB drives and each one is
about half full. I copy all my movies, videos, and music to external
plug-in drives. The only reason my G: partition WAS getting full is
because I have, in PDF format, nearly every electronics magazine from the
1940s through 2000. That was using a lot of space, so I moved it to my I:
partition (and was planning to move it to my portable drive soon). I'm
really happy I moved that to I:, or I would have lost all of that too.

Wow, that's a _lot_ of data. (Especially if the .pdfs are - as I suspect
- scanned images rather than text-except-where-there-are-images.)

I'd have said that _was_ a candidate for dumping to CD/DVD (maybe by
year or decade, depending on how much data); it's not going to change.
(I think you may have said you don't have a burner, but even a portable
slimline one [for use with the XP machine via USB] shouldn't cost more
than $15. Though finding an EIDE one for internal fitting to the '98
machine might be more of a challenge, as might finding burner software
that works with both '98 and the drive if you do find one.)
--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Never rely on somebody else for your happiness.
- Bette Davis, quoted by Celia Imrie, RT 2014/3/12-18
  #28  
Old October 8th 17, 04:06 PM posted to alt.windows7.general
Paul[_32_]
external usenet poster
 
Posts: 11,873
Default Once again, Google proves it's bought out.

J. P. Gilliver (John) wrote:
In message ,
writes:


I may take the drive to a local computer repair shop and see if they
have some software to fix it, or should I get brave and let Norton Disk
Doctor do its thing??? This sure is a gamble!!!!

Indeed. What _are_ these - mostly schematics in .pdf form, from what
you've said - that are so valuable/irreplaceable? (And why are they
almost impossible to replace: where have you been getting them from?)


In-place repair always has an element of danger.

Whether it's scandisk or CHKDSK, or whatever.

You should always make a copy of valuable materials
and work on the copy, *not* the known-damaged item.

I prefer a sector-by-sector copy, one where a log is
produced showing what sectors did not get copied. The
flavor of operation done, may be affected by the available
space you've got to work with. The first priority is
making the copy, just in case the disk is about to
mechanically fail on you.

Similarly, deleting files from the source, or using "move"
(which deletes after copying), is not advised. The file
system may attempt to modify some part of the damaged
FAT, on top of that bad sector.
You can do anything you want to the source disk
as long as:

1) The operation has no side effects on the source partition.

2) The commands you're executing when reading the disk,
don't throw the heads around too vigorously. That's
where the sector-by-sector copy comes in, as it smoothly
moves over the disk surface while working.
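Schematically, that kind of copy is a single forward pass with a log of
failures. A hypothetical Python sketch follows; the sector size and
zero-fill placeholder are assumptions, the source is opened read-only so
it cannot be modified, and a real tool like ddrescue does this with
retries, rate control, and a proper mapfile:

```python
# Schematic sequential sector copy: one forward pass, fixed-size
# reads, and a log of the sectors that failed. No writes ever go
# to the source, and the heads move smoothly from start to end.
SECTOR = 512

def clone_with_log(src, dst, log, total_sectors):
    with open(src, "rb", buffering=0) as s, \
         open(dst, "wb") as d, open(log, "w") as lg:
        for n in range(total_sectors):
            s.seek(n * SECTOR)
            try:
                block = s.read(SECTOR)
                if len(block) < SECTOR:
                    raise OSError("short read")
            except OSError:
                block = b"\x00" * SECTOR   # placeholder for the bad sector
                lg.write(f"{n}\n")         # record it for later retry passes
            d.write(block)
```

The log is the equivalent of ddrescue's mapfile: a later pass can
revisit only the listed sectors instead of re-reading the whole disk.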

Paul

  #30  
Old October 8th 17, 08:24 PM posted to alt.windows7.general
No_Name
external usenet poster
 
Posts: 30
Default Once again, Google proves it's bought out.

On Sun, 08 Oct 2017 12:25:37 -0700, Ken Blake
wrote:

On Sun, 08 Oct 2017 13:26:19 -0500, Char Jackson
wrote:

On Sat, 07 Oct 2017 13:11:18 -0400, wrote:

Years ago, I installed a USB2 card in this computer. It lasted several
years, then it just quit working. Two years ago, I bought another one.
That one lasted about one year, then that one quit working. I have just
given up on them since. USB 2 is a lot faster than 1.1. In fact I
don't see any reason to go to USB 3. I think USB 2 is fine.


I agree that USB2 is fine...for things like a keyboard or a mouse. I
certainly wouldn't want to use it to connect an external drive,
including a thumb drive.




I use USB2 to connect external drives all the time, since I don't have
any USB3 ports. But since I use external drives only for backup, and I
do other things (sometimes going to sleep) while the backup is
running, I really don't care how slow it is.


USB 2 can copy a rather large partition of mine in about a half hour.
That's acceptable to me. USB 1.1 takes 16 or more hours for the same
drive. That's not acceptable, but when it's all I have, I use it to
back up, trying to do it while I am sleeping; since that extends into
the next day, I try to do it when I know I won't need the computer the
next day.
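Those two timings line up with the buses' real-world throughput. A
back-of-envelope check follows; the 60 GB size (roughly a half-full
120 GB partition) and the MB/s rates are assumptions, typical sustained
speeds rather than the advertised signalling rates:

```python
# Back-of-envelope transfer times for ~60 GB of data at assumed
# real-world throughput for each bus generation.
data_gb = 60
rates_mb_s = {"USB 1.1": 1.0, "USB 2.0": 35.0, "USB 3.0": 400.0}

for bus, rate in rates_mb_s.items():
    seconds = data_gb * 1024 / rate  # GB -> MB, then divide by MB/s
    print(f"{bus}: {seconds / 3600:.1f} h")
```

At ~1 MB/s, 60 GB takes about 17 hours ("16 or more hours"); at
~35 MB/s it takes about half an hour, matching the figures above.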

I have never had any USB 3, so I don't know how long that would take,
but at the speeds they claim for USB 3, I have to question whether the
copy is safe and reliable.

It's just like driving. Driving 200 miles at 10mph is going to take very
many hours. It can be done, but it sure is slow. Driving at 60mph is
pretty safe and gets the trip done. Although driving at 140mph will get
you to your destination fast, it's dangerous and risky.

Given the choice, I'd choose the 60mph.

 



