No, I’m not suggesting that we use a defunct waste management company to handle our network backups. But I need to find a better way of imaging a 500GB disk.
Plan A: cat, gzip, and nfs:
Horde 4# cat /dev/sda1 | gzip > /net/dustpuppy/export/home/homelan/tmp/AlienwareVistaC.gz
Four hours later, it was barely 50GB into the 500GB disk.
So the Brute Force and Ignorance way of doing things doesn’t work that well. The next plan is to mount the partition read-only and use tar over ssh. I don’t know whether that would actually be any faster, but it’s probably better than catting through gzip to an nfs share.
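The Plan B above would look roughly like this — a sketch only, with the mount point and the exact remote path being assumptions (the real device and destination are the ones from Plan A). The demo below exercises the same tar pipeline on a throwaway directory instead of a real partition:

```shell
# Plan B sketch (do not run as-is; device and paths are illustrative):
#   mount -o ro /dev/sda1 /mnt/win
#   tar czf - -C /mnt/win . | ssh dustpuppy 'cat > /export/home/homelan/tmp/AlienwareVistaC.tgz'
#
# Safe demonstration of the same pipeline on a temp directory:
src=$(mktemp -d)
echo "hello" > "$src/file.txt"
# tar only what's actually in the filesystem, compressing as we go
tar czf - -C "$src" . > /tmp/demo.tgz
# verify the archive lists the file we put in
tar tzf /tmp/demo.tgz
rm -rf "$src"
```

Unlike catting the raw block device, tar only reads the files that exist, so a mostly empty 500GB partition streams far less data.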
UPDATE: The BFI way worked. The only difference was that I ran it in bash instead of csh, but that wasn’t what made it work. It turns out that only 80GB of that 500GB disk was used. Combined with the gzip compression, I managed to squeeze it down to about 65GB on disk. So I let it run overnight and this morning it was done.