
[cobalt-users] Re: tar problems



At 3:40 PM -0500 12/3/01, Mike Fritsch is rumored to have typed:

> We have been working on a backup solution and have run into a problem
> tarring up /home/sites. It seems tar can only handle about 2 gig of data.
> Does anyone know of a command or way to tar up /home/sites into multiple 1.5
> gig archives?

   Er, I suppose something like:

#!/bin/sh
# One gzipped tar file per site, written to stdout and piped to gzip.
tar -cf - /home/sites/home/ | gzip > /home/backup/home.tar.gz
tar -cf - /home/sites/site1/ | gzip > /home/backup/site1.tar.gz
tar -cf - /home/sites/site2/ | gzip > /home/backup/site2.tar.gz
tar -cf - /home/sites/site3/ | gzip > /home/backup/site3.tar.gz
tar -cf - /home/sites/site4/ | gzip > /home/backup/site4.tar.gz
tar -cf - /home/sites/site5/ | gzip > /home/backup/site5.tar.gz
...

   ...ad nauseam, used as a cron job, wouldn't work? Add a new site, add a
new line to the script. (This isn't a _great_ solution, since it puts the
backups on the same filesystem, but it's a starting point for a better shell
script of your own...like the loop sketched below, or changing the umask so
no one but root can read the tar files...)
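
   For what it's worth, a rough sketch of that "better script" (untested;
it assumes each site lives in its own directory under /home/sites, that
/home/backup already exists, and that cron runs it as root):

#!/bin/sh
# Loop over the site directories so new sites get picked up without
# editing the script each time.
umask 077    # archives end up readable by root only
for dir in /home/sites/*/ ; do
    name=`basename "$dir"`
    tar -cf - "$dir" | gzip > "/home/backup/$name.tar.gz"
done

   Copying the finished tar files off to another machine afterwards (scp,
NFS mount, whatever you have) would take care of the same-filesystem
weakness, but I'll leave that part to you.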

         Charlie