Linux Backup script - Questions

Thread: Linux Backup script

  1. Linux Backup script

    Hi, I am using Linux to back up files.

    I want a backup system that produces many 640 MB files from a huge dir.
    Then each file could be burnt to a CD.

    Manually splitting files is a pain.

    Also, if one of the files dies, then I need to be able to get the rest.
    If I use the split command, I would need to append all the tar parts into
    one file just to extract one file.

    The ideal solution is to have a script that compresses a dir tree into
    foobar_001.tar.gz files, each less than 640 MB. Are there any
    scripts/progs that do this?


  2. Re: Linux Backup script

    On Thu, 11 Sep 2003 08:39:47 +1000, Ideasman wrote:
    > Hi, I am using Linux to back up files.
    >
    > I want a backup system that produces many 640 MB files from a huge dir.
    > Then each file could be burnt to a CD.
    >
    > Manually splitting files is a pain.
    >
    > Also, if one of the files dies, then I need to be able to get the rest.
    > If I use the split command, I would need to append all the tar parts into
    > one file just to extract one file.
    >

    [...]

    Perhaps something like below would meet your needs:

    #to archive the /home tree with gzip compression into parts
    #labeled like TAR-PART-aa TAR-PART-ab ... in directory /huge
    #showing file names to indicate progress

    tar cvvzf - /home | split --bytes=640000000 - /huge/TAR-PART-

    #to restore

    cd / ; cat /huge/TAR-PART-* | tar xvvzf -

    Personally I think the PITA part will be putting them on CDs.
    If you don't use a packet writing CD-R format, you'll have
    to do a mkisofs for each part before you burn it. Yet another
    step where you have to let the machine chunk through each byte
    in the archive.
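
    For example, a rough loop along these lines would do it (the burner
    address 0,0,0 is just a placeholder; check yours with cdrecord -scanbus):

    # wrap each part in its own iso9660 image and burn it
    for part in /huge/TAR-PART-*
    do
        mkisofs -r -o /huge/part.iso "$part"
        cdrecord -v dev=0,0,0 speed=8 /huge/part.iso
        rm /huge/part.iso
    done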

    Possibly you can burn the part files without an iso9660 format,
    but reading it becomes a PITA because just reading the cdrom
    device file will get an error when you reach the end.
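
    If you noted each part's exact size in bytes when you made it, something
    like this might read a raw-burned part back off the disc:

    # read whole 2048-byte CD sectors, then trim the padding off the end
    size=640000000        # the real byte size of this particular part
    dd if=/dev/cdrom bs=2048 count=$(( (size + 2047) / 2048 )) | head -c "$size" > TAR-PART-aa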


    Maybe you could use a packet writing CD-R system, and use
    tar's --multi-volume switch to split it up that way as it
    gets written. I don't have packet writing installed in my
    kernel, so I do not know if it'd work or not. Sure would
    be nice to have a --multi-volume format instead of a huge
    .tgz file split up.
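
    For the record, the multi-volume form would look roughly like this; note
    that GNU tar refuses to combine --multi-volume with compression, so the
    parts come out as plain tar:

    # fill part1.tar, then part2.tar, and so on
    # (--tape-length counts in units of 1024 bytes, so 625000 is about 640 MB)
    tar cvM --tape-length=625000 \
        -f /huge/part1.tar -f /huge/part2.tar -f /huge/part3.tar /home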




  3. Re: Linux Backup script

    If one of the tar files dies, then won't the rest of them be useless?

    Also, what if I want to extract just one file? Would I have to join a number
    of split files that may add up to 40 GB?

    Thanks anyway


    Creideiki wrote:
    > On Thu, 11 Sep 2003 08:39:47 +1000, Ideasman wrote:
    >
    >>Hi, I am using Linux to back up files.
    >>
    >>I want a backup system that produces many 640 MB files from a huge dir.
    >>Then each file could be burnt to a CD.
    >>
    >>Manually splitting files is a pain.
    >>
    >>Also, if one of the files dies, then I need to be able to get the rest.
    >>If I use the split command, I would need to append all the tar parts into
    >>one file just to extract one file.
    >>

    >
    > [...]
    >
    > Perhaps something like below would meet your needs:
    >
    > #to archive the /home tree with gzip compression into parts
    > #labeled like TAR-PART-aa TAR-PART-ab ... in directory /huge
    > #showing file names to indicate progress
    >
    > tar cvvzf - /home | split --bytes=640000000 - /huge/TAR-PART-
    >
    > #to restore
    >
    > cd / ; cat /huge/TAR-PART-* | tar xvvzf -
    >
    > Personally I think the PITA part will be putting them on CDs.
    > If you don't use a packet writing CD-R format, you'll have
    > to do a mkisofs for each part before you burn it. Yet another
    > step where you have to let the machine chunk through each byte
    > in the archive.
    >
    > Possibly you can burn the part files without an iso9660 format,
    > but reading it becomes a PITA because just reading the cdrom
    > device file will get an error when you reach the end.
    >
    >
    > Maybe you could use a packet writing CD-R system, and use
    > tar's --multi-volume switch to split it up that way as it
    > gets written. I don't have packet writing installed in my
    > kernel, so I do not know if it'd work or not. Sure would
    > be nice to have a --multi-volume format instead of a huge
    > .tgz file split up.
    >
    >
    >



  4. Re: Linux Backup script

    On Thu, 11 Sep 2003 23:07:53 +1000, Ideasman wrote:
    > If one of the tar files dies, then won't the rest of them be useless?
    >
    > Also, what if I want to extract just one file? Would I have to join a number
    > of split files that may add up to 40 GB?
    >
    > Thanks anyway
    >


    Well, that is the drawback of your original concept of splitting
    a tar file. But at least by using pipes, you wouldn't have to join
    all the parts into one file *before* extracting it. You would still
    have to let all of the preceding parts get piped through tar until you
    reached the file, though. You would basically have to do the same with
    a straight multi-volume tar, too.
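
    For example, to pull a single file back out without ever joining the
    parts on disk (the path is just an illustration):

    # the parts still stream through tar, but only the named file
    # gets written out
    cat /huge/TAR-PART-* | tar xvvzf - home/someuser/somefile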

    Your concept of optimally sized tgz files is nice, but I'm not sure you
    could do it in one pass. You'd probably have to tar files one at a time
    until the archive exceeded the CD-R limit, then redo it as a valid tar
    file containing every file up to the point where the size was still OK,
    then start the next tar file at the point you left off.
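
    One way to approximate it without the redo step is to total up file
    sizes first and cut the list there. Here is a rough sketch with
    placeholder names; it counts uncompressed sizes via wc -c, so the
    gzipped parts come out under the limit, and a single file larger than
    the limit still lands in one oversized part:

    limit=640000000
    part=1
    total=0
    list=/tmp/backup_1.list
    : > "$list"
    find /home -type f -print | {
        while IFS= read -r file
        do
            size=`wc -c < "$file"`
            if [ $total -gt 0 ] && [ $(( total + size )) -gt $limit ]; then
                # this list is full; archive it and start the next one
                tar czf /huge/foobar_`printf %03d $part`.tar.gz --files-from="$list"
                part=$(( part + 1 ))
                total=0
                list=/tmp/backup_$part.list
                : > "$list"
            fi
            echo "$file" >> "$list"
            total=$(( total + size ))
        done
        # archive whatever is left on the final list
        tar czf /huge/foobar_`printf %03d $part`.tar.gz --files-from="$list"
    }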

    I'm thinking perhaps you could use a named pipe (as in mkfifo) with the
    --files-from option of tar. That option lets you specify a list of files
    to archive in a file. A named pipe might let that list be fed in and
    controlled by another program.

    What I'm thinking of would require a list of all the files to be archived.
    That could be made manually, or via find, or by saving the stderr
    output of a "tar cvf /dev/null ..." command.
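
    The bare plumbing for that, without the size counting, might look like
    this (the file names are placeholders; whether tar starts writing before
    it has seen the whole list is exactly the buffering question below):

    # tar reads its file list from a named pipe while another process feeds it
    mkfifo /tmp/tarlist
    tar czf /huge/part1.tar.gz --files-from=/tmp/tarlist &
    exec 3> /tmp/tarlist          # hold the pipe open while handing names over
    echo /home/someuser/file1 >&3
    echo /home/someuser/file2 >&3
    exec 3>&-                     # closing the pipe tells tar the list is done
    wait
    rm /tmp/tarlist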

    Then a script (I'd use perl to do it) would start two copies of
    tar. Each tar process would be set up with its own named pipe
    given as input to the --files-from option. The script would then
    hand one name to the first tar process, which would process it and
    send the data to its output. That output would be counted to
    determine whether it would exceed the desired part size. If so, the
    script would tell the second tar to add that file instead.

    Buffering issues might make this unworkable, such as tar doing no
    work until it has read an end-of-file on the --files-from file.

    At least something like the above should make it possible to build
    an automated script.

    fun fun fun

    Well, there does seem to be a "dar" program designed specifically
    for archiving to disks, instead of tapes like tar. Whether that
    keeps you from having to load N CD-Rs sequentially to get at
    a file in the middle of an archive, I don't know.

    http://sourceforge.net/projects/dar/
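
    If I'm reading its docs right, dar does the slicing itself, roughly
    along these lines (untested; sizes and paths are placeholders):

    # create ~640 MB gzip-compressed slices backup.1.dar, backup.2.dar, ...
    dar -c /huge/backup -s 640M -z -R /home

    # restore the whole tree into /restore
    dar -x /huge/backup -R /restore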


    /creideiki/



    >
    > Creideiki wrote:
    >> On Thu, 11 Sep 2003 08:39:47 +1000, Ideasman wrote:
    >>
    >>>Hi, I am using Linux to back up files.
    >>>
    >>>I want a backup system that produces many 640 MB files from a huge dir.
    >>>Then each file could be burnt to a CD.
    >>>
    >>>Manually splitting files is a pain.
    >>>
    >>>Also, if one of the files dies, then I need to be able to get the rest.
    >>>If I use the split command, I would need to append all the tar parts into
    >>>one file just to extract one file.
    >>>

    >>
    >> [...]
    >>
    >> Perhaps something like below would meet your needs:
    >>
    >> #to archive the /home tree with gzip compression into parts
    >> #labeled like TAR-PART-aa TAR-PART-ab ... in directory /huge
    >> #showing file names to indicate progress
    >>
    >> tar cvvzf - /home | split --bytes=640000000 - /huge/TAR-PART-
    >>
    >> #to restore
    >>
    >> cd / ; cat /huge/TAR-PART-* | tar xvvzf -
    >>
    >> Personally I think the PITA part will be putting them on CDs.

    [...]
