backing up a website - Unix


Thread: backing up a website

  1. backing up a website

    I'm trying to back up my website using wget.

    wget -mirror -w 3 -p -P $HOME/MyWebBak ftp://username:password@ftp.mydomain.com

    This produces $HOME/MyWebBak/index.html very quickly - too quickly.

    When I point my browser at $HOME/MyWebBak/index.html, it appears to be
    a mirror image of my website's filesystem, but the files have no
    content.

    Is wget the tool I should be using for a modest (<5 GB) website?

    If yes, what should my command line be?

    If no, what OSS software should I turn to (using OpenSUSE 10.2, KDE)?
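
    I've also seen lftp's mirror command suggested for this kind of
    job; a sketch of what I think the equivalent pull would look like
    (assuming lftp from the OpenSUSE repositories, with placeholder
    credentials):

    lftp -u username,password -e "mirror --verbose . $HOME/MyWebBak; quit" ftp.mydomain.com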


  2. Re: backing up a website

    On Nov 5, 11:19 am, droid wrote:
    > I'm trying to back up my website using wget.
    >
    > [....]
    >
    > ... what should my command line be?
    >
    > [....]
    >


    I've been experimenting with different options, and the following
    command line seems to give good results:

    wget --mirror -w3 -pr --output-file=wget.log --tries=5 --passive-ftp ftp://uname:passwd@ftp.mydomain.com
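
    Two side notes: the original post spells -mirror with a single
    dash, which wget parses as a bundle of short options rather than
    as --mirror, and that may be part of why the first run finished
    too quickly. Also, credentials embedded in the URL end up in
    shell history and ps output; assuming a reasonably recent wget
    (1.10 or later), --ftp-user and --ftp-password at least keep them
    out of the URL itself, and ~/.netrc would keep them off the
    command line entirely (uname/passwd are placeholders):

    wget --mirror -w3 -pr --output-file=wget.log --tries=5 --passive-ftp --ftp-user=uname --ftp-password=passwd ftp://ftp.mydomain.com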

    Because of the 3-second delay before each file [right?], it is
    slow: it takes nearly 2.5 hours to back up 27 MB, but that is not
    unacceptable. Or should I use "-w2"?
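
    Rough arithmetic, assuming the wait dominates: 2.5 hours is about
    9000 seconds, and at 3 seconds per file that is roughly 3000
    files, so nearly all of the time is the -w delay rather than the
    transfer of the 27 MB itself. With -w2 the same run should take
    about 6000 seconds, or an hour and forty minutes.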

    Anyway, do I have the correct options to give reliable backups of my
    website? Are any unnecessary?
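
    If I'm reading the wget manual right, --mirror is shorthand for
    "-r -N -l inf --no-remove-listing", so the r in -pr is redundant
    and the line can be trimmed to:

    wget --mirror -w3 -p --output-file=wget.log --tries=5 --passive-ftp ftp://uname:passwd@ftp.mydomain.com

    The -N timestamping that --mirror turns on should also make repeat
    runs faster, since files that have not changed are skipped.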

