backing up a website - Unix


Thread: backing up a website

  1. backing up a website

    I'm trying to back up my website using wget.

    wget -mirror -w 3 -p -P $HOME/MyWebBak ftp://username:password@ftp.mydomain.com

    This produces $HOME/MyWebBak/index.html very quickly - too quickly.

    When I point my browser to $HOME/MyWebBak/index.html, it appears to be
    a mirror image of my website filesystem, but the files have no
    content.

    Is wget the tool I should be using for a modest ( <5GB) website?

    If yes, what should my command line be?

    If no, what OSS software should I turn to (using OpenSUSE 10.2, KDE)?
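
    (Illustrative sketch only, not from the original post: a corrected
    invocation might look roughly like this, assuming GNU wget. The long
    option is --mirror with two dashes, and -p/--page-requisites only
    matters for HTTP, so it is dropped here; host, user, and target
    directory are the placeholders from the post.)

        # Mirror the whole FTP tree into $HOME/MyWebBak, waiting 3s between
        # fetches; credentials passed via options instead of the URL.
        wget --mirror -w 3 -P "$HOME/MyWebBak" \
             --ftp-user=username --ftp-password='password' \
             ftp://ftp.mydomain.com/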


  2. Re: backing up a website

    droid wrote:

    > Is wget the tool I should be using for a modest ( <5GB) website?


    SSH access ("scp -r ...." or "ssh ${remote} tar ...") would certainly
    be preferable, but I know that not all web service providers accommodate
    SSH access.

    If you're backing up by recursive ftp, I think you'll probably have
    better success with ncftp ("ncftpget", specifically).

    I hope that helps ...
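
    (Illustrative sketch only, not from the original post: the two
    approaches above might look roughly like this. The host names, account
    name, and "public_html" document root are made-up placeholders.)

        # Over SSH: stream a gzipped tar of the remote docroot to a local file.
        ssh user@www.example.com 'tar czf - public_html' > site-backup.tar.gz

        # Over FTP with ncftp: recursive get of the remote tree into MyWebBak.
        ncftpget -R -u username -p password ftp.example.com "$HOME/MyWebBak" /public_html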

    --
    ----------------------------------------------------------------------
    Sylvain Robitaille syl@alcor.concordia.ca

    Systems and Network analyst Concordia University
    Instructional & Information Technology Montreal, Quebec, Canada
    ----------------------------------------------------------------------

  3. Re: backing up a website

    On Nov 5, 12:29 pm, Sylvain Robitaille wrote:
    > droid wrote:
    > > Is wget the tool I should be using for a modest ( <5GB) website?

    >
    > SSH access ("scp -r ...." or "ssh ${remote} tar ...") would certainly
    > be preferable, but I know that not all web service providers accommodate
    > SSH access.
    >
    > If you're backing up by recursive ftp, I think you'll probably have
    > better success with ncftp ("ncftpget", specifically).
    >
    > I hope that helps ...
    >


    GoDaddy doesn't permit ssh on shared hosted accounts. Also, if I ftp
    in with Konqueror and try to copy a directory (I assume it uses mget),
    it stops with a "too many connections" error.

    I hadn't heard of ncftp; I'll have a look at it. Many sources advise
    wget, but I'm sure I'm not using it right.


  4. Re: backing up a website

    On Mon, 05 Nov 2007 17:45:13 -0000, droid wrote:
    >
    > GoDaddy doesn't permit ssh on shared hosted accounts. Also, if I ftp
    > in with Konqueror and try to copy a directory (I assume it uses mget),
    > it stops with a "too many connections" error.


    I had that problem with Dotster, so I switched over to DreamHost. If
    you want a discount, I can send you a referral code.

    > I hadn't heard of ncftp; I'll have a look at it. Many sources advise
    > wget, but I'm sure I'm not using it right.


    I didn't read the whole thread, but did you try wget -r?

    Dave
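
    (Illustrative sketch only: one way the recursive fetch might look over
    FTP, with the poster's placeholder host and credentials.)

        # -r recurses, -l inf lifts the default 5-level depth limit,
        # -N only re-fetches files newer than the local copy, -w 3 waits 3s.
        wget -r -l inf -N -w 3 -P "$HOME/MyWebBak" \
             ftp://username:password@ftp.mydomain.com/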


  5. Re: backing up a website

    In comp.unix.admin, Dave Hinz wrote:
    > On Mon, 05 Nov 2007 17:45:13 -0000, droid wrote:


    >> GoDaddy doesn't permit ssh on shared hosted accounts. Also,
    >> if I ftp in with Konqueror and try to copy a directory (I
    >> assume it uses mget), it stops with a "too many connections"
    >> error.


    > I had that problem with dotster, switched over to dreamhost. If you
    > want a discount I can send you a referral code.


    >> I hadn't heard of ncftp, I'll have a look at it. Many
    >> sources advise wget, but I'm sure I'm not using it right.


    > I didn't read the whole thread, but did you try wget -r?


    w3mir and lftp also have excellent HTTP mirroring capabilities.
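
    (Illustrative sketch only: an lftp mirror run with placeholder
    credentials, pulling the remote FTP tree rather than HTTP, since that
    is what the original poster has.)

        # mirror copies the remote tree into MyWebBak; --only-newer skips
        # files that haven't changed since the last run.
        lftp -u username,password ftp.mydomain.com \
             -e "mirror --only-newer / $HOME/MyWebBak; quit"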

    --
    Michael Heiming (X-PGP-Sig > GPG-Key ID: EDD27B94)
    mail: echo zvpunry@urvzvat.qr | perl -pe 'y/a-z/n-za-m/'
    #bofh excuse 432: Borg nanites have infested the server

  6. Re: backing up a website

    Dave Hinz wrote:
    >
    > I had that problem with Dotster, so I switched over to DreamHost. If
    > you want a discount, I can send you a referral code.
    >


    Please do. I checked them out at http://www.dreamhost.com/hosting.html
    and they sound awesome!

    >
    > I didn't read the whole thread, but did you try wget -r?
    >


    Yes, it seems to be working fine with -r. But everyone is advising
    against wget, recommending ncftp (or ncftpget), w3mir, lftp, and
    FTPsync instead.


    droid
    --
    Command, n.:
    Statement presented by a human and accepted by a computer in
    such a manner as to make the human feel as if he is in control.
