wget guru needed - Unix



Thread: wget guru needed

  1. wget guru needed

    GoDaddy doesn't allow the use of ssh on shared hosted accounts, so I'm
    backing-up my small website using wget with the following options:

    "wget --recursive --wait=3 --tries=5 --mirror --output-file=wget.log
    --passive-ftp --page-requisites ftp://unameas...@ftp.mydomain.com"

    I have three questions:

    1. The three second delay between retrievals makes this very slow -
    2.5 hrs. to backup 17 MB! Could I reduce 'wait' to 1 or 2 seconds
    without being a jerk?

    2. Do I have the correct options to give reliable backups of my website?

    3. Are any of these options unnecessary?
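A note on question 3: the wget manual documents `--mirror` as shorthand for `-r -N -l inf --no-remove-listing`, so `--recursive` is redundant here, and `--page-requisites` mainly affects HTML crawling rather than a plain FTP tree. A trimmed sketch (host and credentials are placeholders, not the poster's real ones) might be:

```shell
# --mirror implies --recursive, so the latter can be dropped.
# --page-requisites matters for HTML page fetches, not FTP directory trees.
wget --mirror --wait=3 --tries=5 \
     --output-file=wget.log --passive-ftp \
     "ftp://user:password@ftp.example.com/"
```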


    droid
    --
    Anybody out there who knows how to connect a Wyse terminal keyboard to a
    PC _without_ the terminal in between? Current keyboard lses chracters
    evr so oftn............. Getng wrs al te im. Hp!

  2. Re: wget guru needed

    Jim Showalter writes:

    > GoDaddy doesn't allow the use of ssh on shared hosted accounts, so I'm
    > backing-up my small website using wget with the following options:
    >
    > "wget --recursive --wait=3 --tries=5 --mirror --output-file=wget.log
    > --passive-ftp --page-requisites ftp://unameas...@ftp.mydomain.com"
    >
    > I have three questions:
    >
    > 1. The three second delay between retrievals makes this very slow -
    > 2.5 hrs. to backup 17 MB! Could I reduce 'wait' to 1 or 2 seconds
    > without being a jerk?
    >
    > 2. Do I have the correct options to give reliable backups of my website?
    >
    > 3. Are any of these options unnecessary?



    The underlying problem with doing it this way is that you're using a
    command-line HTTP tool to access an FTP URL. wget spiders web pages
    by following links, so you might get some very interesting results
    depending on how the links in your files are constructed (absolute
    vs. relative), and it's not clear to me whether files that are
    invoked server-side by the web server would even be grabbed by your
    command above.

    I'd suggest abandoning wget for this task.

    If you would like commandline (and there are many good reasons to like it), look into rsync.

    mkdir mysitebackup
    cd mysitebackup
    rsync -P -v -r ftp.site.com::/ .


    Otherwise a proper GUI ftp client for whatever platform you are using
    would also be good. Obviously, command-line ftp isn't terribly
    feasible for such a thing, but there are a number of good GUI ftp
    clients out there that have an understanding of pulling down entire
    directory structures. For Windows, WinSCP is decent.
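    One command-line tool in that same spirit (not mentioned in this
    thread, so treat it as a side suggestion) is lftp, whose `mirror`
    command pulls a whole directory tree down over plain FTP. Host,
    user, and password below are placeholders:

```shell
# lftp's mirror command copies a remote tree into a local directory.
lftp -u user,password \
     -e "mirror --verbose / ./mysitebackup; quit" \
     ftp.example.com
```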



    --
    Todd H.
    http://www.toddh.net/

  3. Re: wget guru needed

    Todd H. wrote:
    >
    > [.....]
    >
    > If you would like commandline (and there are many good reasons to like it), look into rsync.
    >
    > mkdir mysitebackup
    > cd mysitebackup
    > rsync -P -v -r ftp.site.com::/ .
    >


    I can't get rsync to work. Using your example after substituting
    "ftp.site.com::/ ." with "ftp.myhostname.com::/ .", I get:

    ERROR: The remote path must start with a module name not a /
    rsync error: error starting client-server protocol (code 5) at
    main.c(1308) [receiver=2.6.8]

    This is with the rsync daemon running. I get a different error when it
    isn't. And I can't rsync from ssh - GoDaddy blocks it.

    I'm trying to understand the rsync man page and tried various other
    command lines but get various other errors.
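    For the record, the `::` syntax speaks the rsync daemon protocol,
    and the error above means the daemon expects an exported module name
    rather than a bare path. The shape would be something like the
    sketch below ("sitefiles" is a placeholder module name; a shared
    FTP-only host most likely exports no modules at all, which is
    consistent with the errors):

```shell
# Ask the remote daemon which modules it exports (if any):
rsync ftp.myhostname.com::

# Pull a named module; "sitefiles" is a placeholder module name.
rsync -P -v -r ftp.myhostname.com::sitefiles/ .
```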


    droid
    --
    Whom computers would destroy, they must first drive mad.

  4. Re: wget guru needed

    Jim Showalter writes:

    > Todd H. wrote:
    > >
    > > [.....]
    > >
    > > If you would like commandline (and there are many good reasons to like it), look into rsync.
    > >
    > > mkdir mysitebackup
    > > cd mysitebackup
    > > rsync -P -v -r ftp.site.com::/ .
    > >

    >
    > I can't get rsync to work. Using your example after substituting
    > "ftp.site.com::/ ." with "ftp.myhostname.com::/ .", I get:
    >
    > ERROR: The remote path must start with a module name not a /
    > rsync error: error starting client-server protocol (code 5) at
    > main.c(1308) [receiver=2.6.8]
    >
    > This is with the rsync daemon running. I get a different error when it
    > isn't. And I can't rsync from ssh - GoDaddy blocks it.
    >
    > I'm trying to understand the rsync man page and tried various other
    > command lines but get various other errors.


    I may have barked up a lost cause with the rsync mention. I use it
    over ssh, and misread the man page thinking it might also be able to
    leverage an ftp connection somehow. It seems I was mistaken. Forgive
    me!

    I have, however, come upon this tool, which looks like it may do
    what you want:
    http://sourceforge.net/projects/ftpsync/

    I remain rather steadfast in that wget is not what ya want for this
    task.

    Best Regards,
    --
    Todd H.
    http://www.toddh.net/

  5. Re: wget guru needed

    On Thu, 08 Nov 2007 13:58:24 -0500, Jim Showalter wrote:

    > GoDaddy doesn't allow the use of ssh on shared hosted accounts, so I'm
    > backing-up my small website using wget with the following options:
    >
    > "wget --recursive --wait=3 --tries=5 --mirror --output-file=wget.log
    > --passive-ftp --page-requisites ftp://unameas...@ftp.mydomain.com"
    >


    From the ncftp manpage: some of the cooler features include
    downloading entire directory trees.
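    As a sketch of that feature (host, credentials, and paths below are
    placeholders), ncftp ships a batch tool, ncftpget, whose -R flag
    retrieves a directory tree recursively:

```shell
# Recursively fetch the remote tree under / into ./mysitebackup.
ncftpget -R -u user -p password ftp.example.com ./mysitebackup /
```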
