Help with perl script run from crontab - BSD


Thread: Help with perl script run from crontab

  1. Help with perl script run from crontab

    I have a script, which is a bastardized version of Diskhog, which checks
    the directory sizes on our email server. This runs fine from the
    command line, but when I schedule and run it from cron using:

    */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1

    the output from the array is doubled in the resulting webpage (i.e.
    someone who is using 1 MB of disk space shows up on the generated page
    as using 2 MB).


    ps agx |grep Email
    shows the following two processes:

    33471 ?? Ss 0:00.00 /bin/sh -c
    /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    33473 ?? S 0:00.05 /usr/bin/perl -w
    /usr/local/www/data/size/EmailStat.sh (perl5.8.7)


    I have tried calling the script from a two-line shell script; it ran
    three processes. I have tried various incantations in crontab, still
    double output.

    Help? As I mentioned, running from the command line it works fine (and
    only shows one process running when ps is used to check). Other perl
    scripts run fine from crontab. Gimme a clue?

    --
    Tweak

  2. Re: Help with perl script run from crontab

    On Wed, 16 Jan 2008 12:45:43 -0500
    Tweak wrote:

    > I have a script, which is a bastardized version of Diskhog, which checks
    > the directory sizes on our email server. This runs fine from the
    > command line, but when I schedule and run it from cron using:
    >
    > */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1


    Every three minutes,

    > 33471 ?? Ss 0:00.00 /bin/sh -c
    > /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    > 33473 ?? S 0:00.05 /usr/bin/perl -w
    > /usr/local/www/data/size/EmailStat.sh (perl5.8.7)


    How long does it take to run ? Is there any protection against
    multiple copies being started ?

    > Help? As I mentioned, running from the command line it works fine (and
    > only shows one process running when ps is used to check). Other perl
    > scripts run fine from crontab. Gimme a clue?


    My best guess is that it takes longer than three minutes to run and
    a second run is starting before the first has finished.
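
    If overlap turns out to be the cause, a simple guard near the top of the
    Perl script stops a second copy from doing any work. A minimal sketch
    (untested, and the lock file path is just an example; put it wherever
    the cron user can write):

    use Fcntl qw(:flock);

    # Refuse to run if another instance still holds the lock.
    open(my $lock, '>', '/tmp/emailstat.lock') or die "cannot open lock: $!";
    exit 0 unless flock($lock, LOCK_EX | LOCK_NB);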

    --
    C:>WIN                        | Directable Mirror Arrays
    The computer obeys and wins.  | A better way to focus the sun
    You lose and Bill collects.   | licences available see
                                  | http://www.sohara.org/

  3. Re: Help with perl script run from crontab

    Steve O'Hara-Smith wrote:

    > On Wed, 16 Jan 2008 12:45:43 -0500
    > Tweak wrote:
    >
    >> I have a script, which is a bastardized version of Diskhog, which checks
    >> the directory sizes on our email server. This runs fine from the
    >> command line, but when I schedule and run it from cron using:
    >>
    >> */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1

    >
    > Every three minutes,
    >
    >> 33471 ?? Ss 0:00.00 /bin/sh -c
    >> /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    >> 33473 ?? S 0:00.05 /usr/bin/perl -w
    >> /usr/local/www/data/size/EmailStat.sh (perl5.8.7)

    >
    > How long does it take to run ? Is there any protection against
    > multiple copies being started ?
    >
    >> Help? As I mentioned, running from the command line it works fine (and
    >> only shows one process running when ps is used to check). Other perl
    >> scripts run fine from crontab. Gimme a clue?

    >
    > My best guess is that it takes longer than three minutes to run and
    > a second run is starting before the first has finished.
    >


    I'm also wondering if it is CGI embedded in a web page, and being run again
    when the web page is called.

    -Jason

  4. Re: Help with perl script run from crontab

    On Thu, 17 Jan 2008 00:30:49 +0000, Steve O'Hara-Smith wrote:
    > On Wed, 16 Jan 2008 12:45:43 -0500
    > Tweak wrote:
    >
    >> I have a script, which is a bastardized version of Diskhog, which checks
    >> the directory sizes on our email server. This runs fine from the
    >> command line, but when I schedule and run it from cron using:
    >>
    >> */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1

    >
    > Every three minutes,
    >
    >> 33471 ?? Ss 0:00.00 /bin/sh -c /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    >> 33473 ?? S 0:00.05 /usr/bin/perl -w /usr/local/www/data/size/EmailStat.sh (perl5.8.7)

    >
    > How long does it take to run ? Is there any protection against
    > multiple copies being started ?
    >
    >> Help? As I mentioned, running from the command line it works fine (and
    >> only shows one process running when ps is used to check). Other perl
    >> scripts run fine from crontab. Gimme a clue?

    >
    > My best guess is that it takes longer than three minutes to run and
    > a second run is starting before the first has finished.


    The PIDs are too close. It would be a bit surprising if a busy mail
    server created just 2 new processes in 3 minutes.

    The commands in a crontab file are executed by sh(1), as the manpage of
    crontab(5) says:

    The entire command portion of the line, up to a newline or %
    character, will be executed by /bin/sh or by the shell
    specified in the SHELL variable of the cronfile.

    I would probably write something like this in the crontab:

    */3 * * * * perl -Tw /usr/local/www/data/size/EmailStat.pl

    This will still run through sh(1), but it will use the correct
    interpreter for the `EmailStat.pl' script.


  5. Re: Help with perl script run from crontab

    Tweak wrote:
    > I have a script, which is a bastardized version of Diskhog, which checks
    > the directory sizes on our email server. This runs fine from the
    > command line, but when I schedule and run it from cron using:
    >
    > */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1
    >
    > the output from the array is doubled in the resulting webpage (i.e.
    > someone who is using 1 MB of disk space shows up on the generated page
    > as using 2 MB).
    >
    >
    > ps agx |grep Email
    > shows the following two processes:
    >
    > 33471 ?? Ss 0:00.00 /bin/sh -c
    > /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    > 33473 ?? S 0:00.05 /usr/bin/perl -w
    > /usr/local/www/data/size/EmailStat.sh (perl5.8.7)
    >
    >
    > I have tried calling the script from a two-line shell script; it ran
    > three processes. I have tried various incantations in crontab, still
    > double output.
    >
    > Help? As I mentioned, running from the command line it works fine (and
    > only shows one process running when ps is used to check). Other perl
    > scripts run fine from crontab. Gimme a clue?
    >

    I would check the value of BLOCKSIZE in the environment of a cron job.
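
    A quick way to see what cron actually hands the script is to dump %ENV
    from one run. A rough sketch (the log path is just an example):

    # Temporary diagnostic: write out the environment the script runs with.
    open(my $envlog, '>', '/tmp/cron-env.txt') or die "cannot write log: $!";
    print $envlog "$_=$ENV{$_}\n" for sort keys %ENV;
    close($envlog);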

    Henri

  6. Re: Help with perl script run from crontab

    In article <20080117003049.83a42f4b.steveo@eircom.net>,
    steveo@eircom.net says...
    > On Wed, 16 Jan 2008 12:45:43 -0500
    > Tweak wrote:
    >
    > > I have a script, which is a bastardized version of Diskhog, which checks
    > > the directory sizes on our email server. This runs fine from the
    > > command line, but when I schedule and run it from cron using:
    > >
    > > */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1

    >
    > Every three minutes,
    >
    > > 33471 ?? Ss 0:00.00 /bin/sh -c
    > > /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    > > 33473 ?? S 0:00.05 /usr/bin/perl -w
    > > /usr/local/www/data/size/EmailStat.sh (perl5.8.7)

    >
    > How long does it take to run ? Is there any protection against
    > multiple copies being started ?
    >
    > > Help? As I mentioned, running from the command line it works fine (and
    > > only shows one process running when ps is used to check). Other perl
    > > scripts run fine from crontab. Gimme a clue?

    >
    > My best guess is that it takes longer than three minutes to run and
    > a second run is starting before the first has finished.
    >
    >

    Under 1 minute. First thing I thought of. Thanks.
    --
    Tweak

  7. Re: Help with perl script run from crontab

    In article ,
    j_bourne_treadstone@hotmail.com says...
    > Steve O'Hara-Smith wrote:
    >
    > > On Wed, 16 Jan 2008 12:45:43 -0500
    > > Tweak wrote:
    > >
    > >> I have a script, which is a bastardized version of Diskhog, which checks
    > >> the directory sizes on our email server. This runs fine from the
    > >> command line, but when I schedule and run it from cron using:
    > >>
    > >> */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1

    > >
    > > Every three minutes,
    > >
    > >> 33471 ?? Ss 0:00.00 /bin/sh -c
    > >> /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    > >> 33473 ?? S 0:00.05 /usr/bin/perl -w
    > >> /usr/local/www/data/size/EmailStat.sh (perl5.8.7)

    > >
    > > How long does it take to run ? Is there any protection against
    > > multiple copies being started ?
    > >
    > >> Help? As I mentioned, running from the command line it works fine (and
    > >> only shows one process running when ps is used to check). Other perl
    > >> scripts run fine from crontab. Gimme a clue?

    > >
    > > My best guess is that it takes longer than three minutes to run and
    > > a second run is starting before the first has finished.
    > >

    >
    > I'm also wondering if it is CGI embedded in a web page, and being run again
    > when the web page is called.
    >
    > -Jason
    >

    It basically does a "du" on a bunch of email user directories, dumps it
    into an array and then makes a webpage with the data. I originally set
    it to update every hour, but changed to 3 minutes for testing after I
    discovered my outputs were double.
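
    In rough outline it looks something like the following. This is a
    simplified sketch, not the real code; the mail directory and output
    paths are made up.

    # Simplified sketch of the general shape, not the actual script.
    my %usage;
    for my $line (`du -sk /var/mail/users/*`) {   # hypothetical mail directory
        chomp $line;
        my ($kbytes, $dir) = split ' ', $line, 2;
        $usage{$dir} = $kbytes;
    }

    open(my $html, '>', '/usr/local/www/data/size/index.html') or die $!;
    print $html "<html><body><table>\n";
    print $html "<tr><td>$_</td><td>$usage{$_} KB</td></tr>\n"
        for sort { $usage{$b} <=> $usage{$a} } keys %usage;
    print $html "</table></body></html>\n";
    close($html);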

    Can't use quotas here, so I'm grasping at straws in some sort of effort
    to get people to manage their email. Exercise in futility, I know.
    --
    Tweak

  8. Re: Help with perl script run from crontab

    In article <87wsq8n9uj.fsf@kobe.laptop>, keramida@ceid.upatras.gr
    says...
    > On Thu, 17 Jan 2008 00:30:49 +0000, Steve O'Hara-Smith wrote:
    > > On Wed, 16 Jan 2008 12:45:43 -0500
    > > Tweak wrote:
    > >
    > >> I have a script, which is a bastardized version of Diskhog, which checks
    > >> the directory sizes on our email server. This runs fine from the
    > >> command line, but when I schedule and run it from cron using:
    > >>
    > >> */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1

    > >
    > > Every three minutes,
    > >
    > >> 33471 ?? Ss 0:00.00 /bin/sh -c /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    > >> 33473 ?? S 0:00.05 /usr/bin/perl -w /usr/local/www/data/size/EmailStat.sh (perl5.8.7)

    > >
    > > How long does it take to run ? Is there any protection against
    > > multiple copies being started ?
    > >
    > >> Help? As I mentioned, running from the command line it works fine (and
    > >> only shows one process running when ps is used to check). Other perl
    > >> scripts run fine from crontab. Gimme a clue?

    > >
    > > My best guess is that it takes longer than three minutes to run and
    > > a second run is starting before the first has finished.

    >
    > The PIDs are too close. It would be a bit surprising if a busy mail
    > server created just 2 new processes in 3 minutes.
    >
    > The commands in a crontab file are executed by sh(1), as the manpage of
    > crontab(5) says:
    >
    > The entire command portion of the line, up to a newline or %
    > character, will be executed by /bin/sh or by the shell
    > specified in the SHELL variable of the cronfile.
    >
    > I would probably write something like this in the crontab:
    >
    > */3 * * * * perl -Tw /usr/local/www/data/size/EmailStat.pl
    >
    > This will still run through sh(1), but it will use the correct
    > interpreter for the `EmailStat.pl' script.
    >
    >

    No workie.

    I set up a test user and tried different environments, which did give me
    some more info. I believe this line:

    system "du -s * | grep -vE '(list of users/files to ignore)' >>
    $TmpFile";

    is the problem, as it is running twice. If I change it to:

    system "du -s * >> $TmpFile";

    the program runs properly. Problem is du has no -X functionality on this
    machine (FreeBSD 4.10), and there is a list of system directories I want
    to exclude from the report.
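
    One way around the missing -X is to do the filtering in Perl instead of
    piping du through grep. An untested sketch (the names in @exclude are
    placeholders, and $TmpFile is the script's existing temp-file variable):

    my @exclude = qw(cur tmp lost+found);        # placeholder names to skip
    my $skip    = join '|', map { quotemeta } @exclude;

    open(my $tmp, '>>', $TmpFile) or die "cannot append to $TmpFile: $!";
    for my $line (`du -s *`) {
        next if $line =~ /^\d+\s+(?:$skip)$/;    # drop excluded entries
        print $tmp $line;
    }
    close($tmp);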

    I'm afraid I have run out of scripting talent. Can I help someone with
    a routing problem? ;-)


    --
    Tweak

  9. Re: Help with perl script run from crontab

    Tweak wrote:
    > In article <87wsq8n9uj.fsf@kobe.laptop>, keramida@ceid.upatras.gr
    > says...
    >> On Thu, 17 Jan 2008 00:30:49 +0000, Steve O'Hara-Smith wrote:
    >>> On Wed, 16 Jan 2008 12:45:43 -0500
    >>> Tweak wrote:
    >>>
    >>>> I have a script, which is a bastardized version of Diskhog, which checks
    >>>> the directory sizes on our email server. This runs fine from the
    >>>> command line, but when I schedule and run it from cron using:
    >>>>
    >>>> */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1
    >>> Every three minutes,
    >>>
    >>>> 33471 ?? Ss 0:00.00 /bin/sh -c /usr/local/www/data/size/EmailStat.sh >/dev/null 2>&1
    >>>> 33473 ?? S 0:00.05 /usr/bin/perl -w /usr/local/www/data/size/EmailStat.sh (perl5.8.7)
    >>> How long does it take to run ? Is there any protection against
    >>> multiple copies being started ?
    >>>
    >>>> Help? As I mentioned, running from the command line it works fine (and
    >>>> only shows one process running when ps is used to check). Other perl
    >>>> scripts run fine from crontab. Gimme a clue?
    >>> My best guess is that it takes longer than three minutes to run and
    >>> a second run is starting before the first has finished.

    >> The PIDs are too close. It would be a bit surprising if a busy mail
    >> server created just 2 new processes in 3 minutes.
    >>
    >> The commands in a crontab file are executed by sh(1), as the manpage of
    >> crontab(5) says:
    >>
    >> The entire command portion of the line, up to a newline or %
    >> character, will be executed by /bin/sh or by the shell
    >> specified in the SHELL variable of the cronfile.
    >>
    >> I would probably write something like this in the crontab:
    >>
    >> */3 * * * * perl -Tw /usr/local/www/data/size/EmailStat.pl
    >>
    >> This will still run through sh(1), but it will use the correct
    >> interpreter for the `EmailStat.pl' script.
    >>
    >>

    > No workie.
    >
    > I set up a test user and tried different environments, which did give me
    > some more info. I believe this line:
    >
    > system "du -s * | grep -vE '(list of users/files to ignore)' >>
    > $TmpFile";


    man du

    see the line about BLOCKSIZE

    I tested it: under cron, BLOCKSIZE is undefined; under a login shell,
    BLOCKSIZE=K.
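
    If that is the difference, one fix that keeps the script independent of
    the caller's environment is to set it explicitly before the system call.
    A sketch, reusing the line quoted above:

    # Force the same units whether the script is started by cron or a shell.
    $ENV{BLOCKSIZE} = 'K';
    system "du -s * | grep -vE '(list of users/files to ignore)' >> $TmpFile";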

    Henri
    >
    > is the problem, as it is running twice. If I change it to:
    >
    > system "du -s * >> $TmpFile";
    >
    > the program runs properly. Problem is du has no -X functionality on this
    > machine (FreeBSD 4.10), and there is a list of system directories I want
    > to exclude from the report.
    >
    > I'm afraid I have run out of scripting talent. Can I help someone with
    > a routing problem? ;-)
    >
    >


  10. Re: Help with perl script run from crontab

    In article , hlh@restart.be says...

    >
    > see the line about BLOCKSIZE
    >
    > I test it, under cron BLOCKSIZE is undefined, under login shell BLOCKSIZE=K
    >
    > Henri
    > >

    This was the solution; setting BLOCKSIZE in the test user's crontab
    fixed it. Thank you very much, and thank you to everyone else as well.

    --
    Tweak

  11. Re: Help with perl script run from crontab

    On Thu, 17 Jan 2008 08:55:31 -0500, Tweak wrote:
    > In article <87wsq8n9uj.fsf@kobe.laptop>, keramida@ceid.upatras.gr says...
    >> The commands in a crontab file are executed by sh(1), as the manpage of
    >> crontab(5) says:
    >>
    >> The entire command portion of the line, up to a newline or %
    >> character, will be executed by /bin/sh or by the shell
    >> specified in the SHELL variable of the cronfile.
    >>
    >> I would probably write something like this in the crontab:
    >>
    >> */3 * * * * perl -Tw /usr/local/www/data/size/EmailStat.pl
    >>
    >> This will still run through sh(1), but it will use the correct
    >> interpreter for the `EmailStat.pl' script.

    >
    > No workie.
    >
    > I set up a test user and tried different environments, which did give me
    > some more info. I believe this line:
    >
    > system "du -s * | grep -vE '(list of users/files to ignore)' >> $TmpFile";
    >
    > is the problem, as it is running twice.


    The script shouldn't run twice if you use system(), but you *may* see a
    /bin/sh shell in the process listing when it runs because that's what
    system() is supposed to do, i.e. fork a shell and run the command _in_
    that shell child.
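
    Concretely, because the command string contains a pipe and a
    redirection, perl hands it to /bin/sh -c, which is the second process
    in your listing. A rough illustration (the list-form path is just an
    example):

    # A single string with shell metacharacters (*, |, >>) makes perl
    # fork /bin/sh -c '...', which is the extra sh process in ps.
    system "du -s * >> $TmpFile";

    # The list form execs du directly with no shell, but then there is
    # no globbing, piping or redirection.
    system 'du', '-s', '/var/mail/users';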

    > If I change it to:
    >
    > system "du -s * >> $TmpFile";
    >
    > the program runs properly. Problem is du has no -X functionality on
    > this machine (FreeBSD 4.10), and there is a list of system directories
    > I want to exclude from the report.


    Show us the rest of the script then. What you have described so far
    (and the process listing quoted earlier) do not show the same script
    running multiple times. It's possible that:

    a) You are misinterpreting the ps(1) output

    b) The script has a bug somewhere.


  12. Re: Help with perl script run from crontab

    On Thu, 17 Jan 2008 19:23:07 +0200, Giorgos Keramidas wrote:
    >On Thu, 17 Jan 2008 08:55:31 -0500, Tweak wrote:
    >> If I change it to:
    >>
    >> system "du -s * >> $TmpFile";
    >>
    >> the program runs properly. Problem is du has no -X functionality on
    >> this machine (FreeBSD 4.10), and there is a list of system
    >> directories I want to exclude from the report.

    >
    > Show us the rest of the script then. What you have described so far
    > (and the process listing quoted earlier) do not show the same script
    > running multiple times. It's possible that:
    >
    > a) You are misinterpreting the ps(1) output
    >
    > b) The script has a bug somewhere.


    Nevermind. I saw the reply which said BLOCKSIZE=1024 helped you fix
    this. Then I realized that

    This runs fine from the command line, but when I schedule and run it
    from cron using:

    */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1

    the output from the array is doubled in the resulting webpage
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

    this part, where the doubled output is mentioned, does not mean that
    you get two runs of the script, but twice the disk usage figure.
    That's normal, because the default BLOCKSIZE is 512 bytes, and this
    tends to confuse people who expect to see kilobytes reported.
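
    A small change that removes the dependency on BLOCKSIZE altogether is
    to ask du for kilobytes explicitly. A sketch of the same line with -k:

    # -k fixes the unit at 1024-byte blocks regardless of the environment.
    system "du -sk * | grep -vE '(list of users/files to ignore)' >> $TmpFile";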


  13. Re: Help with perl script run from crontab

    In article <8763xs2f5r.fsf@kobe.laptop>, keramida@ceid.upatras.gr
    says...
    > On Thu, 17 Jan 2008 19:23:07 +0200, Giorgos Keramidas wrote:
    > >On Thu, 17 Jan 2008 08:55:31 -0500, Tweak wrote:
    > >> If I change it to:
    > >>
    > >> system "du -s * >> $TmpFile";
    > >>
    > >> the program runs properly. Problem is du has no -X functionality on
    > >> this machine (FreeBSD 4.10), and there is a list of system
    > >> directories I want to exclude from the report.

    > >
    > > Show us the rest of the script then. What you have described so far
    > > (and the process listing quoted earlier) do not show the same script
    > > running multiple times. It's possible that:
    > >
    > > a) You are misinterpreting the ps(1) output
    > >
    > > b) The script has a bug somewhere.

    >
    > Nevermind. I saw the reply which said BLOCKSIZE=1024 helped you fix
    > this. Then I realized that
    >
    > This runs fine from the command line, but when I schedule and run it
    > from cron using:
    >
    > */3 * * * * /usr/local/www/data/size/EmailStat.pl >/dev/null 2>&1
    >
    > the output from the array is doubled in the resulting webpage
    > ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    >
    > this part, where the doubled output is mentioned, does not mean that
    > you get two runs of the script, but twice the disk usage figure.
    > That's normal, because the default BLOCKSIZE is 512 bytes, and this
    > tends to confuse people who expect to see kilobytes reported.
    >
    >

    It certainly got me. :-) The script showing up twice with ps pushed me
    even farther in the wrong direction.
    --
    Tweak
