Diagnosing memory usage - modperl



Thread: Diagnosing memory usage

  1. Diagnosing memory usage

    I've inherited an existing Apache+mod_perl 1.3.x server. I am not very
    experienced with Apache nor mod_perl, so I have to pick up things as I
    go along.

    Recently I built a new version of Apache (1.3.41) with static mod_perl
    1.30, and it seems to work. The problem is that every few days or so,
    the server apparently runs out of memory, grinding to a halt and
    necessitating a hard reset.

    I suspect mod_perl is the primary memory user here, since most of the
    pages we serve are Perl scripts. But I don't know how to go about
    diagnosing the problem, especially since the server gets so bogged
    down when it happens that I can't access it to get info on running
    processes, memory usage, etc. I have noticed that the memory usage of
    each httpd process seems to grow over time, but it's usually very slow
    growth, and I can't tell if that's really a leak or just normal
    behavior.

    Workarounds would be helpful, but naturally I'd prefer to eliminate
    the cause of the problem. I've been looking through the documentation,
    but haven't made much progress so far. How can I get to the bottom of
    this?

    -Michael


  2. Re: Diagnosing memory usage


    Michael - depends on the OS, but you could look at Apache::SizeLimit,
    which kills off processes once their per-process memory use gets too
    large. It works well for the system we use at work...

    If you're on a Unix/Linux system, "top" is your friend, as it will
    show the memory usage of each process.
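    For mod_perl 1.x, Apache::SizeLimit is normally configured through
    package variables in a startup file plus a handler directive in
    httpd.conf. A minimal sketch, assuming a startup.pl loaded via
    PerlRequire (the file path and the 12 MB threshold are invented
    examples, not values from this thread):

```perl
# startup.pl -- loaded by PerlRequire in httpd.conf
use Apache::SizeLimit;

# Kill a child after the current request if its total size exceeds
# this many KB (12 MB here -- an arbitrary example threshold).
$Apache::SizeLimit::MAX_PROCESS_SIZE = 12000;

# Check the size only every N requests to keep overhead down.
$Apache::SizeLimit::CHECK_EVERY_N_REQUESTS = 5;
```

    And in httpd.conf:

```
PerlRequire      /path/to/startup.pl
PerlFixupHandler Apache::SizeLimit
```

    In the meantime, a quick way to watch per-process memory on Linux is
    "ps -eo pid,rss,comm --sort=-rss | head", which lists the largest
    resident processes first.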





    --
    The Wellcome Trust Sanger Institute is operated by Genome Research
    Limited, a charity registered in England with number 1021457 and a
    company registered in England with number 2742969, whose registered
    office is 215 Euston Road, London, NW1 2BE.


  3. Re: Diagnosing memory usage




    --- On Sun, 6/15/08, James Smith wrote:

    > [snip]


    Sounds like the Perl code running under mod_perl (the scripts
    themselves) is the more likely culprit.

    Check out Devel::Cycle on CPAN ( http://search.cpan.org/perldoc?Devel::Cycle ) for more information on that.
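    Circular references are a classic source of slow per-child growth
    under mod_perl, since Perl's reference counting can never free a
    cycle. A minimal sketch of using Devel::Cycle to spot one (it is a
    CPAN module, not core Perl; the $session structure is an invented
    example):

```perl
use strict;
use warnings;
use Devel::Cycle;   # CPAN module, not in core Perl

# A structure that points back at itself: its refcount never
# reaches zero, so it leaks once per request that builds one.
my $session = { user => 'example' };
$session->{self} = $session;

# find_cycle() walks the structure and prints a report for
# every cycle it finds.
find_cycle($session);
```

    The usual fix is to break the cycle by hand (delete
    $session->{self}) before the request ends, or to weaken one link
    with Scalar::Util::weaken.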

    Best regards,
    John Drago

  4. Re: Diagnosing memory usage

    On Sun, 15 Jun 2008 21:09:23 -0500
    Michael Gardner wrote:

    > Workarounds would be helpful, but naturally I'd prefer to eliminate
    > the cause of the problem. I've been looking through the
    > documentation, but haven't made much progress so far. How can I get
    > to the bottom of this?


    Most likely there is some bad Perl in there that is slurping in
    large amounts of data, or possibly a memory leak.

    A fairly easy workaround is to reduce MaxRequestsPerChild to a
    fairly low value, say 20, so that any children that grow large
    are reaped more quickly.
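    In httpd.conf that workaround is a one-line change; a sketch, with
    20 taken from the suggestion above:

```
# Recycle each child after 20 requests, so a slowly growing child
# is capped at 20 requests' worth of leakage (at the cost of more
# frequent forks and Perl recompilation).
MaxRequestsPerChild 20
```

    Note this bounds the damage rather than fixing the leak itself.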

    -------------------------------------------------------
    Frank Wiles, Revolution Systems, LLC.
    Personal : frank@wiles.org http://www.wiles.org
    Work : frank@revsys.com http://www.revsys.com

