Re: Dumping / Profiling Memory ? - modperl
On 8/15/06, Perrin Harkins wrote:
> On Tue, 2006-08-15 at 12:49 -0700, Arshavir Grigorian wrote:
> > What would be a good tool for finding where this type of problem is?
> Memory growth? You basically comment out code until it stops happening.
> > Also, considering that Perl does automagic garbage
> > collection, what sort of coding "style" would produce such a problem
> > (I guess circular references would be one; are there others?).
> There is a lot of information on what causes things to be
> shared/unshared and what coding techniques reduce your memory footprint
> in the Practical mod_perl book. In particular, here:
> People often either slurp a large file or read results from a large
> query (many databases load the entire result set into your process
> unless you tell them not to) and then their process stays big.
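To illustrate the slurping point, here is a minimal sketch (the file name big.log and its size are made up):

    # Slurping: the process grows by the full size of the file and,
    # as discussed below, keeps that memory.
    open my $fh, '<', 'big.log' or die "open: $!";
    my $contents = do { local $/; <$fh> };   # whole file in one scalar
    close $fh;

    # Streaming: only one line is held in memory at a time.
    open $fh, '<', 'big.log' or die "open: $!";
    while (my $line = <$fh>) {
        # process $line here
    }
    close $fh;

The same applies to queries: with DBD::mysql, for instance, rows are buffered client-side by default, and the mysql_use_result attribute is one way to stream them instead; other drivers have their own switches, so check your DBD's documentation.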
I guess I was more interested in the types of structures that continuously
increase the memory footprint with each request (memory leaks).
Barring a database table getting larger, etc., is there any reason why
the memory footprint should grow with each request (for the same page)?
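As a concrete example of the circular-reference case mentioned above, here is a minimal sketch (Scalar::Util has shipped with Perl since 5.8):

    use Scalar::Util qw(weaken);

    # Leaks on every call: the two hashes reference each other, so
    # their reference counts never reach zero when the sub exits.
    sub leaky {
        my $parent = {};
        my $child  = { parent => $parent };
        $parent->{child} = $child;    # cycle
    }

    # Weakening the back-reference breaks the cycle, and both hashes
    # are freed normally when the sub exits.
    sub fixed {
        my $parent = {};
        my $child  = { parent => $parent };
        $parent->{child} = $child;
        weaken($child->{parent});     # no longer counts toward refcount
    }

Under mod_perl this kind of leak compounds, since the same code runs in a long-lived process on every request.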
> Keep in mind, Perl's garbage collection does not return the memory that
> a variable used when it goes out of scope. If you load up "my $file"
> with 5MB of data, it will be held onto forever and not used for other
> variables, even in the same process. Most of the time this is what you
> want, since it improves performance by avoiding reallocating memory next
> time you use that variable, but you can return the memory by undef'ing
> the variable.
Interesting ... I thought once a variable went out of scope, its
memory would be available for other uses within the same Perl process.
So would you recommend undef-ing all variables that store large data chunks?
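For what it's worth, a sketch of the undef approach (process() and big.log are hypothetical; whether freed memory actually returns to the OS depends on your malloc, but it does become reusable within the process):

    open my $fh, '<', 'big.log' or die "open: $!";
    my $data = do { local $/; <$fh> };   # e.g. 5MB held by $data
    close $fh;

    process($data);

    undef $data;    # releases the string buffer for reuse;
                    # `$data = ''` would keep the buffer allocated

If you want numbers rather than guesswork, Devel::Size from CPAN can report how much memory a given structure holds.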