On Sat, 11 Nov 2006 23:01:30 -0800
"Will Fould" wrote:

> I've found conflicting information about this.
> mod_perl: Is there a performance impact to breaking up a single 350k
> file (monolithic package) into multiple files (still the same package
> name)?
> I guess I'm asking: Are larger files inherently slower, even when
> pre-compiled? (Obviously, I'm not a computer science student)
> Here's the situation:
> Over time, what was a 60k package has grown to 350k, encompassing
> many subs that perform, more or less, 3 mutually exclusive
> operations. I want to remove 70% of the subs and put them into
> their own files for maintenance/organization reasons, but is there
> a performance benefit or hit by doing this? The main file will
> basically direct, requiring the additional files only when
> necessary. But under mod_perl, I believe the require will only
> happen once (per process) anyway.

You won't see any real performance improvement, but it certainly
would be an improvement to your code.
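Your assumption about require() is right: it consults %INC and only
compiles a file the first time, so under mod_perl each child process
pays the load cost once. A quick self-contained check (the Demo
module and temp-dir setup here are invented purely for illustration):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Write a tiny module to a temp dir so the example runs anywhere.
# It bumps a counter every time it is actually compiled.
my $dir  = tempdir(CLEANUP => 1);
my $file = "$dir/Demo.pm";
open my $fh, '>', $file or die "open: $!";
print $fh 'package Demo; our $loads; $loads++; 1;';
close $fh;

unshift @INC, $dir;

for (1 .. 3) {
    require Demo;   # first call compiles; later calls are no-ops via %INC
}

print "loaded $Demo::loads time(s)\n";   # loaded 1 time(s)
```

So splitting the 350k file and requiring pieces on demand costs at
most one extra file read per operation per child process.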

No offense, but I've never seen a 350k module that didn't need
to be broken into SEVERAL other modules for
readability/maintainability reasons.

I'm overgeneralizing here, and not taking into account
bloated/memory-hogging modules, but files/packages/modules
are cheap to create and use, and should be broken up for reasons
other than performance.

I just looked over one of the largest mod_perl apps I help
maintain: it has 67 modules, and only one of them is over
20k in size.

Frank Wiles