> I've found conflicting information about this.
>
> modperl: Is there a performance impact to breaking up a single 350k file
> (monolithic package) into multiple files (still the same package name)?
>
> I guess I'm asking: Are larger files inherently slower, even when
> pre-compiled? (Obviously, I'm not a computer science student)
>
> Here's the situation:
>
> Over time, what was a 60k package has grown to 350k, encompassing many
> subs
> that do, more or less, 3 mutually exclusive operations. I want to remove
> 70% of the subs and put them into their own files for
> maintenance/organization reasons, but is there a performance benefit or hit
> by doing this? The main file will basically direct, requiring the
> additional files only when necessary. But under mod_perl, I believe the
> require will only happen once (per process) anyway.
>
> Was:
>
> package foo
>
> [a,b,c] (~350k)
>
> Now:
>
> package foo
>
> [a] (~75k)
> require [b] (~115k) only when necessary
> require [c] (~160k) only when necessary
>
>
> Thank you for enlightenment or a reference link, in advance.
>
> W
>


Will,

The only performance gain you will see from doing this is that each Apache
process will take a little less time to start. Once the code is compiled,
Perl doesn't much care whether it came from one 350k file or three smaller
ones, so a large file is not inherently slower. On the other hand, if your
package foo calls the b and c routines, it will take the hit of compiling
those files the first time 'foo' requires them in each process. In your
example, I think the benefits would be smaller than you might hope for.
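
For what it's worth, here is a rough sketch of the demand-loading
arrangement you describe. The file names (foo_b.pl, foo_c.pl) and the
sub names are only placeholders; it assumes the split-out files also
declare 'package foo' and end with a true value:

    package foo;
    use strict;
    use warnings;

    # Dispatcher: pull in the heavier parts of the package only when
    # the corresponding operation is actually requested.  Under mod_perl
    # the require runs once per child process; after that the file is
    # recorded in %INC and the require is a cheap no-op.
    sub handle {
        my ($op, @args) = @_;
        if ($op eq 'b') {
            require 'foo_b.pl';     # ~115k, compiled on first use only
            return do_b(@args);
        }
        elsif ($op eq 'c') {
            require 'foo_c.pl';     # ~160k, compiled on first use only
            return do_c(@args);
        }
        return do_a(@args);         # the ~75k of common code in this file
    }

    1;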

In fact, it is often better to preload/compile all the classes you will
be using when the Apache parent process starts. That way each child
inherits the already-compiled code, and no user has to take the initial
hit of compiling. To do this, just 'use' (or PerlModule/PerlRequire in
the config) any classes/modules you plan on using from your Apache
startup file. You can also take advantage of running some initialization
code during the preload; a good example is loading the DBI driver so
that work doesn't have to be done on the first request to the server.
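
As a rough illustration, a startup.pl along these lines does the
preloading; the path in the PerlRequire line and the 'mysql' driver
name are just examples, and Apache::DBI is optional but, if used,
should be loaded before anything that uses DBI:

    # startup.pl -- loaded once by the Apache parent process via a
    # line such as:  PerlRequire /path/to/startup.pl
    use strict;
    use warnings;

    use Apache::DBI ();   # cache DB connections; load before DBI users
    use DBI ();

    # Preload the application code (including the split-out pieces if
    # you want them shared too) so each child starts with it compiled.
    use foo ();

    # One-time initialization: load the DBD driver now rather than on
    # the first request.
    DBI->install_driver('mysql');

    1;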

Rick