> No offense, but I've never seen a 350k module that didn't need
> to be broken into SEVERAL other modules for
> readability/maintainability reasons.


That's the plan. It's easy to visualize: this single file is an old CGI
script/application, maintained over 4 years, with lots of shared subs that do
everything from authentication and database access to web screen templates,
logging, and error handling. It was ported to a mod_perl environment last
year. But, as a single file, it was easy for a single developer to edit and
work on with a text editor.

I'd guess that a lot of folks have skeletons like this in their closet.

; )
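
For the archives, here's a rough sketch of the split I have in mind (the
names below -- MyApp, do_auth, the .pl file paths -- are made up for
illustration, not the real app): the main file keeps the original package
name, each group of subs moves into its own file that also declares that
same package, and the main file requires each one only when its operation
comes up. Since require checks %INC first, each file gets compiled at most
once per mod_perl child process.

    package MyApp;                 # same package name the app has always used
    use strict;
    use warnings;

    # Hypothetical dispatch table: operation -> file that holds its subs.
    # Each of those files also starts with "package MyApp;", so its subs
    # still land in the one namespace the rest of the code expects.
    my %impl = (
        auth    => 'MyApp/Auth.pl',
        reports => 'MyApp/Reports.pl',
        admin   => 'MyApp/Admin.pl',
    );

    sub dispatch {
        my ($op, @args) = @_;

        die "unknown operation '$op'" unless exists $impl{$op};

        # require() is a no-op once the file is recorded in %INC, so under
        # mod_perl each file is compiled at most once per child process.
        require $impl{$op};

        # Call the sub the required file defined, e.g. MyApp::do_auth().
        my $code = __PACKAGE__->can("do_$op")
            or die "no handler sub for operation '$op'";
        return $code->(@args);
    }

    1;

The only housekeeping is that each required file has to end with a true
value (the usual "1;") so require doesn't complain.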

On 11/12/06, Frank Wiles wrote:
>
> On Sat, 11 Nov 2006 23:01:30 -0800
> "Will Fould" wrote:
>
> > I've found conflicting information about this.
> >
> > modperl: Is there a performance impact to breaking up a single 350k
> > file (monolithic package) into multiple files (still the same package
> > name)?
> >
> > I guess I'm asking: Are larger files inherently slower, even when
> > pre-compiled? (Obviously, I'm not a computer science student)
> >
> > Here's the situation:
> >
> > Over time, what was a 60k package has grown to 350k, encompassing
> > many subs that perform, more or less, 3 mutually exclusive operations. I
> > want to remove 70% of the subs and put them into their own files for
> > maintenance/organization reasons, but is there a performance benefit
> > or hit by doing this? The main file will basically act as a dispatcher,
> > requiring the additional files only when necessary. But under mod_perl,
> > I believe the require will only happen once (per process) anyway.

>
> You won't see any real performance improvement, but it certainly
> would be an improvement to your code.
>
> No offense, but I've never seen a 350k module that didn't need
> to be broken into SEVERAL other modules for
> readability/maintainability reasons.
>
> I'm overgeneralizing here and not taking into account
> bloated/memory-hogging modules, but files/packages/modules
> are cheap to create and use, and they should be broken up for reasons
> other than performance.
>
> I just looked over one of the largest mod_perl apps I help
> maintain: there are 67 modules, and only one of them is over
> 20k in size.
>
> ---------------------------------
> Frank Wiles
> http://www.wiles.org
> ---------------------------------
>
>

