C Program Execution Problem - SCO



Thread: C Program Execution Problem

  1. C Program Execution Problem

    I'm trying to port a compiled C program from 5.0.4 to 5.0.7. The
    program runs fine on 5.0.4 (I tried it on more than one system), but
    produces a memory fault on 5.0.5, 5.0.6 and 5.0.7 (I tried all of
    them) when it attempts to execute a subroutine which defines a very
    large (6MB) character array (the fault occurs on the C code line
    defining the array). Recompiling the code on 5.0.7 produces the same
    memory fault.

    The 5.0.4 development system is 5.1.0Ac and the 5.0.7 development
    system is 5.2.0Aa.

    Any ideas on what is causing the problem?

    --
    Richard Seeder

  2. Re: C Program Execution Problem

    On 31 Jan, 19:24, Richard Seeder wrote:
    > I'm trying to port a compiled c program from 5.0.4 to 5.0.7. The
    > program runs fine on 5.0.4 (I tried it on more than one system), but
    > produces a memory fault on 5.0.5, 5.0.6 and 5.0.7 (I tried all of
    > them) when it attempts to execute a subroutine which defines a very
    > large (6MB) character array (the fault occurs on the c code line
    > defining the array). Recompiling the code on 5.0.7 produces the same
    > memory fault.
    >
    > The 5.0.4 development system is 5.1.0Ac and the 5.0.7 development
    > system is 5.2.0Aa.
    >
    > Any ideas on what is causing the problem?


    Richard,

    There is not really enough information here to say what the problem
    is.

    If I were you I would try to isolate whether it's the size of the
    array or the way the array is being defined that is at fault here.

    John


  3. Re: C Program Execution Problem

    On Fri, 1 Feb 2008 01:59:16 -0800 (PST), jboland@sco.com wrote:

    >On 31 Jan, 19:24, Richard Seeder wrote:
    >> [...]
    >
    >There is not really enough information here to say what the problem
    >is.
    >
    >If I were you I would try to isolate whether it's the size of the
    >array or the way the array is being defined that is at fault here.
    >
    >John



    The array is defined as "char b[_A][_B][_C];" where "#define _A 432",
    "#define _B 56" and "#define _C 256". If I reduce _A to, say, 100,
    then it does not produce the memory fault, so size seems to matter.
    But the question is, why does it work on 5.0.4 and not on later
    releases?

    --
    Richard Seeder

  4. Re: C Program Execution Problem

    On Feb 1, 5:23 am, Richard Seeder wrote:
    > The array is defined as "char b[_A][_B][_C];" where "#define _A 432",
    > "#define _B 56" and "#define _C 256". If I reduce _A to, say, 100,
    > then it does not produce the memory fault, so size seems to matter.
    > But the question is, why does it work on 5.0.4 and not on later
    > releases?


    Use malloc(); do not allocate large arrays on the stack.

  5. Re: C Program Execution Problem

    On 2008-02-01, Richard Seeder wrote:

    > I'm trying to port a compiled c program from 5.0.4 to 5.0.7. The
    > program runs fine on 5.0.4 (I tried it on more than one system), but
    > produces a memory fault on 5.0.5, 5.0.6 and 5.0.7 (I tried all of
    > them) when it attempts to execute a subroutine which defines a very
    > large (6MB) character array (the fault occurs on the c code line
    > defining the array). Recompiling the code on 5.0.7 produces the same
    > memory fault.


    > The array is defined as "char b[_A][_B][_C];" where "#define _A 432",
    > "#define _B 56" and "#define _C 256". If I reduce _A to, say, 100,
    > then it does not produce the memory fault, so size seems to matter.
    > But the question is, why does it work on 5.0.4 and not on later
    > releases?


    You are hitting the maximum stack size limit. You may be able to
    alter this using setrlimit(), or with ulimit from the calling shell;
    note that raising the hard limit requires root privileges. Another
    poster has already pointed out that allocating such large structures
    on the stack is in general a bad idea. Although there are
    circumstances where it may be appropriate, for speed or to reduce
    memory fragmentation, in general malloc()ing the memory is the
    better option.

    The limit is set quite low by default as a defence against software
    bugs, where infinite recursion could rapidly use up all available
    memory. Consider an implementation of the factorial function:

    double factorial (int c)
    {
        if (c == 0)
            return 1.0;
        else
            return c * factorial(c - 1);
    }

    This will work fine for valid calls, but what happens if you were
    to call factorial(-1)? The stack size limit kills this off before
    things get _too_ silly.

    --
    Andrew Smallshaw
    andrews@sdf.lonestar.org

  6. Re: C Program Execution Problem

    On Fri, 1 Feb 2008 21:19:51 +0100 (CET), Andrew Smallshaw
    wrote:

    >On 2008-02-01, Richard Seeder wrote:
    >> [...]
    >
    >You are hitting the maximum stack size limit. You may be able to
    >alter this using setrlimit() or ulimit from the calling shell.
    >[...]
    >This will work fine for valid calls, but what happens if you were
    >to call factorial(-1)? The stack size limit kills this off before
    >things get _too_ silly.


    Thanks; setrlimit() worked. Under 5.0.4 the default for all limits is
    apparently RLIM_INFINITY, but in subsequent releases RLIMIT_STACK
    alone is set much lower. I understand your explanation (and the
    general point that this represents bad coding), but I did not write
    these programs; I'm just porting them to an upgraded operating
    system. Consequently, I have neither the understanding of their
    function nor the time to go in and rewrite them properly.
