job log entries - IBM AS400



Thread: job log entries

  1. job log entries

    We have a trigger on our Customer file. Whenever an update batch job
    runs over the Customer file we get tons of joblog entries with level
    10 messages regarding the trigger program.

    What controls the logging severity on a submitted job? Suppose we
    only want to log level 20+ messages? Is it the JOBD? Or one of the
    parms on SBMJOB?

    Is the trigger involved at all in controlling what log level to
    capture?

    Any suggestions as to how to get long-running batch jobs to run
    faster with trigger files?

    Much appreciated.

    Eve

  2. Re: job log entries

    The SBMJOB command has a message logging (LOG) parameter; it defaults
    to *JOBD.
    You might want to look at the LOGCLPGM parameter too.
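
    As a rough sketch of what that might look like (library, program, and
    job names here are invented for illustration), submitting with
    explicit logging values instead of *JOBD:

        /* Log level 4, severity 20 and up, with second-level text; */
        /* don't log the CL program's commands                      */
        SBMJOB CMD(CALL PGM(MYLIB/UPDCUST)) JOB(CUSTUPD) +
               LOG(4 20 *SECLVL) LOGCLPGM(*NO)

    With severity 20 in the second element, level-10 diagnostics from the
    trigger should no longer reach the joblog.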

  3. Re: job log entries

    On May 2, 2:43 pm, eknight wrote:
    > We have a trigger on our Customer file. [...]


    How about SBMJOB LOG(4 20 *SECLVL) ... ?
    You may want to experiment with the values. Obviously any CHGJOB
    LOG(...) command in the running job, or perhaps calls to some APIs,
    would have an impact on this.
    Your trigger program can be sped up in several ways, depending on
    what it's doing.
    One way to get RPG programs which are called frequently to work
    faster is to not seton LR when leaving the program. This leaves the
    files open and does not initialise variables, so some additional care
    is required. There may be a similar feature in COBOL or C, but I
    don't think so for SQL triggers. There is a downside to this in that
    anyone, say, DFUing a single record at the start of the day might
    then have the program activated and unused until they next log off,
    so you might want to control this feature with a data area, for
    example. One bit of RPG code I use at the start of such programs is
    like this:

        if %parms = 0
          seton lr
          return
        endif

    That way I can kill the program while debugging just by calling it
    with no parameters.
    You would want to be even more careful about allocating things such
    as memory which, if not deallocated, are normally cleaned up when an
    activation group or job ends; this may not be your problem in this
    case.
    Once your program is left open it starts to make sense to cache
    unchanging values, such as descriptions from tables, which otherwise
    might be accessed many times.

    HTH
    Jonathan
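
    The data-area switch mentioned above could be set up roughly like
    this (library and data-area names are invented); the trigger would
    read the area and stay resident only while it contains '1':

        /* One-character flag: '1' = stay resident, '0' = shut down */
        CRTDTAARA DTAARA(MYLIB/TRGSTAY) TYPE(*CHAR) LEN(1) VALUE('1')

        /* Tell resident trigger programs to set on LR on their next call */
        CHGDTAARA DTAARA(MYLIB/TRGSTAY) VALUE('0')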

  4. Re: job log entries

    On May 2, 10:37 am, Jonathan Bailey wrote:
    > How about SBMJOB LOG(4 20 *SECLVL) ... ? [...]

    Thank you very much. I'll definitely use some of these tips. Eve

  5. Re: job log entries

    On May 2, 10:37 am, Jonathan Bailey wrote:
    > Once your program is left open it starts to make sense to cache
    > unchanging values, such as descriptions from tables, which
    > otherwise might be accessed many times. [...]

    Can you explain further what you mean in your last paragraph about
    'Once your program is left open...'?

  6. Re: job log entries

    On May 2, 11:18 am, eknight wrote:
    > Can you explain further what you mean in your last paragraph about
    > 'Once your program is left open...'

    Our job ran 5 hours with the trigger on and was only halfway through
    the file. We took the trigger off and it ran in 2 hours. Obviously we
    need to make some changes. Thanks everyone.

  7. Re: job log entries

    On May 2, 4:18 pm, eknight wrote:
    > Can you explain further what you mean in your last paragraph about
    > 'Once your program is left open...'

    When I say left open I mean the RPG program returns with *INLR set
    off. This means the next call to the program finds all the variables,
    files etc. in the same state as before: not reinitialized, and with
    file pointers not reset to the start of file. You should probably
    close the program, by calling it and having it set on *INLR before
    returning, once you no longer need it in the open state. I made an
    invoice run drop from a couple of hours to a few minutes by using
    this method in the price program. It was called for each line of
    every invoice, and even though it did multiple chains and didn't
    cache the price lists the customer was eligible to use, keeping it
    open made a huge difference to the runtime. I think you will find the
    trigger program is locked by any job which has updated the file but
    not yet closed it, but I believe that is the database ensuring
    consistency.

    Jonathan.
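
    Given the %parms check shown earlier in the thread, a batch CL driver
    could close the resident trigger program at end of run simply by
    calling it with no parameters (program and library names here are
    placeholders):

        /* Program sees %parms = 0, sets on *INLR, and closes its files */
        CALL PGM(MYLIB/CUSTTRG)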

  8. Re: job log entries

    eknight wrote:
    > We have a trigger on our Customer file. [...]


    The job attribute you are interested in is LOG. Both the JOBD and the
    SBMJOB command have a LOG attribute/parameter; SBMJOB will override
    the JOBD value, though typically SBMJOB specifies LOG(*JOBD).

    Check the batch job's attributes to determine the JOBD that was used
    to submit it. If that JOBD doesn't have level 10 specified, then it
    is likely the program doing the SBMJOB is overriding the value, and
    you'd need to find and modify that program.

    Note that you can change the LOG level once the job is submitted
    (provided you have the authority), however that is only a temporary
    solution.
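
    For a job that is already running, the temporary change might look
    something like this (job number, user, and name are placeholders):

        /* Raise the logging threshold on an active batch job */
        CHGJOB JOB(123456/EVEUSER/CUSTUPD) LOG(4 20 *SECLVL)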

    --
    Rodney A Johnson
    Working on a new project. Former technical team Lead for i5/OS (AS/400)
    Spool
    Dept 33A
    IBM Rochester, Minnesota

    The contents of this message express only the sender's opinion.
    This message does not necessarily reflect the policy or views of
    my employer, IBM. All responsibility for the statements
    made in this Usenet posting resides solely and completely with the
    sender.

  9. Re: job log entries

    On May 2, 11:49 am, "Rodney A. Johnson" wrote:
    > Note that you can change the LOG level once the job is submitted
    > (provided you have the authority), however that is only a temporary
    > solution. [...]


    Thanks for the info. I just tried to change the message logging on
    the submitted job. The job was running an update to the Customer
    file, which has a trigger on it. I put the severity up to 50, and we
    always had *NOLIST specified anyway. Every time the trigger is
    executed, the open of the file inside the trigger program causes an
    entry in the joblog. These entries are only diagnostic level-10
    messages, so why are they appearing in the joblog when we specify
    *NOLIST?
