Duplicate Job Question - Veritas Backup Exec



Thread: Duplicate Job Question

  1. Duplicate Job Question

    I have v9.0 running on a library with 2 drives. I back up 11 different
    servers and have decided to run 11 different backup jobs, one for each
    server. Because of the 100 Mbit limit of my NIC, I only back up to one
    drive for all of the jobs. If I try to run two jobs simultaneously to
    both drives, I end up saturating the NIC and the second drive only gets a
    throughput of 1 or 2 MB a minute, which makes it basically worthless. I
    stagger the start times of the jobs to be approximately 1 hour before the
    end of the previous job, just in case the first job runs quicker than
    expected. I have also set up 11 duplicate backup jobs to run after the
    completion of each of the main jobs.

    Is there any way to force the duplicate jobs to wait until all the main
    jobs have completed before they start? The reason I ask is that if a job
    runs quicker than expected and the library goes idle, one of the duplicate
    jobs will kick off and slow down my whole backup schedule. I only have a
    limited amount of time to get the main backups done in. The duplicates can
    run anytime.

    I have thought about changing to duplicating sessions, but that means
    manually scheduling each week, which I would rather avoid. I suppose I
    could go back to two jobs for each server, like in version 8.6, but that
    still puts me up against my time window.

    Any suggestions would be greatly appreciated.



  2. Re: Duplicate Job Question

    You can use the BEMCMD utility to call the duplicate jobs as "Post Jobs"
    on the last regular job that runs.

    You can do the same thing to release each regular job when the previous
    one finishes.

    I am a little curious about something you said, though, about "saturating
    the NIC". I used to have a dedicated backup server (PII 450, 640 MB RAM,
    two dedicated SCSI channels with two DLT4000 drives on each, and
    100 Mbit/Full Duplex). I saw only a very small drop in throughput for a
    given job even when all four drives were running at the same time, all
    doing network backups.

    Sounds like the library may be misconfigured. You're on the latest
    firmware and drivers for the library and tape drives, right? I prefer
    the manufacturer's drivers to those from Veritas or MS.

    "Greg Pagan" wrote in message
    news:3ffd9e64@ROSASTDMZ05....



  3. Re: Duplicate Job Question

    I will check out the documentation on BEMCMD to see how I would use this.
    I am not familiar with the command line, or with post commands. I must
    confess that for this application I am GUI-only.

    Now, as far as the saturation goes: I was told by someone here at my
    company that my new SDLT drives have the ability to read more data than my
    100 Mbit card can supply, hence my saturation comment. The person making
    the comment has never led me astray before, so I just believed him. I do
    think I will contact Compaq, though, just to see if I can get them to walk
    me through the configuration and make sure I did not mess something up,
    which is a distinct possibility. Yes, I am running the latest firmware for
    both the library and the drives, and yes, I am running Compaq drivers, but
    that just happened today. I had been using Veritas drivers prior to this.

    Thanks for your post.

    Greg

    "ken putnam" wrote in message
    news:3ffda2a3@ROSASTDMZ05....




  4. Re: Duplicate Job Question

    AFAIK, about all the documentation for BEMCMD is in the help file. From
    the ...\Utils dir (I think), do

    "BEMCMD /? > BEMCMD.TXT"

    This will redirect the help output to the named text file. Once you play
    around with it until you get the command-line parameters the way you want
    them, then from the Job Description for the appropriate job, enter the
    command line in the POST JOB COMMAND box.

    I usually use a CMD file, called "Release Job2.CMD" or whatever, that
    looks something like this:

    c:
    cd "\Program Files\Veritas\BackupExec\NT\Utils"
    BEMCMD -xxx -parm1 -parm2

    (Change the cd line to wherever BEMCMD lives, since there is no path
    defined when the Post Job Command executes.)

    Don't remember off the top of my head what the option to release a job
    is, but it's in the help.
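
    A fleshed-out sketch of that CMD file might look like the following.
    Note that <release-switch> and the job name "Duplicate Job 2" are
    placeholders, not real BEMCMD options; the actual switch for releasing a
    held job has to come from the "BEMCMD /?" output, and the install path
    may differ on your server:

    @echo off
    rem Release Job2.CMD - set as the Post Job Command of the preceding job.
    rem PLACEHOLDERS: replace <release-switch> and the job name with the
    rem real BEMCMD option and your duplicate job's name.
    c:
    rem No PATH is defined when a Post Job Command runs, so change to the
    rem directory where BEMCMD.EXE actually lives.
    cd "\Program Files\Veritas\BackupExec\NT\Utils"
    BEMCMD <release-switch> "Duplicate Job 2"
    rem Hand BEMCMD's exit code back so a failure is visible in the job log.
    exit %ERRORLEVEL%

    Chaining the jobs this way means the library never sits idle waiting on a
    schedule, but a duplicate can't jump in ahead of the remaining main jobs.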


    "Greg Pagan" wrote in message
    news:3ffdb6ed@ROSASTDMZ05....




  5. Re: Duplicate Job Question

    Thanks for the heads-up on the documentation. I will be looking it over
    this weekend to see if I can make heads or tails of it.

    Greg


    "ken putnam" wrote in message
    news:3ffdbacf@ROSASTDMZ05....



