BDOS+BIOS problem - CP/M



Thread: BDOS+BIOS problem

  1. BDOS+BIOS problem

    Hi CP/M users
    (re-posted as message was in wrong thread)

    For some time now I have been trying to use a Compact Flash card as a
    hard drive with my Interak computer, running CP/M 2.2.
    The hardware works fine, i.e. I can read and write data to the CF. The
    difficult bit has been integrating the CF into the CP/M file system so
    that it works alongside the two floppy disks.
    Initially I could not even get a C: prompt to appear on the screen. At
    present I can store files on the CF and use NewSweep to copy files
    from floppy to CF and back again. Disk utility programs can also read
    sectors and tracks OK.

    The problem I have been struggling with for some time is this: when a
    file is written to disk, the de-blocking works until the end of the
    file. At that point any data still in the host buffer (which holds
    512 bytes, i.e. 4 x 128-byte CP/M sectors) is lost when the directory
    entry is written. When reading files back, the correct size is
    reported, but the last 128 to 512 bytes are corrupt. Depending on
    where the file ended, either 1, 2, 3 or 4 records of 128 bytes are
    still sitting in the host buffer.
    It seems that while writing a file, the first indication that the
    end-of-file has been reached is when a write type 1 (write to
    directory) is sent. At that point the directory sector has already
    been copied into the host buffer, overwriting the final data.
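
    To spell out the order of operations, here is roughly what I
    understand a DR-style deblocking WRITE is meant to do, sketched in C
    purely for clarity (the real code is of course assembler, all the
    names below are invented for the sketch, and the unallocated-block
    bookkeeping is left out):

    #include <string.h>

    #define WRALL 0             /* write to allocated block         */
    #define WRDIR 1             /* write to directory               */
    #define WRUAL 2             /* write to unallocated block       */

    #define HSTSIZ 512          /* host sector size                 */
    #define RECS_PER_HOST 4     /* 128-byte records per host sector */

    /* Stand-ins for the physical CF read/write routines. */
    extern int read_host(int dsk, int trk, int sec, unsigned char *buf);
    extern int write_host(int dsk, int trk, int sec, const unsigned char *buf);

    /* Set up by SELDSK/SETTRK/SETSEC/SETDMA before WRITE is called. */
    extern int sek_dsk, sek_trk, sek_sec;
    extern unsigned char *dma_addr;

    static unsigned char hst_buf[HSTSIZ];
    static int hst_dsk, hst_trk, hst_sec;
    static int hst_valid, hst_dirty;

    int bios_write(int write_type)
    {
        int dsk = sek_dsk, trk = sek_trk, sec = sek_sec / RECS_PER_HOST;
        int same = hst_valid && dsk == hst_dsk &&
                   trk == hst_trk && sec == hst_sec;

        if (!same) {
            /* Flush pending data BEFORE the buffer is reused.  If this
               step is skipped, the last 1-4 records of a file are lost
               when the directory sector is read in over the top of them. */
            if (hst_valid && hst_dirty &&
                write_host(hst_dsk, hst_trk, hst_sec, hst_buf))
                return 1;                           /* write error */
            hst_dirty = 0;

            /* Pre-read the new host sector (a completely unallocated
               block could skip this; that optimisation is omitted). */
            if (read_host(dsk, trk, sec, hst_buf))
                return 1;
            hst_dsk = dsk; hst_trk = trk; hst_sec = sec;
            hst_valid = 1;
        }

        /* Merge the 128-byte record into the 512-byte host buffer. */
        memcpy(hst_buf + (sek_sec % RECS_PER_HOST) * 128, dma_addr, 128);
        hst_dirty = 1;

        /* Directory writes (type 1) are never deferred. */
        if (write_type == WRDIR) {
            if (write_host(dsk, trk, sec, hst_buf))
                return 1;
            hst_dirty = 0;
        }
        return 0;                                   /* success */
    }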

    I have several CP/M books but I cannot find sufficient information
    regarding this.
    One method, which I think may be the best way to go, is to read File
    Control Block 2 before BDOS starts writing. This holds record-count
    information which could be compared with the current record as it is
    written; when a match occurs the host buffer could be written out to
    the CF immediately.

    Am I on the right track here? Or is there a better way to go?

    Any help greatly appreciated, Alan


  2. Re: BDOS+BIOS problem



    I have been using a CF card with my CP/M system for several years
    without problems. The problems you describe sound like a deblocking
    algorithm bug. I am using the deblocking algorithm supplied by DR;
    are you using the same? I am not sure of the details, but some claim
    that there is a subtle bug in this code. However, I have not
    experienced any problems so far. Perhaps others can weigh in on this
    subject. I also have a copy of the DR deblocking algorithm, modified
    by John Baker, which claims to have fixed the bug.

    To make things simpler, I have set up my system so that the floppies
    and the CF both use the same physical sector size of 512 bytes. While
    it is possible to use different sizes, you need to be careful when
    you switch between the two, so I chose to avoid any problems by
    making them the same. This also helps minimize the BIOS code size.
    Before adding the CF card to my system, I had floppies with 1024-byte
    sectors.
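
    Just as a toy illustration of what the single size buys (C, invented
    names, not taken from any real BIOS): with mixed physical sector
    sizes the deblocking code needs a per-drive blocking factor, whereas
    with one size it collapses to a constant.

    /* Per-drive blocking factor: how many 128-byte CP/M records fit in
       one physical host sector on each drive.  With every drive at 512
       bytes this whole table becomes the constant 4. */
    static const int recs_per_host[] = {
        4,      /* A: floppy, 512-byte sectors  */
        4,      /* B: floppy, 512-byte sectors  */
        4,      /* C: CF card, 512-byte sectors */
    };

    /* Which host sector a CP/M logical sector falls in ... */
    int host_sector(int drive, int cpm_sector)
    {
        return cpm_sector / recs_per_host[drive];
    }

    /* ... and the byte offset of the record within that host sector. */
    int host_offset(int drive, int cpm_sector)
    {
        return (cpm_sector % recs_per_host[drive]) * 128;
    }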

    I used NZCOM to help me transition between 1024- and 512-byte floppy
    sector sizes. I modified NZBIOS by adding code to handle only
    512-byte-sector disk drives, and broke a 32 MB CF card into four 8 MB
    logical drives. I set NZBIOS up to intercept references to my second
    floppy and the CF card; all references to the first floppy were
    passed on to the "regular" BIOS. This required a separate copy of the
    deblocking code in NZBIOS as well as a separate disk buffer. Once
    this was working, I was able to copy everything from my
    1024-byte-sectored floppies to the CF card and also back them up on
    512-byte-sector floppies. You would not want to operate this way
    permanently, because you lose a substantial amount of TPA to the
    redundant disk BIOS in NZBIOS, but it's fine for just copying
    floppies to the CF card. When the backup operation was complete, I
    modified my system to boot from 512-byte sectors and moved the disk
    routines from NZBIOS into the regular BIOS.
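
    For the logical drives, one common way to carve a single CF card into
    fixed-size partitions (just the general idea, not necessarily how
    NZBIOS does it) is to add a per-drive offset when working out the
    absolute 512-byte sector handed to the CF driver:

    #define HSTSIZ         512
    #define DRIVE_SIZE_MB  8
    #define SECS_PER_DRIVE ((DRIVE_SIZE_MB * 1024L * 1024L) / HSTSIZ)
    #define SECS_PER_TRACK 64     /* whatever your DPB uses */

    /* Absolute CF sector for (logical drive 0-3, track, host sector). */
    long cf_sector(int drive, int track, int sec)
    {
        return (long)drive * SECS_PER_DRIVE
             + (long)track * SECS_PER_TRACK
             + sec;
    }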

    Lars


  3. Re: BDOS+BIOS problem



    One thing you will want to check, if you are doing this on your own
    too, is to be certain that you do a read, then modify, then write
    back the new sector. If you do that for every cycle it will be slow,
    but it ensures that the data is correct. For a cache, I would use a
    pair of "ping-pong" buffers: that is, use two sector buffers, keep
    track of which one holds which physical sector, and simply write out
    the older of the two when you need a new one. A rough sketch of the
    idea is below.
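
    Something along these lines (purely a sketch with made-up names;
    read_sector and write_sector stand in for whatever the CF driver
    actually provides):

    #include <string.h>

    #define HSTSIZ 512

    extern int read_sector(int sector, unsigned char *buf);
    extern int write_sector(int sector, const unsigned char *buf);

    struct hostbuf {
        int           sector;    /* physical sector held, -1 = empty */
        int           dirty;     /* modified since last write-back   */
        unsigned long age;       /* last time this buffer was used   */
        unsigned char data[HSTSIZ];
    };

    static struct hostbuf buf[2] = { { -1, 0, 0 }, { -1, 0, 0 } };
    static unsigned long  clock_tick;

    /* Return a buffer holding 'sector', reading it in if necessary.
       When neither buffer matches, the older of the two is flushed
       (if dirty) and reused - the "ping-pong" part. */
    static struct hostbuf *get_sector(int sector)
    {
        struct hostbuf *b;

        if (buf[0].sector == sector)      b = &buf[0];
        else if (buf[1].sector == sector) b = &buf[1];
        else {
            b = (buf[0].age <= buf[1].age) ? &buf[0] : &buf[1];
            if (b->dirty && b->sector >= 0)
                write_sector(b->sector, b->data);
            read_sector(sector, b->data);      /* read before modify */
            b->sector = sector;
            b->dirty  = 0;
        }
        b->age = ++clock_tick;
        return b;
    }

    /* Merge one 128-byte CP/M record into the cached host sector. */
    void put_record(int sector, int offset, const unsigned char *rec)
    {
        struct hostbuf *b = get_sector(sector);
        memcpy(b->data + offset, rec, 128);
        b->dirty = 1;
    }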

    Hope this helps.
