Historical reason for implicit VR? - DICOM


Thread: Historical reason for implicit VR?

  1. Historical reason for implicit VR?

    I have great difficulty explaining why there are both implicit and
    explicit VR transfer syntaxes in DICOM. This really only adds
    complexity and causes problems, with no real benefit. But there must
    be a reason why someone put it there in the first place. Does anyone
    know?

    I looked in a copy of the DICOM standard from 1993 and it was there
    too, so it might be inherited from ACR-NEMA 2.0?

  2. Re: Historical reason for implicit VR?

    The ACR-NEMA work started more than 20 years ago,
    before the age of the personal computer. (Part 9 of the
    Standard, retired in 2001, was a 50-pin electrical interface
    specification.) CPU, memory and bandwidth were
    expensive resources. Saving some memory and bandwidth
    was a motivation for the implicit transfer syntax.

    Far more annoying in my experience is the presence of
    both little and big endian transfer syntaxes. I would love
    to see DICOM retire the big endian transfer syntax.

    - Doug

    "Krister Valtonen" wrote in message
    news:9538c5c9.0410072336.5041e88e@posting.google.c om...
    >I have great difficulties explaining why there is both implicit and
    > explict VR transfer syntaxes in DICOM. This really only adds
    > complexity and cause for problems and no real benefit. But there must
    > be a reason why someone put it there in the first place. Does anyone
    > know?
    >
    > I looked in a copy of the DICOM standard from 1993 and it was there
    > too, so it might be inherited from ACR-NEMA 2.0?




  3. Re: Historical reason for implicit VR?

    Hello all

    I agree with Doug about the retirement of Big Endian Explicit. I've
    never seen this transfer syntax used, except in some private Transfer
    Syntaxes.
    I also agree that Explicit is far better than Implicit today.

    However, why did ACR-NEMA #1 and #2 in the early 80s prefer Little
    Endian over Big Endian? That is not the logical byte ordering.

    As far as I know, only Intel and some Digital Equipment (DEC) platforms
    natively use Little Endian. Others are mainly based on Big Endian
    encoding.

    But when ACR-NEMA #1 appeared, Intel platforms were not as popular
    as they are today.
    So why such a choice?
    Could some ACR-NEMA/DICOM historians help us with this? :-))

    Gilles

    "Doug Sluis" wrote in message news:...
    > The ACR-NEMA work started more than 20 years ago,
    > before the age of the personal computer. (Part 9 of the
    > Standard, retired in 2001, was a 50pin electrical interface
    > specification.) CPU, memory and bandwidth were
    > expensive resources. Saving some memory and bandwidth
    > was a motivation for implict transfer syntax.
    >
    > Far more annoying in my experience is the presence of
    > both little and big endian transfer syntaxes. I would love
    > to see DICOM retire the big endian transfer syntax.
    >
    > - Doug
    >
    > "Krister Valtonen" wrote in message
    > news:9538c5c9.0410072336.5041e88e@posting.google.c om...
    > >I have great difficulties explaining why there is both implicit and
    > > explict VR transfer syntaxes in DICOM. This really only adds
    > > complexity and cause for problems and no real benefit. But there must
    > > be a reason why someone put it there in the first place. Does anyone
    > > know?
    > >
    > > I looked in a copy of the DICOM standard from 1993 and it was there
    > > too, so it might be inherited from ACR-NEMA 2.0?


  4. Re: Historical reason for implicit VR?

    I don't think it's as much of a question about ACR-NEMA history
    as it is about the computing industry in general.

    The DEC PDP, LSI-11 and VAX architectures were extremely popular
    back then. Also, much of the early development in medical imaging
    relied on developing special computing hardware, because the
    commercial off-the-shelf technology was inadequate for moving
    and processing large images. The Intel microprocessors and DEC
    LSI-11 micro and mini platforms were very well suited for
    engineering embedded solutions for medical imaging.

    So there were plenty of individuals more accustomed to and
    inclined toward little-endian. The big-endian folks were
    probably those who favored the Motorola 68000.

    - Doug

    "Gilles Mevel" wrote in message
    news:36f34f80.0410090915.5f329286@posting.google.c om...
    > Hello all
    >
    > I agree with Doug about the retirement of Big Endian Explicit. I've
    > never seen this transfer syntax used, except in some private Transfer
    > Syntaxes.
    > I agree also that Explicit is far better than Implicit today.
    >
    > However, why ACR-NEMA #1 and #2 in the early 80's preferred Little
    > Endian to Big Endian ? This is not the logical byte ordering.
    >
    > As far as I know, only INTEL and some DIGITAL Eqt. platforms natively
    > work using Little Endian. Others are mainly natively based on Big
    > Endian encoding.
    >
    > But when ACR-NEMA #1 appeared, INTEL platforms were not so popular as
    > today.
    > So why such a choice ?
    > Could some ACR-NEMA/DICOM historians help us about this ? :-))
    >
    > Gilles
    >
    > "Doug Sluis" wrote in message
    > news:...
    >> The ACR-NEMA work started more than 20 years ago,
    >> before the age of the personal computer. (Part 9 of the
    >> Standard, retired in 2001, was a 50pin electrical interface
    >> specification.) CPU, memory and bandwidth were
    >> expensive resources. Saving some memory and bandwidth
    >> was a motivation for implict transfer syntax.
    >>
    >> Far more annoying in my experience is the presence of
    >> both little and big endian transfer syntaxes. I would love
    >> to see DICOM retire the big endian transfer syntax.
    >>
    >> - Doug
    >>
    >> "Krister Valtonen" wrote in message
    >> news:9538c5c9.0410072336.5041e88e@posting.google.c om...
    >> >I have great difficulties explaining why there is both implicit and
    >> > explict VR transfer syntaxes in DICOM. This really only adds
    >> > complexity and cause for problems and no real benefit. But there must
    >> > be a reason why someone put it there in the first place. Does anyone
    >> > know?
    >> >
    >> > I looked in a copy of the DICOM standard from 1993 and it was there
    >> > too, so it might be inherited from ACR-NEMA 2.0?




  5. Re: Historical reason for implicit VR?

    Hi Doug,

    > The ACR-NEMA work started more than 20 years ago,
    > before the age of the personal computer. (Part 9 of the
    > Standard, retired in 2001, was a 50-pin electrical interface
    > specification.) CPU, memory and bandwidth were
    > expensive resources. Saving some memory and bandwidth
    > was a motivation for the implicit transfer syntax.


    From the bandwidth point of view, the Implicit VR transfer syntax is
    not "cheaper" than the Explicit VR transfer syntax. While the implicit
    transfer syntax indeed saved the 2 bytes of the VR, it spent them on
    the length field, which is always 4 bytes instead of the 2 bytes the
    explicit VR transfer syntax uses for most VRs.

    From the memory point of view, the implicit VR transfer syntax not
    only didn't save anything, it required far more memory than the
    explicit transfer syntax, since it required each implementation to
    keep the entire DICOM tag dictionary in memory.
    The explicit transfer syntax removes the need to keep a relatively
    big dictionary in memory.
    I believe that in those times a dictionary of a few thousand tags (at
    least 5 bytes per entry: tag + VR code) was quite problematic.

    Also from the CPU point of view, searching for the VR in a dictionary
    is more expensive than reading two bytes from a stream.
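
    To make that trade-off concrete, here is a minimal Python sketch (my
    own illustration, not something from this thread) of how a reader
    decodes a data element header under each transfer syntax for the
    common short-form VRs. The function names and the two dictionary
    entries are mine, chosen only for illustration of the full data
    dictionary that the implicit form depends on.

        import struct

        # Hypothetical two-entry fragment of the DICOM data dictionary.
        # An implicit VR reader needs the whole dictionary to learn the VR;
        # an explicit VR reader does not.
        IMPLICIT_VR_DICT = {
            (0x0010, 0x0010): "PN",  # Patient's Name
            (0x0028, 0x0010): "US",  # Rows
        }

        def read_implicit_le(buf, offset=0):
            """Implicit VR Little Endian: tag (2+2 bytes) + 4-byte length = 8-byte header.
            The VR is not in the stream; it comes from the dictionary."""
            group, elem, length = struct.unpack_from("<HHI", buf, offset)
            vr = IMPLICIT_VR_DICT.get((group, elem), "UN")
            value = buf[offset + 8 : offset + 8 + length]
            return (group, elem), vr, value

        def read_explicit_le_short(buf, offset=0):
            """Explicit VR Little Endian, short form: tag (2+2) + VR (2) + length (2),
            also an 8-byte header (OB, OW, SQ and a few others use 12 bytes instead).
            No dictionary lookup is needed to learn the VR."""
            group, elem, vr, length = struct.unpack_from("<HH2sH", buf, offset)
            value = buf[offset + 8 : offset + 8 + length]
            return (group, elem), vr.decode("ascii"), value

        # Rows = 512 encoded both ways: the headers end up equally long,
        # which is exactly the bandwidth point made above.
        implicit = struct.pack("<HHI", 0x0028, 0x0010, 2) + struct.pack("<H", 512)
        explicit = struct.pack("<HH2sH", 0x0028, 0x0010, b"US", 2) + struct.pack("<H", 512)
        print(read_implicit_le(implicit))          # ((40, 16), 'US', b'\x00\x02')
        print(read_explicit_le_short(explicit))    # ((40, 16), 'US', b'\x00\x02')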

    > Far more annoying in my experience is the presence of
    > both little and big endian transfer syntaxes. I would love
    > to see DICOM retire the big endian transfer syntax.


    SunOS/Solaris on SPARC processors, Java and, if I'm not wrong, also
    MacOS on Motorola processors natively use Big Endian.
    I am aware of quite a few PACS implementors for Sun computers that use
    the Explicit VR Big Endian transfer syntax as their preferred transfer
    syntax, since it spares them endless and unnecessary conversions
    between big endian and little endian and vice versa.
    IMHO, attempts at retiring the big endian transfer syntax will be
    blocked by these implementors.

    Regards,
    Leon


  6. Re: Historical reason for implicit VR?

    Hi Leon,

    I don't see that there is anything to lose by deprecating big endian.
    Even if you are a vendor using big endian architectures, you
    are better off eliminating the big endian transfer syntax.
    So I don't expect vendors to oppose this. (I've queried DICOM
    representatives about this. Many are strongly in favor of it.
    And some aren't sure, or don't care much. No one dismissed the idea.)

    Here's the reasoning:

    1) There is no interoperability impact if DICOM deprecates big endian.
    Big endian is merely an option in the association negotiation.
    DICOM insists on offering implicit little endian. IHE profiles
    call for explicit little endian. (The transfer syntax UIDs involved
    are sketched below, after this list.)

    2) There is no motivation to accept big-endian.
    (Perhaps big-endian might have tiny performance advantages between
    big-endian systems but this seems like a weak argument.)

    3) The reality is that many products do not support big endian, even
    when the system platform is big endian. There is strong motivation to
    avoid big endian because of 1) the extra cost to test, and 2) the
    unnecessary risk of failure.
    I have seen big endian implementations fail to correctly swap bytes
    in specific cases. Worse, vendors often do not adequately test big
    endian (or forget to test at all). Therefore, even if you have gone
    through the extra cost in time and effort to correctly support big
    endian, there is still vulnerability to failure due to bad
    implementations in the field. Why bother?

    As an implementer you're better off avoiding big endian altogether.
    And DICOM will be better off without it as well.
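
    For reference (a sketch of mine, not part of Doug's post), these are
    the standard transfer syntax UIDs involved in that negotiation; the
    variable names and the proposed preference list are purely
    illustrative:

        # Standard DICOM transfer syntax UIDs (defined in PS3.5/PS3.6):
        IMPLICIT_VR_LE = "1.2.840.10008.1.2"    # default; every implementation must support it
        EXPLICIT_VR_LE = "1.2.840.10008.1.2.1"  # what IHE profiles call for
        EXPLICIT_VR_BE = "1.2.840.10008.1.2.2"  # the syntax Doug would like to see retired

        # A hypothetical preference list an SCU might propose in a presentation
        # context: explicit little endian first, implicit little endian as the
        # guaranteed fallback, and big endian simply left out -- which is why
        # dropping it has no interoperability impact.
        proposed_transfer_syntaxes = [EXPLICIT_VR_LE, IMPLICIT_VR_LE]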

    Yes, you're right that the 4-byte value length eliminates the
    compactness advantage for nearly every VR. I seem to recollect
    a difference in the early ACR-NEMA but I'm probably mistaken.
    However, not every type of application necessarily needs to know
    the VR, so this is not necessarily a runtime memory penalty.

    - Doug

    "leon" wrote in message
    news:1097595658.567759.62010@f14g2000cwb.googlegro ups.com...
    > Hi Doug,
    >
    >> The ACR-NEMA work started more than 20 years ago,
    >> before the age of the personal computer. (Part 9 of the
    >> Standard, retired in 2001, was a 50pin electrical interface
    >> specification.) CPU, memory and bandwidth were
    >> expensive resources. Saving some memory and bandwidth
    >> was a motivation for implict transfer syntax.

    >
    >>From the bandwidth point of view, the Implicit VR transfer syntax is

    > not "cheaper" than the explicit VR transfer syntax. While the implicit
    > transfer syntax indeed saved the 2 bytes of the VR, it spent them on
    > the length field that is always 4 bytes, instead of 2 bytes in explicit
    > VR transfer syntax for most VRs.
    >
    >>From the memory point of view the implicit VR transfer syntax not only

    > that didn't save anything, but it required far more memory then the
    > explicit transfer syntax, since it required each implementation to keep
    > the entire Dicom tags dictionary in the memory.
    > Explicit transfer syntax can save the need to keep a relatively big
    > dictionary in the memory.
    > I believe that in that times a dictionary of a few thousands tags (at
    > least 5 bytes each entry: tag+VR code) was quite problematic.
    >
    > Also from the CPU point of view, searching for VR in a dictionary is
    > more expensive than reading two bytes from a stream.
    >
    >> Far more annoying in my experience is the presence of
    >> both little and big endian transfer syntaxes. I would love
    >> to see DICOM retire the big endian transfer syntax.

    >
    > SunOS/Solaris on sparc processor, Java and if I'm not wrong also MacOS
    > on Motorola processor use natively Big Endian.
    > I am aware of not few PACS implementors for SUN computers, that use the
    > explicit VR Big Endian transfer syntax as their preferred transfer
    > syntax, since it exempts them of endless & unnecessary conversions
    > between big endian to little endian & vice versa.
    > IMHO, attempts of retiring the big endian transfer syntax will be
    > blocked by these implementors.
    >
    > Regards,
    > Leon
    >




  7. Re: Historical reason for implicit VR?

    "Doug Sluis" wrote in message
    news:IAy9d.3$y77.0@trnddc05...
    > The ACR-NEMA work started more than 20 years ago,
    > before the age of the personal computer. (Part 9 of the
    > Standard, retired in 2001, was a 50-pin electrical interface
    > specification.) CPU, memory and bandwidth were
    > expensive resources. Saving some memory and bandwidth
    > was a motivation for the implicit transfer syntax.


    And as I recall the interface so specified was in fact a DEC DR11-W
    interface, which perhaps further explains the big-endian stuff... DEC was
    big-endian.

    I was once told (I forget by whom) that under the Reagan administration
    the whole idea of standards was deprecated by the government except
    insofar as they related to safety, so ACR-NEMA 2.0 pretended to be a
    specification for an electrical connection (basically just a
    plug/socket) because that was less controversial than a specification
    of an entire data exchange protocol. This may of course be BS.



  8. Re: Historical reason for implicit VR?

    "Stephen J. Hart" wrote in message news:...
    > I was once told (forget by whom) that under the Regan administration the
    > whole idea of standards was deprecated by the government except in so far as
    > they related to safety, so ACR-NEMA 2.0 pretended to be a specification for
    > an electrical connection (basically just a plug/socket) because that was
    > less controversial than a specification of an entire data exchange protocol.
    > This may of course be BS.


    Interesting. Somewhat wandering off topic, but the anecdote regarding
    standards seems to be a little distorted but essentially on target.
    During the Reagan administration there was a protocol, sponsored by
    NEMA as I recall, for electrical appliances to be able to exchange
    information over the power lines. Examples cited for its usage were
    things like clock synchronization, so your VCR wouldn't be blinking
    12:00 12:00 12:00 after every power failure (this was a major consumer
    problem during the 80's). Neither would you have to reset all the
    clocks in the house for springing into or falling back from daylight
    saving time. The appliances would be able to receive a time
    synchronization signal from a master source. Sounds an awful lot like
    NTP. Anyway, for some reason this protocol needed anti-trust or
    anti-collusion legislation for the different manufacturers to cooperate
    in implementing it. Reagan vetoed the legislation, over the objections
    of the major electrical device manufacturers, because he viewed it as
    an unfunded mandate from the government to private industry.

  9. Re: Historical reason for implicit VR?

    Hi Gilles

    Gilles Mevel wrote:

    > However, why did ACR-NEMA #1 and #2 in the early 80s prefer Little
    > Endian over Big Endian? That is not the logical byte ordering.


    ACR-NEMA expressed no preference in this regard, because it was
    a 16 bit parallel wire protocol.

    So indeed the packing of byte strings and 32-bit words into 16 bits
    for exchange had to be defined, and it was defined as follows:

    - when a word consists of two ASCII characters, the first character
    shall appear in the least significant byte

    - in a double precision 32-bit binary integer, the least significant
    word shall be transmitted first across the interface

    Why did ACR-NEMA choose to put the first character in the low byte
    of the word? I do not know.

    > As far as I know, only Intel and some Digital Equipment (DEC) platforms
    > natively use Little Endian. Others are mainly based on Big Endian
    > encoding.
    >
    > But when ACR-NEMA #1 appeared, Intel platforms were not as popular
    > as they are today.
    > So why such a choice?
    > Could some ACR-NEMA/DICOM historians help us with this? :-))


    As DICOM was being introduced, the network transfer protocol (Part 8)
    was ready before the IODs, and in the first demonstration at RSNA,
    pre-DICOM IODs were exchanged using the new DICOM protocol. As I
    understand it, these pre-DICOM, ACR-NEMA-based IODs had been encoded
    in various vendors' proprietary protocols as implicit VR, and explicit
    VR had not yet been finalized.

    Indeed, ACR-NEMA had no VRs per se, or at least not the same as
    those used in DICOM. Nor did it have IODs, just lists of data
    elements that might or might not be relevant to a particular
    modality.

    Why the implicit VR default form was little rather than big endian
    I am not exactly sure.

    However, if you think about it, since ACR-NEMA specified the
    packing of the first character of a string in the least significant
    byte of the 16-bit word, for strings to be encoded in character
    order one would have to encode 16-bit ACR-NEMA as little endian,
    so perhaps this was the reason for the decision.

    I.e. perhaps the decision had nothing to do with a preferred platform
    but was purely a legacy ACR-NEMA issue.
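
    A tiny Python illustration of that argument (my addition, not David's):
    pack two characters with the first in the least significant byte, and
    only a little-endian serialization keeps them in reading order.

        import struct

        # ACR-NEMA packing rule: first character in the least significant byte.
        word = ord("A") | (ord("B") << 8)   # 0x4241

        print(struct.pack("<H", word))      # b'AB' -- little endian preserves character order
        print(struct.pack(">H", word))      # b'BA' -- big endian makes strings look byte-swapped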

    Many (but not all) of the pre-DICOM ACR-NEMA-based file formats
    I have encountered (like SPI) seem to have been encoded on disk
    as little endian, though some are big-endian and hence the
    strings appear byte-swapped when dumped.

    I am told that before, during or after the first RSNA demonstration
    there was an effort to make the explicit VR form the default rather
    than the implicit VR form, but this was not successful, presumably
    to protect prototype or legacy implementations that were soon going
    to become product. I.e. requiring too much effort might have put DICOM
    at risk of not being adopted at all.

    As far as also offering big endian is concerned, as you no doubt recall,
    in those days folks had performance concerns about byte-swapping bulk
    data to their native architecture. GE even went so
    far as to have its own private transfer syntax that was explicit
    VR little endian for everything except the pixel data, which was
    big endian!

    David




  10. Re: Historical reason for implicit VR?

    Hi,

    The real reason had to do with DMA transfers and 16-bit interfaces
    on ISA and Q-bus systems. If you transmitted the first character in
    the LSB, it ended up in the correct memory location when you performed
    a DMA transfer directly from the interface. Otherwise you needed a
    register to do the swap. Pretty stupid reason nowadays, but it was
    considered prudent in 1985.

    dee
    ;-D

  11. Re: Historical reason for implicit VR?

    Hi Stephen

    Sorry, but most DEC stuff was Little Endian in the 80's. I don't know
    about later, although some DEC Alpha configurations were also Big Endian.

    I started my professional computing career on a DEC VAX machine and
    had enough trouble with this to forget it...
    I have found some interesting sites about this here:
    http://mindprod.com/jgloss/endian.html
    http://en.wikipedia.org/wiki/Little_...s_in_computers

    I don't know if the information they contain is reliable, but from a
    historical point of view, it may be quite interesting.

    Not so important in this very interesting historical reminder, I
    agree.
    Sorry :-(

    Gilles


    "Stephen J. Hart" wrote in message
    [...]
    > And as I recall the interface so specified was in fact a DEC DR11-W
    > interface, which perhaps further explains the big-endian stuff... DEC was
    > big-endian.

    [...]
