thoughts on ATi graphics in SGI UltimateVision - SGI



  1. thoughts on ATi graphics in SGI UltimateVision

    Just a few random thoughts on the ATi graphics cores in SGI's new Onyx4
    UltimateVision subsystems.

    These are meant to be the ATi R300 / R350, or some R3XX core, are they not?


    It has been well known since R300's introduction in mid-2002 that the chip
    can be used in parallel: up to 256 R300 VPUs! That is an awesome number.
    I've tried to imagine what kind of graphics could be done with 256 R300
    cores (or the newer R350 / R360).

    How many cores will the highest-end UltimateVision systems use? I thought
    it might be 16 or 32. Also, is that per graphics pipe, or for the whole
    system? Perhaps SGI will somehow take advantage of the maximum of 256 in
    parallel.
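    As a back-of-envelope illustration of why those VPU counts are exciting, here is a rough peak fill-rate estimate. The per-chip numbers are my own approximations (a Radeon 9700-class R300 has 8 pixel pipes at roughly 325 MHz), and the linear-scaling assumption is obviously optimistic:

    ```python
    # Rough, optimistic scaling estimate -- my own numbers, not SGI specs.
    # One R300: 8 pixel pipes x ~325 MHz ~= 2.6 Gpixels/s peak fill rate.
    R300_PIPES = 8
    R300_CLOCK_HZ = 325e6
    peak_fill_per_vpu = R300_PIPES * R300_CLOCK_HZ  # ~2.6e9 pixels/s

    for n_vpus in (16, 32, 256):
        # assume (unrealistically) perfect linear scaling across VPUs
        aggregate = n_vpus * peak_fill_per_vpu
        print(f"{n_vpus:3d} VPUs: ~{aggregate / 1e9:.1f} Gpixels/s peak")
    ```

    Real-world scaling would of course be limited by compositing and interconnect overhead, but even a fraction of the 256-VPU figure dwarfs anything IR could do.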

    Certainly there will be some things the older InfiniteReality subsystems
    could do that the R3XX-based UltimateVision won't be able to, but SGI is
    finally moving to modern hardware, with far more geometry and pixel
    throughput as well as current shader technology. I cannot wait to see what
    can be done with R3XXs in parallel, backed by SGI's unmatched system, bus,
    and memory architecture.

    Does anyone know the specific ATi core going into UltimateVision? Perhaps
    there is more than one type, for different configurations.

    I was also wondering whether CGI-level graphics could be rendered in real
    time (I mean 30-60 fps) on UltimateVision, or on several UltimateVision
    systems working together. I don't understand how the thing will scale the
    way IR and RE did.
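    One common way multi-pipe systems scale is sort-first decomposition: each pipe renders its own region of the screen and the results are composited. The sketch below is purely illustrative (a hypothetical scanline-band split, not SGI's actual compositor):

    ```python
    # Minimal sort-first screen-subdivision sketch: divide the framebuffer
    # into horizontal bands, one per graphics pipe. Hypothetical code, not
    # how Onyx4 or IR actually distributes work.
    def split_screen(height, n_pipes):
        """Assign each pipe a half-open [start, end) band of scanlines."""
        base, extra = divmod(height, n_pipes)
        bands, start = [], 0
        for pipe in range(n_pipes):
            rows = base + (1 if pipe < extra else 0)  # spread the remainder
            bands.append((start, start + rows))
            start += rows
        return bands

    # e.g. a 1080-line frame across 16 pipes: 16 bands covering every row once
    print(split_screen(1080, 16))
    ```

    The appeal of this style of decomposition is that each pipe only needs the geometry that falls in its band, so fill rate scales close to linearly when the load is evenly spread.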

    Sony attempted real-time CGI with their GSCube, a render box using
    PlayStation 2 components in parallel (16 Emotion Engines, 16 Graphics
    Synthesizer I-32s) with 2.5 GB of RAM (2 GB main memory, 512 MB eDRAM).
    They achieved real-time versions of scenes from several CGI films;
    admittedly some of the simpler scenes, with reduced complexity and reduced
    image quality, but it was fairly impressive.
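    The memory figures above check out if each of the 16 nodes carries 128 MB of main memory, and each GS I-32 carries 32 MB of embedded DRAM (the per-node breakdown is my inference from the totals):

    ```python
    # Sanity-checking the GSCube memory totals quoted above.
    # Assumed breakdown: 16 nodes x 128 MB main RAM, 16 GS I-32s x 32 MB eDRAM.
    n_nodes = 16
    main_mb = n_nodes * 128   # 2048 MB = 2 GB main memory
    edram_mb = n_nodes * 32   # 512 MB of embedded DRAM
    total_gb = (main_mb + edram_mb) / 1024
    print(total_gb)  # prints 2.5
    ```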

    No doubt SGI could get far greater results with their machines and ATi's
    R3XX graphics processors (which were designed by the ArtX team, which came
    out of SGI anyway).

    With the right tools, software, APIs, etc., do you think it would be
    possible to get film-quality CGI out of 256 ATi R3XXs in parallel, of,
    say, Antz quality? If not film-grade CGI, then maybe the CGI used in 1990s
    television shows (i.e., low-end CGI)?

    Even a single R300-based Radeon 9700 card was able to render a decent
    real-time version of low-budget CGI: the Animusic demo from R300's launch.
    Pretty impressive, in my opinion, and other CGI-like demos have been run
    on just one R300.

    Then of course there are ATi's upcoming R420 and the even more powerful
    R500, due in 2004 and 2005 respectively. The mind boggles at what SGI
    could do with those monsters in parallel. If we stretch things a little,
    can you see UltimateVision's successor in 2005-2006, using parallel R500s,
    eventually doing, say, five-year-old film CGI in real time? I would think
    an array of 256 R500s might manage it, perhaps even a smaller number of
    cores, depending on how powerful the R500 turns out to be. Obviously it
    would take a lot of work to design real-time visuals of CGI quality; if
    R500 had embedded eDRAM, that might help. And obviously Sony will try
    something more ambitious and more serious than GSCube, using parallel PS3
    components to claim graphics supremacy. SGI + ATi can do better.

    Obviously SGI visualization systems are used for other purposes too:
    simulators, complex scientific problems, military applications, and
    anything else that requires real-time visualization.

    thanks for reading
    ~Jeff




  2. Re: thoughts on ATi graphics in SGI UltimateVision

    Silicon Graphics mentioned that if Nvidia were the better solution in a
    year's time, SGI could easily switch over.


    "Jeff" wrote in message news:...
    > just a few random thoughts on ATi graphics cores in SGI's new Onyx4
    > UltimateVision subsystems.
    >
    > [rest of quoted post snipped]

