
console GPUs: ATI Xenos vs Nvidia RSX

Discussion in 'ATI' started by Guest, Jun 10, 2005.

  1. Guest

    Guest Guest

    http://www.bit-tech.net/bits/2005/06/10/richard_huddy_ati/1.html

    Interview: ATI and the Xbox 360
    Posted 10.06.2005 by Wil Harris

    In the first of a new series of interviews with industry luminaries, we're
    talking to ATI evangelist, Richard Huddy.

    Richard is responsible for talking to developers about ATI's technology, and
    helping them to create technology that runs great on ATI hardware. Richard
    has previously lent his incredible graphics expertise to 3Dlabs and to
    NVIDIA. Most recently, he has been involved with the development and
    promotion of the graphics sub-system in the Xbox 360, designed for Microsoft
    by ATI.


    The Xbox 360 architecture

    I began by asking Richard for his opinion on the Xbox 360 architecture. "I'm
    really impressed," he commented, "It's way better than I would have expected
    at this point in the history of 3D graphics. The unified shader architecture
    alone is capable of giving a performance increase of a factor of nearly two
    over the hardware that we have in PCs today. That's because we see many
    cases, and this is particularly true on consoles, where games are limited by
    one of the two groups of engines in the graphics chip, either the vertex
    engines or the pixel engines. With a unified pipeline we can now devote 100%
    of the hardware to whichever task is the bottleneck."
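
    To see roughly where that "factor of nearly two" could come from, here is a
    toy utilisation model. The engine counts and workload figures below are
    invented purely for illustration; they are not real Xenos numbers.

        # Toy model: a split design (fixed vertex/pixel engine groups) versus a
        # unified pool of the same total size. All figures are made up.

        def split_time(vertex_work, pixel_work, vertex_units=12, pixel_units=12):
            # Each group only handles its own kind of work, so the frame takes
            # as long as the slower (bottlenecked) group.
            return max(vertex_work / vertex_units, pixel_work / pixel_units)

        def unified_time(vertex_work, pixel_work, total_units=24):
            # A unified pool keeps every unit busy on whatever work is left.
            return (vertex_work + pixel_work) / total_units

        # A pixel-heavy frame leaves the split design's vertex engines idle.
        v, p = 40.0, 800.0
        print(split_time(v, p))    # ~66.7 time units
        print(unified_time(v, p))  # 35.0 time units -> roughly 1.9x faster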

    How does he think the sharing of memory between the graphics and the main
    memory will affect performance? Well, Richard explains that the shared
    memory is "Very different" from the technology implemented on the original
    Xbox, or even on today's PC implementations.

    "The intelligent memory gives pretty awesome speed - the bandwidth is up to
    2 Terabits per second. That kind of power is almost unimaginable. The old
    terminology of 'SMA (Shared Memory Architecture)' simply doesn't do justice
    to the flexibility and power of the Xbox 360. SMA is a term we have
    inherited from the PC and it usually has some negative connotations, but the
    Xbox 360 is really nothing like that."
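
    For context, the quick conversion of that figure (treating one terabit as
    1,024 gigabits; with decimal prefixes it comes to roughly 250 GB/s):

        # Convert the quoted "up to 2 terabits per second" into gigabytes.
        gigabits_per_second = 2 * 1024          # 2 Tbit/s, binary prefixes
        gigabytes_per_second = gigabits_per_second / 8
        print(gigabytes_per_second)             # 256.0 GB/s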


    ATI's Xenos Xbox chip

    By now, you'd have to have been hidden under a rock to have avoided learning
    the details of the ATI graphics that power the 360, dubbed Xenos. 10MB of
    embedded DRAM provides enough of a buffer to enable all 360 games to have
    anti-aliasing switched on, effectively for no performance hit. The question
    on everyone's lips is: is this something that's going to turn up on the PC
    any time soon?

    "I'd be very surprised if these hardware features were implemented on the PC
    any time soon," we're told. "Microsoft has a very specific revision of
    DirectX (or Windows Graphics Foundation) for Xbox 360, just as they did with
    Xbox 1. DirectX for the PC includes no hardware specific instructions,
    because DirectX has to be 10 times more generic to work on a PC platform and
    the myriad of hardware configurations. I don't think it will happen. Plus
    the architecture of the Xbox 360 is closed box - that means we can do
    special things there which have no comparison in the PC space.

    "We practically have AA for free on the PC anyway right now. If the
    difference between 1280x1024 with no AA and 1280x1024 with 2x AA is 90 FPS
    and 70 FPS, who wouldn't turn the AA on? The performance hit isn't going to
    be noticeable to most gamers - and with an X800 or X850 those kinds of frame
    rates are commonplace."
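
    Put in frame-time terms, using the same example numbers, the cost of 2x AA
    in that scenario is small:

        # Frame-time view of the 90 FPS vs 70 FPS example above.
        no_aa_fps, aa_fps = 90.0, 70.0
        no_aa_ms = 1000.0 / no_aa_fps   # ~11.1 ms per frame
        aa_ms = 1000.0 / aa_fps         # ~14.3 ms per frame
        print(aa_ms - no_aa_ms)                        # ~3.2 ms extra per frame
        print(100 * (no_aa_fps - aa_fps) / no_aa_fps)  # ~22% lower frame rate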



    ATI's role in Xbox 360 backwards compatibility

    One of the biggest questions has been whether or not Xbox 360 would be
    backwards compatible with original Xbox games. Recent word from Microsoft
    has been that Halo 2 will be amongst the first games to get backwards
    compatibility - but until now, no one has known exactly what that means, or
    how it will be achieved.

    What are the problems? Well, Xbox 1 games are written for Intel CPUs and
    Nvidia graphics, and graphics engines in particular use hardware specific
    instructions. Apart from the change to PowerPC hardware with the Xbox 360,
    Nvidia-specific calls have to be interpreted in a manner that the ATI
    hardware in the 360 will understand.

    Richard: "Microsoft weren't focused on hardware backwards compatibility
    early on. That wasn't in the specification. They believed that any
    compatibility they could get would come in through a software layer, and
    they didn't want to compromise this generation's hardware for the sake of
    last generation's games.

    "They have implemented compatibility purely through emulation (at the CPU
    level). It looks like emulation profiles for each game are going to be
    stored on the hard drive, and I imagine that a certain number will ship with
    the system. They already have the infrastructure to distribute more profiles
    via Live, and more and more can be made available online periodically.

    "Emulating the CPU isn't really a difficult task. They have three 3GHz
    cores, so emulating one 733MHz chip is pretty easy. The real bottlenecks in
    the emulation are GPU calls - calls made specifically by games to the Nvidia
    hardware in a certain way. General GPU instructions are easy to convert - an
    instruction to draw a triangle in a certain way will be pretty generic.
    However, it's the odd cases, the proprietary routines, that will cause
    hassle."
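
    As a purely hypothetical sketch of the idea Richard describes - none of the
    call names or the profile format below come from Microsoft's actual
    emulator - generic GPU instructions can be converted by a fixed lookup,
    while the proprietary corner cases fall back to per-game fixups loaded from
    a downloadable profile:

        # Hypothetical illustration only: invented call names and profile data.
        GENERIC_TRANSLATIONS = {
            "nv_draw_triangles": "draw_triangles",  # generic calls map 1:1
            "nv_set_texture": "set_texture",
        }

        def translate_gpu_call(call, game_profile):
            # Proprietary routines need a game-specific workaround from the
            # profile shipped with the console or downloaded via Live.
            overrides = game_profile.get("overrides", {})
            if call in overrides:
                return overrides[call]
            return GENERIC_TRANSLATIONS.get(call, "unsupported:" + call)

        halo2_profile = {"overrides": {"nv_custom_fog": "fog_workaround"}}
        print(translate_gpu_call("nv_draw_triangles", halo2_profile))  # generic
        print(translate_gpu_call("nv_custom_fog", halo2_profile))      # override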


    ATI v Nvidia: RSX, PS3 and the console wars

    With the Xbox 360 Xenos core running at 500MHz, and the PlayStation3's RSX
    graphics core running at 550MHz, the non-techie press are calling the specs
    a win for Sony. Is this really the case, though?

    Richard is adamant that the extra graphics speed on paper is more than made
    up for by the differing architecture of the Xenos. "That mere 10% clock
    speed that RSX has on Xenos is easily countered by the unified shader
    architecture that we've implemented.

    "Rather than separate pixel and vertex pipelines, we've created a single
    unified pipeline that can do both. Providing developers throw instructions
    at our architecture in the right way, Xenos can run at 100% efficiency all
    the time, rather than having some pipeline instructions waiting for others.
    For comparison, most high-end PC chips run at 50-60% typical efficiency. The
    super cool point is that 'in the right way' just means 'give us plenty of
    work to do'. The hardware manages itself."
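
    A rough back-of-the-envelope comparison of that argument, taking Richard's
    efficiency claims at face value and assuming the RSX behaves like the PC
    parts he cites (real utilisation will vary from game to game):

        # Effective shader throughput: 10% clock advantage vs claimed efficiency.
        rsx_clock, xenos_clock = 550e6, 500e6   # Hz
        rsx_eff, xenos_eff = 0.55, 1.00         # claimed typical utilisation
        print(rsx_clock * rsx_eff)      # ~3.0e8 "useful" cycles per second
        print(xenos_clock * xenos_eff)  # 5.0e8 - well ahead under this model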

    The issue of unified versus split shader pipelines is a critical one that we'll
    come back to in a moment, but I was curious as to how Richard felt the CPU
    architecture between the two consoles makes a difference to the graphics and
    overall power.

    "The PS3 does appear to have a huge amount of CPU power with the seven Cell
    cores. The problem they have is that CPU power isn't really what developers
    need - the bottleneck is really the graphics. Everybody is going
    multi-threaded and multi-core - the Xbox 360 has three PowerPC cores, AMD
    and Intel both have dual-core chips, so everyone is having to learn how to
    write this stuff. But writing multi-threaded apps for two or three cores is
    difficult. Doing it for seven separate cores, when the main core has a
    slightly different feature-set from the other six, is very, very difficult."


    Unified v separate on the PC, and Nvidia's stance

    Nvidia have previously stated in public that they do not believe that
    unified shader architectures are the way forward. Windows Graphics
    Foundation 2, the version of DirectX that will ship with Longhorn, will be
    designed around the idea that the graphics card will have unified vertex and
    pixel pipelines, but will not require that to be the case. Given that ATI is
    working with Microsoft now on unified parts on next-gen DirectX, whilst
    Nvidia is saying that it doesn't think this is the best idea, does Richard
    think that Nvidia will suffer, in the long run, on the PC platform from not
    following Microsoft?

    "I'd love to say yes. I'd love to say that Nvidia are going to be stuck when
    it comes to Longhorn. But actually I do think they will have a unified
    shader architecture by the time WGF2 comes around. This time around, they
    don't have the architecture and we do, so they have to knock it and say it
    isn't worthwhile. But in the future, they'll market themselves out of this
    corner, claiming that they've cracked how to do it best. But RSX isn't
    unified, and this is why I think PS3 will almost certainly be slower and
    less powerful.

    "Talking to the guys at Microsoft, it's impossible to escape the conclusion
    that the future is for unified pipelines, there's no doubt."

    Of course, the great news for ATI is that they'll be on the second revision
    of their unified architecture by then, just as Nvidia is getting started.


    Conclusions

    So Richard has told us some really interesting stuff. His comments about
    backwards compatibility for the 360, and the architecture of the Xenos ATI
    chip have really given us some insight that was unavailable before. His
    thoughts about the comparisons between 360 and PS3 also shed some new light
    on the differences between the two consoles, and how the technology is going
    to affect PC gamers over the coming years. We'd like to say a big thanks to
    Richard for taking the time out to chat with us, and we look forward to
    bringing you some more industry insight in a couple of weeks' time.
     
    Guest, Jun 10, 2005
    #1

  2. So in summary, ATI rep claims X360 is faster than PS3.

    How is this news?
     
    a_noether_theorem, Jun 11, 2005
    #2

  3. Guest

    Doug Guest

    No shit. How much could an ATI rep know about PS3 and Nvidia hardware? This
    is nothing but BS spin.
     
    Doug, Jun 19, 2005
    #3
  4. Guest

    msgs Guest

    The Cell inside is pretty well known by now, and the RSX is based on the
    G70, which is also pretty well known. So there isn't really anything "new"
    or "hidden" about the PS3 anymore. As the guy points out, the weakest links
    in the PS3 are probably the 256/256 memory split and the RSX, since the RSX
    wasn't really designed for a console.

    The PS3 wasn't even supposed to have a third-party GPU in the first place;
    the Cell was supposed to handle that. Then Sony found out the CPU wasn't
    really that fast (it's nowhere near as fast as the PR people would like you
    to believe), which is why Nvidia was brought in at the last minute to make
    the GPU for the PS3. That's why it's more of a hack job than a
    designed-for-PS3 chip.

    Think about it: the G70 and R520 are coming out very shortly. The G70
    benchmarks are already out; it scores around 7800 in 3DMark05. The R520 is
    supposed to score about the same. Yet ATI has stated that the R500 (the
    Xbox 360 GPU) is faster and more advanced than the R520.

    One thing to remember also is that on the 360 you get anti-aliasing for
    free (because of the 10MB of embedded RAM), while on the PS3 it will drain
    20-40% of the GPU power. Also, as the RSX is based on the G70, it doesn't
    have a unified shader architecture. The R500 can run shader ops at 95-100%
    efficiency, while the RSX runs at 50-70%.

    So please, forget about Sony's PR lies (one would think that even Sony
    fanboys would have learned by now, after all the promises made at the PS1
    and PS2 launches...remember "Toy Story in real time!!"), and try the facts
    for a change.

    If, on the other hand, you _do believe_ Sony's hype, I have a few things to
    sell you as well: how about my collection of unicorns (with elf riders!), a
    brand new Pandora's Box and a real map to El Dorado...?
     
    msgs, Jun 19, 2005
    #4
  5. Guest

    Xen0s* Guest

    "msgs" <> ha scritto nel messaggio

    Wrong.
    Several rumours on the web (see theinquirer.net) say that the R520 will
    score *more* than 10K in 3DMark05, maybe 11 or 12K, and 20K with CrossFired
    R520s; we'll have to wait to know the exact score.
     
    Xen0s*, Jun 19, 2005
    #5
  6. Guest

    MS Guest

    Still, according to ATI, the R500 will be faster and more advanced...so what
    does that mean? If it were a PC part, it would score even higher...?
     
    MS, Jun 20, 2005
    #6
  7. Guest

    Xen0s* Guest

    Yes, Xenos (R500) will be faster (IMO; ATI says that Xenos will be faster
    than any high-end video card, so both the R520 and the G70) and more
    advanced than the G70 (this is a fact).

    The G70 (aka the Nvidia 7800GTX) is very disappointing: it scores only 7600
    in 3DMark05 (about the same as an overclocked X850) and 11,200 in SLI
    (consider that an SLI pair of the old 6800 Ultras scores 10,500)
    http://www.theinquirer.net/?article=24068
     
    Xen0s*, Jun 20, 2005
    #7
  8. Guest

    Doug Guest

    Where do you get your facts from? ATI's PR dept.? According to Tom's
    Hardware, the Nvidia 7800GTX is the fastest GPU now made. Do you have any
    PROOF otherwise? Or just more BS PR spin? Let's see some proof, a$$hat.

    I'll bet 7800GTX's scores will only INCREASE as the drivers reach maturity
    as well. This has been the case for every new GPU architecture to date.

    The rumor mill (that you quote as fact) also has it that Nvidia already has
    a GPU faster than the 7800GTX ready to release.

    Do you have any 3DMark05 scores for Xenos? No? Then **** off.
     
    Doug, Aug 3, 2005
    #8
