
Nvidia, ATI/AMD talk about GPU architectures for future consoles

Discussion in 'Nvidia' started by R600, Oct 13, 2007.

  1. R600

    R600 Guest

    Nvidia, ATI/AMD look beyond GPUs toward unified gaming engines

    A roundtable discussion in San Francisco this morning provided a quick
    glimpse into a very possible future for console gaming hardware: an
    evolution beyond the Xbox 360 and the PlayStation 3 to a future that
    changes the entire role of graphics processing units (GPUs). The
    discussion started with the observation that both Nvidia and ATI,
    before the latter's absorption into AMD, have been actively exploring
    general-purpose computing applications for the highly parallel shading
    engines in their GPUs.

    Jonah Alben, vice president of GPU engineering at Nvidia, said that
    this thread began when, in response to game developers' requests for
    more ability to differentiate, the GPU architects made the shading
    engines on their chips programmable. This not only allowed game
    developers to put their own shading algorithms, which have a
    significant impact on game appearance, on the GPU hardware, but it
    also incidentally created a very large array of somewhat-general
    little parallel processing units, each with its own local memory,
    ALU, and instruction set.
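
    To give a rough sense of what programming that array of shading
    engines looks like, here is a minimal sketch in CUDA, Nvidia's own
    GPU-computing toolkit from this period. The kernel name and the
    scale-and-bias operation are purely illustrative, not anything
    discussed at the roundtable.

    // Each GPU thread runs the same small program on its own data
    // element, mirroring how a programmable shading engine runs
    // per-pixel code.
    __global__ void scale_and_bias(float *data, float scale,
                                   float bias, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
        if (i < n)
            data[i] = data[i] * scale + bias;           // independent per-element work
    }

    // Host side: launch enough 256-thread blocks to cover n elements.
    // scale_and_bias<<<(n + 255) / 256, 256>>>(d_data, 2.0f, 1.0f, n);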

    It didn't take too long for developers in other applications to latch
    onto that fact. Today, application developers have programmed GPUs to
    analyze financial instruments, to reduce geological data, and to do
    the heavy lifting in a variety of other applications. IBM fellow James
    Kahle made a similar remark about the arguably more general, if less
    parallel, IBM Cell processor. Cell-based blades are being used today
    for financial analysis, geological exploration and medical imaging, he
    said. Alben added that despite the somewhat limited instruction sets
    of the GPU shading engines, the only criterion for porting an
    application to them seemed to be that it be parallelizable.
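
    "Parallelizable" here usually means each data element can be
    processed independently. As a hedged illustration of the kind of
    financial-instrument analysis mentioned above (not a workload from
    the article), a CUDA kernel could price one European call option
    per thread with the Black-Scholes formula:

    // Each thread prices one option; no thread depends on another.
    __device__ float cnd(float x)                  // standard normal CDF
    {
        return 0.5f * erfcf(-x * 0.70710678f);     // erfc(-x/sqrt(2))/2
    }

    __global__ void price_calls(const float *spot, const float *strike,
                                float r, float vol, float t,
                                float *price, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        float d1 = (logf(spot[i] / strike[i]) + (r + 0.5f * vol * vol) * t)
                   / (vol * sqrtf(t));
        float d2 = d1 - vol * sqrtf(t);
        price[i] = spot[i] * cnd(d1) - strike[i] * expf(-r * t) * cnd(d2);
    }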

    But then the discussion turned back to the world of gaming consoles.
    Many of the intense, non-graphics tasks that go into an immersive game
    are also at least moderately parallelizable: game physics and probably
    the artificial-intelligence engines that run game sequences are
    examples. Could these tasks also be moved to the GPU, perhaps with a
    little more general-purpose tweaking of the shading-engine hardware?
    The consensus was that yes, there were important opportunities there.
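
    Game physics is a good example of why this maps well onto shading
    engines. A rough sketch (assuming CUDA and a simple particle layout;
    the gravity constant and integration scheme are illustrative only):
    one thread advances one particle, so thousands of particles step
    forward in parallel each frame.

    struct Particle { float3 pos; float3 vel; };

    // Advance every particle by one time step dt, in parallel.
    __global__ void step_particles(Particle *p, float dt, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        p[i].vel.y -= 9.8f * dt;          // apply gravity
        p[i].pos.x += p[i].vel.x * dt;    // integrate position
        p[i].pos.y += p[i].vel.y * dt;
        p[i].pos.z += p[i].vel.z * dt;
    }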

    This in turn prompted two very interesting observations. One, from
    Cadence CTO for design systems Ted Vucurevich, was that the shading
    engines really needed 64-bit datapaths to exploit these opportunities.
    But since 64-bit was already being discussed simply to upgrade
    graphics rendering, this could well be within the GPU vendors'
    roadmaps. Vucurevich also pointed out, parenthetically, that Cadence
    is investigating using GPUs to do the complex calculations in the
    parallelizable codes within EDA tools.

    The other comment, by AMD vice president of engineering Robert
    Feldstein, was that the computing power of GPUs could be harnessed for
    processing graphic input, as well as for rendering. For example, he
    suggested, a camera tracking the console user could provide a video
    stream. The GPU could analyze this video to extract gestures, motion,
    and even facial expressions from the user, providing an input to the
    game system even more natural and immersive than that offered by
    controllers on the Nintendo Wii.
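
    Purely as an illustration of the kind of input processing Feldstein
    describes (not anything he presented), a first step toward gesture
    recognition might be simple frame differencing on the GPU: flag the
    pixels that changed between two camera frames so later stages can
    look for motion. The threshold and grayscale image layout below are
    assumptions.

    __global__ void frame_diff(const unsigned char *prev,
                               const unsigned char *curr,
                               unsigned char *motion_mask,
                               int num_pixels, unsigned char threshold)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= num_pixels) return;
        int diff = abs((int)curr[i] - (int)prev[i]);   // per-pixel change
        motion_mask[i] = (diff > threshold) ? 255 : 0; // mark moving pixels
    }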

    The idea that the GPU, once regarded as a non-programmable
    fixed-function device, could emerge as the real computing heart of
    the game system, taking major tasks away from the CPU, is
    fascinating. But the rapid spread of GPU-based computing in other
    areas suggests that this is a very plausible future for gaming SoCs.
    And, as we have seen repeatedly in the past, if the console gaming
    industry makes something inexpensive enough, architects will figure
    out how to use it in embedded systems as well.


    http://www.edn.com/blog/1690000169/post/1110015711.html
     
    R600, Oct 13, 2007
    #1
