
Mass Confusion about NVIDIA's G70 (GeForce 7800?) And Other Parts

Discussion in 'Nvidia' started by Guest, May 25, 2005.

  1. Guest

    Mass Confusion about NVIDIA's G70

    And Other Parts

    by Josh Walrath

    Last week some of the first good-looking information on the G70 from
    NVIDIA was leaked. This info pointed toward the G70 being a 110 nm
    part clocked at 430 MHz with 24 pixel pipelines (six quads), and it
    included some other pertinent details. The materials leaked with the
    specs also made them look legitimate. Now, I am just not so sure.

    At the J.P. Morgan technology conference, NVIDIA gave a 15 minute
    presentation with a short Q&A. Marv Burkett, CFO of NVIDIA, gave the
    presentation. Most of it covered NVIDIA's current
    financial position, how their products are positioned in the market, and how
    well certain aspects of the business are growing (GPUs and MCPs being the
    main growth areas). He also stated that the Consumer
    Electronics group (those in charge of products like the X-Box) will see
    very flat growth until around Q3, when it will start receiving income from
    the RSX (the PS3 graphics unit). Their WMP (Wireless Media Products)
    division had a big customer last year, but that business has since dried
    up. However, they are expecting two major customers to come on board next
    quarter, so that area should be shored up.

    In his talk he covered quite a few topics, and some of the bigger ones
    were that of the RSX and 90 nm products. Currently the RSX is still in
    development, and no actual silicon is available as of yet. Looking at
    Sony's timeline, I would expect the RSX to be taped out by the end of this
    Summer, and that first silicon will be available in late Fall. Once all the
    little problems are fixed and the design is working as it should, Sony will
    take over production and pay NVIDIA a royalty for the use of their
    technology. While overall revenue from this deal will be lower than from
    the X-Box, NVIDIA will not have to worry about things such as production
    schedules, poor yields, and the other pitfalls of handling the production
    portion of a GPU. This will of course have a positive effect on net profits
    though, since this will essentially be "free money" from work previously
    done. Sony has laid out a good chunk of change for the current design work,
    and I would imagine that delivery of first silicon will be faster than I am
    quoting because Sony owns and runs the Fab that the silicon will be produced
    on (without having NVIDIA pay out the wazoo for an accelerated first run,
    you can expect Sony to give that product top priority in its Fab).

    The demos at E3 were apparently running mainly on SLI
    machines, as well as on G70 parts. Marv talked about how these demos were
    run on an upcoming product with many of the same capabilities as the RSX chip. So,
    while the RSX will have more features that are aimed at the PS3, we can
    expect this next generation of cards to nearly match the overall performance
    and feature-set of the RSX.

    Now for the confusion. Earlier this year at a conference call with
    Jen-Hsun and the gang, it was stated that the first 90 nm parts were going
    to be introduced this Fall. Now we are hearing something different. At the
    J.P. Morgan conference, Marv Burkett clearly stated that the first 90 nm
    part will be introduced this quarter (which definitely cannot be
    characterized as "Fall"), and that all "large" products will be 90 nm from
    here on out. This suggests, in very strong language, that the G70 will be
    90 nm (as it has not been released as of yet, and it is a large part). So,
    was the leak last week legitimate? If Marv really meant what he said, then
    no, the G70 will not be a 110 nm part.

    The amount of confusion that NVIDIA has spread about their products
    through leaks over the past two years has been pretty astonishing. Nobody
    has a handle on what is going to be introduced, and while the big picture is
    fairly well known, the details are not. We all know that the next gen of
    products will have a faster clockspeed, and that they will feature at least
    24 pixel pipelines. Other than that, it is a lot of guesswork. Now, one
    noted hoax that NVIDIA perpetrated was hinting that the NV40 was an 8x2
    architecture. Apparently NVIDIA delivered "special" cards to some
    developers that showed up as 8x2, and of course this information was leaked
    to the internet community, and ATI was able to see what was going on. At
    this point ATI thought they were sitting pretty with their X800 Pro and X800
    XT PE. A 12 pixel pipeline card running at 475 MHz should just destroy an
    8x2-architected 350 MHz part. Of course the XT PE would wipe the floor with
    the competition. Then April rolled around last year and we saw that the
    NV40 was a 16 pipeline design, the 6800 GT was significantly faster than the
    X800 Pro, and the 6800 Ultra matched the X800 XT PE. As we saw, ATI had to
    introduce the X800 XT near the end of Summer of last year to be able to
    offer a part more competitive with the NVIDIA range of cards (and gave users
    something between the middling performance of the X800 Pro and the
    outstanding performance of the X800 XT PE). Unfortunately for ATI, they had
    some serious supply issues, and their XT and XT-PE parts were very hard to
    find.
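
    The fill-rate reasoning above can be sketched with back-of-the-envelope
    arithmetic. Theoretical pixel fill rate is just pipelines times core clock;
    the specs below are the ones discussed in this article, and the comparison
    is a ballpark only, since memory bandwidth, shader throughput, and drivers
    matter at least as much:

```python
# Theoretical pixel fill rate = pixel pipelines * core clock (MHz),
# in megapixels per second. A rough on-paper metric, nothing more.
def fill_rate_mpix(pipelines, clock_mhz):
    return pipelines * clock_mhz

# Specs as discussed in the article.
print(fill_rate_mpix(8, 350))   # rumored "8x2" NV40   -> 2800
print(fill_rate_mpix(12, 475))  # X800 Pro             -> 5700
print(fill_rate_mpix(16, 400))  # actual 16-pipe NV40  -> 6400
```

    On paper the 12-pipe X800 Pro would have roughly doubled the rumored
    8-pipe part's throughput, which is exactly why the real 16-pipe NV40
    caught ATI off guard.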

    Throughout the past 5 months we have been hearing many conflicting reports
    about what the G70 will be. If Burkett is giving us a true glimpse (which I
    think he is), then we can speculate on what we can expect to see. First off
    the G70 will be 90 nm (and not the 110 nm that we were all expecting), and
    it will probably be clocked significantly higher than the 430 MHz that the
    leaked presentation documented. We can also expect a part that is around
    300 million transistors. Depending on how NVIDIA has allocated those
    transistors, I think we will see a minimum of 24 pixel pipelines. There has
    been a lot of talk about the possibility of 32 pixel pipelines, but I just
    don't know if that will happen. My conservative nature says no, but 32
    pixel pipelines remain a distinct possibility. I
    think we will also see a new multi-sampling unit that will be able to handle
    HDR content (unlike the current unit). Other things such as PureVideo will
    of course be included, and we will probably see a couple of new wrinkles.
    The "GT" version of this part could be clocked around 450 MHz, while the
    "Ultra" edition of this part will probably be 500 MHz+. Power consumption
    will still be around 6800 Ultra levels.

    With that out of the way, we can move onto the fun stuff! Now, this is
    all speculation as essentially NOTHING of the other G7x products has been
    leaked. I have a feeling that with the overall success of TSMC's 90 nm
    process (which is apparently very, very healthy) we can expect to see NVIDIA
    phasing out its NV40/41/45/48 chips. These are very large at 130 nm, and
    are not as cost effective as they once were. I feel that there is going to
    be a large turnover in the $250 to $400 range with a new set of products.
    The NV43/44/44a will continue to address the low end up to the $200 market, but
    the large 130 nm NV4x parts will soon be replaced by smaller, more cost
    effective 90 nm parts. I think we will see some true competition to ATI's
    110 nm X800 series (the X800 and X800 XL). The new series of 90 nm products
    will feature the same pixel pipeline design as the G70, and all of the
    optimizations that entails. If my speculation is correct then the low end
    90 nm part will be a 12 pixel pipeline product running between 450 MHz to
    500 MHz. This will compete with the X800, and from past indications on per
    clock performance of the NV4x architecture, this should be faster than the
    X800, yet still be priced around the $249 level. The next step up will be a
    full 16 pixel pipeline design running around 500 MHz. This will compete
    with the X800 XL in price, but will of course be faster. If this product
    does in fact exist, and is sold around the $299 mark, then it could
    seriously be the best bang for the buck that we have seen since the X800 XL.
    This leaves room for one more product.
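
    The cost argument behind such a shrink can be sketched numerically: die
    area scales roughly with the square of the linear feature size, so an
    ideal straight shrink from 130 nm to 90 nm cuts die area by about half
    and nearly doubles the candidate dies per wafer. Real shrinks fall short
    of this ideal (pads and analog blocks do not scale with the process), so
    treat these numbers as an upper bound on the benefit:

```python
# Idealized die-shrink arithmetic: area scales with the square of the
# linear feature-size ratio. An upper bound on the real cost benefit,
# since I/O pads and analog blocks do not shrink with the process.
def area_ratio(old_nm, new_nm):
    return (new_nm / old_nm) ** 2

r = area_ratio(130, 90)
print(round(r, 2))      # ~0.48: the shrunk die is about half the area
print(round(1 / r, 1))  # ~2.1: roughly twice the candidate dies per wafer
```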

    A G7x part with 16 pixel pipelines running at 600 MHz would sit
    in the $350 to $400 price range. This part would of course spank all of the
    current high end cards (6800 Ultra, X800/X850 XT PE), yet be offered at a
    lower price point. While this card would be very fast, it will still not be
    able to compete with the high end G70 parts priced at $450 and above. A
    massive move to 90 nm would give NVIDIA a pretty solid segmentation of
    products, and allow them to stop their 130 nm production of large parts.
    The only real question here is what will happen to the 110 nm NV42.
    Would NVIDIA be better off keeping that part, moving it down to the $200
    price point, and keeping the 6200 and 6600 parts at sub-$175? Or will the
    6600 GT still be the best part at just under $200, with the NV42 phased
    out? My gut
    feeling is that NVIDIA will stop production on the NV42, as it honestly
    gives about the same overall performance as the cheaper 6600 GT. So, by the
    end of this summer, NVIDIA will only be producing 110 nm NV43/44/44a and the
    90 nm G7x parts.

    Again, much of this is speculation based on comments by Marv Burkett,
    as well as some other small leaks and info floating around. When
    ATI released the R300 in the form of the 9700 Pro, NVIDIA was left
    sitting with the aging GeForce 4 Ti series to compete with it, since the
    NV30 had not yet seen the light of day. Jen-Hsun challenged his people to
    match ATI, and he essentially said, "This is war!" If NVIDIA is continuing
    with that philosophy, then we can expect to see a lot more disinformation on
    coming products, and the smoke will get amazingly thick. The only thing we
    shouldn't do is underestimate NVIDIA. It is a very aggressive company, and
    their engineering talent is seriously second to none. Hopefully ATI will
    have taken this challenge seriously, and we can expect to see some
    impressive parts from them as well. The R520 does not look to be a slouch,
    but hopefully ATI has not been lulled into complacency with the rumors that
    the G70 is a lower clocked 110 nm part.

    Guest, May 25, 2005

  2. Rengaw

    So does all this mean I'm not going to get my money's worth?

    640k should be enough for everyone if you're all crazy.
    Rengaw, May 25, 2005
