
ATi --- Crossfire "Mark II" will finally abandon the Master-Slave implementation.

Discussion in 'Nvidia' started by John Lewis, Feb 21, 2006.

  1. John Lewis

    John Lewis Guest

    ATi is finally doing what they should have done before Crossfire was
    first shipped - integrated the compositor into the silicon of every
    high-end Crossfire-capable GPU and (a la SLI) symmetrically
    data-linking the GPUs on identical boards or modules.



    Seems not a good time to 'invest' in any of the current Crossfire
    implementations. Orphaned products with very low sales volumes
    (the current Master/dongle cards) normally get poor long-term
    technical/software support.

    John Lewis
    John Lewis, Feb 21, 2006

  2. Walter Mitty

    Walter Mitty Guest

    "John" risked the wrath of Usenet weenies mastering
    mommies computer when he ventured forth on 2006-02-21, commmitted
    his life to the whims of Google, and spluttered:
    It is a rum day when a company incurs the wrath of Mr Lewis. Like a
    dog with a bone he is.
    Walter Mitty, Feb 21, 2006

  3. John W

    John W Guest

    Not that I have anything against ATI but isn't this a NVIDIA Newsgroup?

    John W
    John W, Feb 21, 2006
  4. John Lewis

    John Lewis Guest

    Nope. Not the company. Just the asinine management and marketing.

    I'm sure that ATI engineering prototyped Crossfire in its present
    form, showed it to management, and asked "please can we integrate
    the compositor into the GPUs and emulate SLI with identical boards
    and symmetrical connections". And marketing/management said
    "NO, just ship it for now, some idiots are bound to buy!"

    ATi running after nVidia reminds me of today's version of Intel's CPU
    and chip-set groups running behind AMD as the innovator.
    Gabe's rear end would be a much easier and juicier target for my
    dog.... :) :)

    John Lewis
    John Lewis, Feb 21, 2006
  5. Magnulus

    Magnulus Guest

    Is that the biggest problem you could dig up with ATI? Because it
    sounds like a non-issue to me. It's not as if SLI is a particularly
    sensible deal, either.
    Magnulus, Feb 21, 2006
  6. John Lewis

    John Lewis Guest

    Sure is an issue if you are a PC gaming enthusiast who is not
    particularly rich. Pony up $1000 and three months later find
    that the dual-card implementation you invested in has become
    a totally obsolete architecture nine months after its
    introduction. Past history of low-volume obsolete
    architectures has not been kind in terms of technical
    support and software updates.
    Agreed, but 4 million SLI motherboards, 10 million
    SLI-capable video cards, 500k-1million dual-card
    SLI gaming rigs, plus a unified driver architecture
    does have some weight in terms of long-term support.

    John Lewis
    John Lewis, Feb 22, 2006
  7. First of One

    First of One Guest

    For those looking for an eventual dually X1900 setup, the Crossfire master
    card, already available, can work as a standalone. Price premium is about
    $50. Finding a slave card in the future should not be problematic.
    First of One, Feb 22, 2006
  8. Ed Forsythe

    Ed Forsythe Guest

    Sure is, John, but most of us cross over from time to time. Why can't we
    just all get along? <VBG> FWIW, I've been Intel all my computer life, but
    I've finally come to the painful realization that Intel is no longer the
    innovator and AMD is kicking butt, so my next box will be AMD. I switched
    from Matrox to ATI years ago when I realized that they had dropped out of
    the gaming market. Now I'm going to switch to nVidia because ATI and
    nVidia are no longer leapfrogging. IMHO, nVidia has the undisputable lead
    and I think they will keep it into the foreseeable future. My brand
    loyalty only goes so far ;)
    Ed Forsythe, Feb 22, 2006
  9. John W

    John W Guest

    For those interested in ATI video cards, they could always have a look in:


    John W
    John W, Feb 22, 2006
  10. Tim O

    Tim O Guest

    Is there anyone that spends a fortune on new video cards that doesn't
    realize they're a sucker deal? Something twice as fast is always a
    year away.
    Tim O, Feb 22, 2006
  11. "IMHO, nVidia has the undisputable lead and I think they will keep it into
    the foreseeable future." <--- This coming from a guy who *just recently*
    conceded to AMD's technological superiority over Intel?

    This "undisputable lead" you mention doesn't exist. It's very evident that
    they are neck-and-neck competitors at this point in time. Sounds like you've
    got the same death grip on nVidia as you *had* on Intel.

    Tony DiMarzio, Feb 22, 2006
  12. Tim

    Tim Guest

    ... and doing so on a single card.

    Now Dell is releasing a PC with four 7800GTX GPUs. It reminds me of the
    razor companies with their number-of-blade wars. Did Gillette buy out Dell?
    Tim, Feb 22, 2006
  13. Folk

    Folk Guest

    That's kind of how I felt after paying $400 for a 6800 GT (back in the
    day) and six months later AGP went the way of the dinosaur.

    Not quite the same issue, but painful nonetheless...
    Folk, Feb 22, 2006
  14. Ed Forsythe

    Ed Forsythe Guest

    I'm surprised you didn't read all of my post. If anything, I had a "death
    grip" on ATI. Just chalk it up to a member of a distant generation who
    suffers from brand *loyalty*. I always dance with the lady I take to the
    dance. ;-) Just a friendly discussion. Don't let my preferences and
    foibles spoil the day <S>.
    Ed Forsythe, Feb 22, 2006
  15. Hmmm... don't you know you're not supposed to be agreeable on Usenet?

    Tony DiMarzio, Feb 22, 2006
  16. Magnulus

    Magnulus Guest

    Yes, it is nuts. I've got a 7800 GT (not the GTX) that runs just about
    any game out there just fine with ungodly amounts of anti-aliasing (more
    than I really need) at 1280x1024. Why would I even need two of them, let
    alone four? And certain things, like anti-aliasing of normal maps, just are
    not going to be fixed by throwing more hardware at it. Unless you've got a
    huge widescreen monitor running games at 1900x1200 or whatever, what's the
    point?

    It all comes down to software. The ultra high-end graphics stuff is
    basically dying on the PC beyond tech demos and the occasional game. So you
    can get your super-duper SLI graphics, and find there is absolutely no
    reason to own them beyond bragging rights.
    Magnulus, Feb 22, 2006
  17. Ed Forsythe

    Ed Forsythe Guest


    Ed Forsythe, Feb 22, 2006
  18. McGrandpa

    McGrandpa Guest

    Yeah! It might let someone walk out their door with a smile on instead of a
    scowl!!! Where's the *dignity* man?!? LOL ;)

    McGrandpa, Feb 22, 2006
  19. McGrandpa

    McGrandpa Guest

    *I'll* be seeing about that one, friend! I can only go up to 1280x1024 as
    that's my monitors' native res. All of them.

    I'm running an X2 4800+, 2 gigs of PC3200 and one (01) 7800GTX 256 (for
    now). The HL2 Lost Coast video stress test in 1280x1024, with everything
    on and highest, with highest image quality, 8x AA 16x AF, comes back
    consistently with 70.84 fps.
    FEAR, in game, at 1280x968 (its highest mode I can do) with everything
    cranked up gives me just into the 40's. Hmpfh.
    Doom3. Um, I didn't slow down long enough to do a demo for framerates. I
    set it to Nightmare and went for it. Half a day solid non-stop and I'm over
    halfway through! Never seen it look so good or play so fast. I have Ultra
    quality and everything is cranked up and highest. I'd guess in the 60's.
    Then, I had a HD crap out on me last night. 160 GB worth of data, just
    flushed. Everything is back UP except for the bad HD. Happily it was just
    a data drive and most stuff is backed up already.
    Wheew.... I can't wait to see how SLI will work for some of the games! Oh,
    did I mention all this is in XP Pro 64-bit edition? :) The one game
    that's surprised me the most so far is FEAR. It does *NOT* 'give as much
    as it gets'.... meaning I think the engine code is not well optimised for
    either 32-bit mode or 64-bit mode, because it simply does not look as good
    in-game as HL2 64-bit, Quake4, or Doom3.
    I think I should see some serious improvement in framerates in all of
    these major games with two GTX 256 meg cards (twins! I'm having twins!)
    in SLI.
    McG. :o)))
    McGrandpa, Feb 22, 2006
  20. John Lewis

    John Lewis Guest

    Please elaborate with all details. A curious mind wants to know.

    John Lewis
    John Lewis, Feb 23, 2006
