1. This forum section is a read-only archive which contains old newsgroup posts.

Why are newer cards worse than old ones?

Discussion in 'Nvidia' started by -=Matt=-, Aug 3, 2003.

  1. -=Matt=-

    -=Matt=- Guest

    In the 'old' days, when a new card came out you knew it would be better than
    the old one. Eg. Voodoo2 was better than V1, Geforce 2 better than Geforce1
    etc.

    I don't understand how people can say, as I've noticed they do on this
    newsgroup, that some versions of the GF3 are faster than some FX's? Is it
    indeed true?!? I have also heard that the GF3 is better than the GF4mx,
    which is why I got the GF4mx on ebay - it was so cheap! I've not received it
    yet, but as the GF3 is 2 generations lower than the FX; am I going to hear
    that there is a faster version of a GF2 out there than my new GF4mx???

    Why release these cards if they are inferior to existing models?
     
    -=Matt=-, Aug 3, 2003
    #1

  2. Jibby

    Jibby Guest

    Why don't you use the internet and your brain and do some research on the
    different products available instead of bitchin' in a newsgroup? Not all 2L
    engines are the same, and just 'cause it sounds faster doesn't mean it is;
    the different naming refers to the technology used, not the speed of the card.
     
    Jibby, Aug 3, 2003
    #2


  3. The bottom line is money. They need it, we have it; any product, however
    incrementally better, gets the OEM contract and everyone makes out, except
    the enthusiast gamer.

    While the enthusiasts drive innovation and dictate mindshare, they do little
    for the bottom line of a company except jeopardize it. Also, you are painting
    with an overly broad brush; there are distinct advantages to the newer
    cards, and raw speed in all instances isn't exactly one of them [in the
    mainstream market].

    Look at it this way, it's not that the new cards are so bad, it's that the
    older ones are so good.
     
    Derek Wildstar, Aug 4, 2003
    #3
  4. Dave

    Dave Guest

    What? Someone here doesn't remember the Voodoo Rush? Or the TNT2 M64?
    It's about maintaining the market niches using the big boy bin rejects. The
    best ones go into the flagships, the not-so-good ones into the midrange, the
    others that fail more than a pipeline here and there likely get made into
    MXes. Not like ATI doesn't do the same thing. ;-) Difference is, with
    certain ATI cards, you might get lucky and enable the other four pipelines
    without issue...
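    The binning process Dave describes can be sketched roughly as follows. This is a hypothetical illustration: the tier names, the 8-pipeline die, and the thresholds are made-up assumptions, not actual NVIDIA or ATI test criteria.

```python
# Hypothetical sketch of die binning: chips from one wafer are sorted into
# product tiers by how many pixel pipelines pass testing. Thresholds and
# tier names are illustrative assumptions, not real vendor criteria.

def bin_die(working_pipelines: int, total_pipelines: int = 8) -> str:
    """Assign a die to a product tier based on functional pipelines."""
    defective = total_pipelines - working_pipelines
    if defective == 0:
        return "flagship"            # all pipelines good -> top-end card
    elif defective <= total_pipelines // 2:
        return "midrange"            # a few bad pipelines fused off
    else:
        return "budget (MX-class)"   # heavily cut down, sold cheap

wafer = [8, 8, 7, 6, 4, 3, 8, 5]     # working pipelines per die (made-up data)
for die in wafer:
    print(die, "->", bin_die(die))
```

    The "lucky" ATI cards mentioned above would be dies binned down for market reasons rather than defects, which is why re-enabling the fused-off pipelines sometimes works.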
    I beg to differ. They are the ones who help push the envelope, as you say.
    This is Good for business. Helps drive the gaming industry too,
    hand-in-hand. We'll see when Doom 3 comes out which cards leave the shelves
    fastest (most of us I'd imagine are already all set. Last I looked, I
    haven't jeopardized ANYONE's bottom line! :^) ). Any jeopardy might come in
    the guise of a generous RMA policy, for those who cannot accept personal
    responsibility as part of a choice to tweak things...

    > Also, you are painting ...
    Certainly not if taking the FX5200 series into consideration. The only things it
    really brings to the table are memory bandwidth and DX9...oh, and a swank
    PCI version ;P
    I think it's that games are coded with these older cards in mind. The GF4
    4x00's are still kicking strong. Given more high-level shader language and
    per-pixel lighting, these older cards might not look so good, eh? Literally.
    Plus, your performance depends a lot on the rest of your system. That opens
    up a whole 'nother barrel o' monkeys. It's not really the average framerate
    that matters so much as the minimum in actual gameplay. We're getting to the
    point where the video card is the bottleneck again...certainly not much
    scaling above 1280x1024 with faster CPUs and the cream-of-the-crop video
    card. Somebody release the next generation already! Enough of this milking
    of existing product cycle!
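    The average-vs-minimum framerate point above can be put in numbers. A minimal sketch with made-up frame times: one long stall barely moves the average but defines how the game actually feels.

```python
# Why minimum framerate matters more than average in actual gameplay:
# a single hitch drags playability even when the mean looks healthy.
# Frame times are made-up illustrative data (milliseconds per frame).

frame_times_ms = [10, 10, 11, 10, 100, 10, 11, 10]  # one nasty hitch

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
# The average hides the stutter; the minimum exposes it.
```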
     
    Dave, Aug 4, 2003
    #4
  5. A.C.

    A.C. Guest

    My suspicion is that nVidia and ATI (who is also doing the same thing with
    the 9500 and 9600) are doing it out of spite. Not sure why.
    I'm planning on getting an FX5600, despite it being occasionally
    outperformed by the much cheaper Ti4200. But I'm gambling that when DX9
    games come out my choice will be vindicated. But it probably won't. I know
    that logically I should go for a Ti4200, especially since I don't really
    care about AA, but sometimes the marketing gets you even when you know
    better...
     
    A.C., Aug 4, 2003
    #5
  6. Chimera

    Chimera Guest

    > I'm planning on getting an FX5600, despite it being occasionally ...

    By the time DX9 games come along, you'll probably have good memories of your
    FX5600, as you frag away with your new DX10 card
     
    Chimera, Aug 4, 2003
    #6
  7. I'm going to reinforce my point, that enthusiasts aren't much for the
    bottom line. Firstly, look at the percentage of sales that the top-of-the-line
    cards have versus the rest of the market: it's minuscule and it's not
    very profitable. In fact, the recent FX adventure was a very unprofitable
    venture. Huge losses for nvidia due to an aggressive push for the untested
    smaller fab process. How do I know? Online press and company
    communications. Only the enthusiasts cared about the FX5800, and only they
    will even know it existed; the mainstream will never think of it again after
    this post. :)

    Now, what about the lesser cards? According to Dr. Watson and DirectX
    diagnostic reports submitted to Microsoft, and user-submitted specs, the actual
    percentage of PCs with top-tier video cards is in the low single digits.
    Only when you include the Ti4200s does that percentage increase into the
    teens. Not a whole lot, when about 1 in 8 have a Ti4200 or better. And
    that's the *savvy* user, who is considered to have better hardware than
    average, so the actual numbers are likely less.

    Nvidia is a behemoth compared to ATI, and they make the bulk of their money
    from OEM sales of average-performing current hardware, not titillating the
    hard core. Sad but true. I'm not trying to play 'I have a corporate secret'
    either; most of this info has made its way to the public domain, as it
    should, but for the most timely info, try some marketing sites like
    npdtechworld.com. While it is a pay service, it certainly has value if you
    have to make business decisions based on what's selling when.
     
    Derek Wildstar, Aug 4, 2003
    #7
  8. Chimera

    Chimera Guest

    not to slug your argument or anything, but my experience is that a very low
    percentage of Dr. Watson-style dumps ever get sent to MS. It may be that the
    demographic is still quite even, but it may also be that the 'savvy' users,
    when confronted with a crash & core dump, simply choose to ignore it and put it
    down to the 'Windows experience'.
     
    Chimera, Aug 4, 2003
    #8
  9. Mark Leuck

    Mark Leuck Guest

    The 9600 was a refresh to add DX-9; it also has far fewer transistors than
    the 9500, making it much cheaper to build, which is why they came out with it.

    As far as Nvidia goes, if I recall, the original GF1 wasn't much faster than the
    TNT Ultra, and the GF2 was slightly slower in the beginning than the GF1.
    At least get the Ultra.
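    The fewer-transistors-equals-cheaper point can be roughed out: wafer cost is roughly fixed, so a smaller die means more chips per wafer and a lower cost per chip. All numbers below (wafer cost, die areas) are made-up assumptions for illustration, not actual ATI figures.

```python
# Back-of-envelope for why a smaller die is cheaper to build: cost per chip
# scales inversely with dies per wafer. All figures are illustrative
# assumptions, not real ATI manufacturing data.
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Crude estimate ignoring edge losses and defect yield."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2)

wafer_cost = 4000.0                 # dollars per processed wafer (made up)
big_die, small_die = 120.0, 75.0    # mm^2: 9500-ish vs 9600-ish (made up)

for area in (big_die, small_die):
    n = dies_per_wafer(200, area)
    print(f"{area} mm^2: ~{n} dies, ~${wafer_cost / n:.2f} each")
```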
     
    Mark Leuck, Aug 4, 2003
    #9
  10. Chimera

    Chimera Guest

    > The 9600 was a refresh to add DX-9, it also has far fewer transistors than ...

    I've seen a lot more read into it than just that. For the record, the 9500
    was a DX9-generation card. The other real difference between the 9500 &
    9600 seems to be that ATI 'repositioned' their products just slightly in relation
    to each other, and made the gap between the 9600 & 9800 more of a step up.
    By giving the 9600 series 4 pixel pipes & 128-bit bus, and the 9800 series 8
    pixel pipes and 256-bit bus, they ensure that a lesser tier card will
    struggle to beat its more expensive seniors, even with aggressive
    overclocking.
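    The pipes-and-bus gap described above can be put into rough numbers. The clock speeds here are illustrative approximations, not exact retail specs; the point is the ratio between the tiers.

```python
# Rough theoretical throughput for the product tiering described above.
# Clock figures are approximate/illustrative assumptions.

def fillrate_mpix(pipes: int, core_mhz: int) -> int:
    """Theoretical pixel fillrate in megapixels/s: pipes x core clock."""
    return pipes * core_mhz

def bandwidth_gbs(bus_bits: int, mem_mhz_effective: int) -> float:
    """Theoretical memory bandwidth in GB/s: bus width x effective clock."""
    return (bus_bits / 8) * mem_mhz_effective * 1e6 / 1e9

# 9600-class: 4 pipes, 128-bit bus; 9800-class: 8 pipes, 256-bit bus
print("9600-ish:", fillrate_mpix(4, 325), "Mpix/s,",
      round(bandwidth_gbs(128, 400), 1), "GB/s")
print("9800-ish:", fillrate_mpix(8, 380), "Mpix/s,",
      round(bandwidth_gbs(256, 680), 1), "GB/s")
# Doubling both pipes and bus width puts the senior card far beyond
# anything the junior one can reach by overclocking alone.
```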
     
    Chimera, Aug 4, 2003
    #10
  11. -=Matt=-

    -=Matt=- Guest

    I think that's it. Or at least that's what I'm going to believe! I guess
    each new generation (GF3 GF4 FX etc) brings with it new cutting edge
    features like Bump mapping anti arse, DX9 or whatever, which the older cards
    don't have, but to make it a budget card, they cut back on memory and
    transistors etc. So I guess if you're playing a game which doesn't support
    these new features, then a top of the range old gen will probably beat a new
    gen!

    My TNT2u boasts detail textures and 32-bit color on my PIII500, better than
    the Voodoo3, but I never enable them, as it runs too slow, despite looking
    better!

    -=Matt=-
     
    -=Matt=-, Aug 4, 2003
    #11
  12. A.C.

    A.C. Guest

    I think both companies are using the DX-9 angle to justify making
    cards that are cheaper for them to manufacture. The problem is when
    both of the companies do it, we suffer. I mean, FEWER transistors?
    That's almost insulting for some reason.
    Well, I figure the non-ultra is about $65-$100 cheaper, and it will
    play the games I'm interested in (I'm not a hard-core FPSer, and I'm
    more interested in visual effects than pure screaming speed).
     
    A.C., Aug 4, 2003
    #12
  13. John Russell

    John Russell Guest

    I think the problem is that they don't design the cheapest product in a new
    family first. They produce the new flagship model and everyone agrees it's
    faster than the best card in the old family. They then set about
    producing cheaper versions to create a new family of products. This usually
    means halving pipelines etc. They cannot guarantee that this process will
    make the cheaper cards faster than some of the old ones.
    The simple answer is to not keep upgrading cards by buying the cheapest card
    in every family. It's better to stick to a restricted budget by waiting
    until the flagship model comes down in price. Being 6-12 months behind is no
    great loss since little software will be available initially to exploit any
    new card.
     
    John Russell, Aug 4, 2003
    #13
  14. Dave

    Dave Guest

    Hmm, originally you stated that they were a *jeopardy* to the bottom line.
    THIS is what sounds fishy. That they *aren't much* for the bottom line is a
    given. Given the minuscule ratio of enthusiasts to average joes, I would say
    it's a write-off...

    > Firstly, look at the percentage of sales that the top of the ...
    At current pricing structures I have a little tough time believing this ;-).
    Maybe if the inventory is rotting on the shelves, perhaps...or being passed
    up in favor of a currently better solution (ATI, perhaps?)...sure, I agree
    with you completely percentage-wise...

    > In fact, the recent FX adventure was a very unprofitable ...
    And the enthusiasts' market drove this to the brink? I'd say this can pretty
    much be put down as a Nvidia f$ckup. Poor execution.

    > Huge losses for nvidia due to an aggressive push for the untested ...
    Pretty much old news by now, I'd imagine...some of us knew it well in
    advance of release. I certainly knew what to expect, and was scarcely
    disappointed (I had already dumped my Nvidia stock...made a bit too...). It
    made for wonderful satire while it lasted...

    > Only the enthusiasts cared about the FX5800, and only they ...
    Thank God! It deserves to get buried, swept under the rug with the dust
    bunnies caught in its central HVAC-sized plenum, or left in the shed with
    the rest of the gardening appliances. They gambled, they lost. The way
    Nvidia handled the whole affair did the job for them on their own bottom
    line...and rightfully so. The FX5800 was a joke. No wonder they lost on it.
    They drove thee ol' wagon to market, señor, even if she were sheddin' parts
    all thee way down thee cobblestone pike an' thee ol' donkey she died when
    she got there. Now tell me, who forced Pedro into the wagon? Of course it
    was solely Nvidia's decision to continue developing, promoting and retailing
    this faux pas. Their loss was by their own hands. Think it would have been
    much different if the donkey was stillborn and they skipped an iteration of
    current tech, waiting until refinements to recycle it for another $400+
    stretch? That they would even allow such a prospect to intrude upon their
    bottom line thusly is likely the subject of several after-hours round-table
    sessions among the shareholders committee. This is where this half-baked
    theory of "Voodoo Economics" signs me right off. They almost pulled a 3dfx
    on that one. Certain other decisions they've made have hurt their bottom
    line as well. To say that the lunatic fringe market has precipitated this
    state of affairs is a little like putting Descartes before the horse...it
    conveniently factors any decision Nvidia made about how to *effectively*
    cater to this minority market right out of the picture.
    I'll accept that demographic, although as someone pointed out it's a little
    tough to tell because the savvy enthusiast might not even bother with
    submitting reports to Microsoft. Sure. Nothing new here. Most of the average
    computer shoppers will get a fair-to-middlin' OEM card, absolutely. I even
    get to replace a few of them in my travels...
    Tough to tell...not really enough information...all this really says is how
    many people with top-tier cards bother submitting reports and specs. If you
    can extrapolate from this the layout of the entire market, more power to ya!
    Myself, I'd wanna see cumulative averages of retail sales figures since gold
    date (maybe even peaks around certain game releases), inventory manifests,
    etc...I can tell you one thing: the majority of card installs I've done are
    certainly midrange hardware (that's where things could be a little confusing
    for some folks right now...). It's not the relative percentages that are in
    question here, Derek. It's the assertion that (catering to) the enthusiasts'
    market is BAD for the bottom line, lest we lose sight of the underlying
    issue in this flurry of statistics ;^). Yesterday's top shelf becomes
    tomorrow's bottom line practically annually, and it's a little hard to
    imagine that even with the attendant price drops into the range of the
    reasonable that anyone's really eatin' it here. Am I missing something?
    Not sad. Merely the way things are and have always been. But from here, the
    suggestion that covering the hardcore market is a detriment to the bottom
    line is a stretch. It's all in the execution. ATI won out on this round.
    Nvidia's "woes" we can lay right at their own front door...

    > I'm not trying to play 'I have a corporate secret' ...
    Thank you for your kind suggestions...;-P. You know I'll rip you a new one
    if you insist on further patronization ;-)...nothing personal of course, I
    don't dislike anyone here at all, certainly not you at any rate, it's all
    in good fun!
     
    Dave, Aug 4, 2003
    #14
  15. In order to further the discussion for the folks at home:

    I'm not going to back-pedal from the jeopardy comment, I still believe that
    catering to the enthusiast is a bad financial idea in theory and in
    practice. However, while there have been notable self-destructs in the
    vidcard world, Voodoo 6000, FX5800, it's arguable that external forces were
    more detrimental than the internal decisions to pursue the top tier perhaps
    unwisely. As much as I respect the 3dfx engineers, their input into the V6
    and the FX was in fact jeopardizing the bottom line, by pushing a product
    that required an advanced fab process for it to be a marketing and
    performing success. Compare it to the past practice of nvidia and their
    incremental upgrade process, three design teams working off the advances
    of the others, *waiting* until tech caught up to them, rather than trying to
    drive tech forward. The GeForces were predictable and robust performers, all
    based on mainstream concepts.

    As far as patronization goes, I wouldn't dream of it. Only those who are
    used to that sort of thing infer that tone from a post. :)
     
    Derek Wildstar, Aug 5, 2003
    #15
  16. Dave

    Dave Guest

    Doesn't nececelery have to be. I feel that covering all strata of the market
    can still be done in profitable fashion without dumping or a loss leader.
    YMMV. This is what binning is all about. Of course, this is why we have so
    much of a mess in the midrange niche: today's mid-binned chips are not so
    much more powerful than yesterday's top dogs in current games, and the
    pricing structure does little to encourage upgrading. Consider the 5600
    series vs. the 4x00's and you'll see what I mean.

    > However, while there have been notable self-destructs in the ...
    Very much so...and the responsibility for that choice, the implementation
    thereof, and the repercussions both good and bad still fall upon who? 'Nuff
    said?
    I think that it might have been putting Rampage on the back burner that
    drove the biggest nail into the coffin of 3dfx, that and
    mismanagement...perhaps "lack of timely input" is more to the point. But
    that is hardly their "fault" per se: somehow I suspect that it's only
    humanly possible to do just so much within a release timetable with
    available resources and interrupted focus. The magic hats were shipped
    without rabbits...
    And not necessarily the enthusiasts, n'est-ce pas? Well, I suppose there's
    no substitute for design execution. That speaks for itself. Of course trying
    to spooge the adoring public with that Rush-job of a leafblower didn't win
    them ANY brownie points. But all these were hardly decisions of the
    engineering dept. so much as afterthoughts, these guys still have to answer
    to Management (sadly enough, and sometimes it seems ne'er the twain shall
    meet...) and deadlines ("Well, MAKE IT WORK, dammit! On schedule! Never mind
    how! That's YOUR job, bucko!"). I suppose the 5800U could be deemed the
    "Daikatana" of video cards...

    > Compare it to the past practice of nvidia and their ...
    And there was not as much drive to innovate. Those days the competition
    didn't really have a leg-up...wait, there WAS no competition. I think Nvidia
    got too comfortable at the top and slipped in their game a little...now they
    want their trophy back by hook or by crook. Those 44.03 drivers are
    certainly an example of the latter...hardly the only one at that...now maybe
    being pilloried a little here and there will force Nvidia to clean up their
    act a tad? I'm gonna go out on a little limb here and suggest that the
    consumer could be the best QA of all. A product that stands head and
    shoulders above the rest (such as Nvidia's contenders in the Socket A
    chipset market) could practically sell itself on its own merits. The Voodoo
    1 certainly did; so did the V2 for awhile. As did later GeForces.
    Unfortunately, that is not the case here. If Nvidia has to "go to the
    mattresses" and rely on OEM accounts to maintain profitability in the face
    of loss on the high end, they did it to themselves with a sabot slug to the
    metatarsals, and they get zero sympathy here...as for the incremental
    upgrade process, I think we can see that pretty clearly in several midrange
    and low-end examples compared to last-gen tech in the same price range...

    We won't get into Nvidia's erstwhile relationship with M$, nor ATI's recent
    hand in M$' API development. That gets saved for a rainier day...
    Touché! Ah, a wit I can appreciate...;-)
     
    Dave, Aug 5, 2003
    #16
  17. mmartins

    mmartins Guest

    > I don't understand how people can say, as I've noticed they do on this ...
    I'm amazed by the high quality and low cost of the GF3 card

    8 player WC3 games, 64 player BF1942 sessions : with every game I try,
    performance is always up near 100fps

    And it only costs $40 on ebay...
     
    mmartins, Aug 5, 2003
    #17
  18. Dave

    Dave Guest

    Within limits, of course. I think the 5800U cooling solution was a rush-job
    at any rate. I think after a brief glimpse at the thermal output vs. scaling
    curve of the new design on said immature process, the local pharmacy might
    have run short of Tums in no time...you trying to tell me they PLANNED it
    like this from the beginning? I think they were working with a series of
    uncertainties and Murphy (the veritable patron saint of all engineers,
    technicians, spouses, and parents) sprang up, said "Gotcha!", and bit 'em
    right in the ass! From there, it went to "Oboy, better strap on the
    leafblower and blow this thing out the door with as much hype as we can
    muster so we don't look like we're slipping on product cycle" instead of
    "Let's clock this thing lower and price it accordingly so we don't look like
    a bunch of jabronis". This is where the afterthought came in. I hope this
    helps the pH of that electrolyte a little. I think the engineers may have
    been involved in such a capacity as answering the question: "Can we get away
    with it?" (of course it didn't work right when it got out the door. Among
    the horror stories, there was the one about the fan not cycling up during 3d
    screensavers and burning up the chip...oops!). As we can see, with a bit of
    process refinement, this is no longer a necessity and the reference heatsink
    is only half a flaming abortion even if still a heavy, slot-hogging
    paperweight. As things ideally should have been from the beginning.

    > I'm not debating the hierarchy of ...
    And this is precisely what happened. "Why" is secondary to the fact it
    happened at all. But I just won't buy into "The gamers made us do it!".
    I'll take that "perviousuly" as a Freudian slip, not a typo. Certainly lends
    an accurate twist to the whole sordid affair. ;-)
    Would have been a surer approach. So might have been waiting until process
    was more refined to release their flagship line, and riding out their
    existing market share (namely OEMs and the sub-$150 cards that comprise the
    majority of retail sales) in the meantime. Or positioning lower-clocked,
    full-featured FXes against ATI midrange cards, pricing the Ti 4x00's to
    destroy everything under ATI's 9500 series while tweaking process and
    ramping up clock speeds. Their midrange product line is 0wn3d by ATI
    performance-wise if not in overall sales (be interesting to compare %
    shipped to % sold sometime). I just couldn't see any compelling reason to
    pick a FX 5600 over a 9500 NP for the same price (especially using the 44.03
    drivers with their "disappearing 8x AA on UT" trick that are now offered as
    a Windows Update-ha!), or the 5600U when the 9700 NP is just $40 more.
    Others may prioritize brand loyalty over value, I just want what works best
    within its price range, whatever my budget may be. And I know I'm hardly
    alone.

    But we can't really say what works best, for the sake of revisionism. What
    happened happened. And a number of other factors that made it happen we are
    not entirely privy to and can only speculate with varying degrees of
    education...as far as affecting Nvidia's continued success, it may have been
    a speedbump in the road, but it hardly put them out of the running. Any way
    it would have happened, they were up to their ears in aforementioned
    electrolyte---instead of getting a bigger shovel and methodically digging
    themselves out, they've decided to fling it around a little bit...
    Which is...? (not likely able to be summed up in a few sentences, that's for
    sure!)
    And a couple of management people who were caught insider trading before
    this...

    > Can't work as well when you're thinking of losing everything and ...
    I'll buy that for a dollar. What amazes me is that they actually had the
    cojones to release such a graceless kludge, even in such limited quantities.
    I guess they singed their reputation a little in order to try and keep the
    stockholders happy, eh? Nah, it was really those g4m3r d00dz and rabid
    Nv1d10tz they had to hold onto. Yeah, that's the ticket. Mighty white of
    'em...

    > I'm hoping they get back to ...
    I think that development and refinement of high-end chips is fairly
    important to market longevity, don't you? Especially with stiff competition
    and at today's turnover rates. Almost like the "publish-or-perish" status
    surrounding academia. All that really happens is the ones that don't pass
    muster for high-end cards get made into lower-end cards, and the majority
    gets 'em. I'd be willing to bet one could take any given manufacturer's
    entire FX product line and find chips from the same wafer assorted
    thereabout, if that were possible to determine...
     
    Dave, Aug 5, 2003
    #18
