This forum section is a read-only archive of old newsgroup posts.

why don't card manufacturers say how much power it uses?

Discussion in 'Nvidia' started by bruce56, Jun 30, 2013.

  1. bruce56

    bruce56 Guest

    Is it the case that the GPU model, frequency, and amount of RAM
    will fix the power draw, regardless of brand name? Then I suppose
    you could trawl through data sheets and figure something out.

    One might assume a maximum possible power if it is a
    slot-powered card on PCI Express version 1, 2 or 3.
    And if it has no fan, then it would be more miserly.
    bruce56, Jun 30, 2013

  2. Paul

    Paul Guest

    Nothing really fixes the draw, but we'll get to that
    in a moment.

    For an engineer, giving power estimates can be a CLM
    (career limiting move). At my former employer, you
    couldn't even get a department located physically
    next to mine to give out power estimates :) That's
    how close to the chest such things are held.

    I can give an example. One group designs a chip. They
    tell our engineer he will need a 6W power source.
    Our engineer duly complies. One day, the brand-new chips
    (engineering prototypes) are delivered. The chip actually
    draws 9W. The onboard power supply collapses under
    the load. There are many phone calls, labels such as
    "idiot" are exchanged, and so on. And you can imagine,
    some of the manager-to-manager phone calls involve the
    loudest yelling. Now the program is set back, because
    the person who did the power converter didn't plan
    for this at all (and should have).


    What can we observe as customers? Cards
    come in classes, and the classes are an (imprecise)
    admission of power draw. So you're right, we're not
    completely in the dark. The physical design of the
    card is an admission of the power draw.

    1) Card with no fan.
    2) Card with fan, but no PCI Express cable. (12V @ 4.3A is
    the largest slot power observed to date for this class.
    I call that a "50W card", in nice round numbers.)
    3) Card with fan, and one 2x3 (six-pin) PCI Express cable.
    4) Card with fan, and two PCI Express cable connections.
    I gather at this point we could be up around 225W.
    But since I can't afford cards like this, I hardly care :)

    So those are the card classes. You could use a copy of the
    PCI Express power spec to place a number against each
    class. The slot power is limited, and the 4.3A number
    is as brave as that particular engineer got. If you get
    too close to the limit, maybe a few cards will burn
    their users' motherboards, and you don't want that.
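    The class-to-budget mapping above can be sketched as a little
    calculator, using the commonly cited PCI Express limits (75W
    from the slot, 75W per six-pin cable, 150W per eight-pin
    cable). The numbers and names here are round-number
    illustrations, not figures from any vendor datasheet.

```python
# Rough per-class power budgets from the PCI Express connector
# "class" of a card. All numbers are commonly cited round values
# (slot 75 W, 6-pin 75 W, 8-pin 150 W), used for illustration only.

SLOT_W = 75          # max slot power, in nice round numbers
SIX_PIN_W = 75       # per 6-pin auxiliary cable
EIGHT_PIN_W = 150    # per 8-pin auxiliary cable

def max_budget(six_pin=0, eight_pin=0, slot_w=SLOT_W):
    """Upper bound on card power implied by its connectors."""
    return slot_w + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

# The four classes from the post, as upper bounds:
classes = {
    "1) fanless, slot only":    max_budget(),            # well under 75 W
    "2) fan, slot only":        max_budget(),            # ~50 W observed
    "3) fan + one 6-pin cable": max_budget(six_pin=1),   # up to 150 W
    "4) fan + two 6-pin cables": max_budget(six_pin=2),  # up to 225 W
}
```

    That 225W figure for class 4 matches the "two cables" estimate
    above: 75W from the slot plus 75W from each of two six-pin cables.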


    Say I'm the engineer at ATI, and I need to work out a power
    number. There are a few approaches:

    1) If I use a pathological test case, I can end up with
    a power number so high that the number is useless to
    anyone: to the power converter designer, to the ODM,
    and so on. Nobody wants this number. In a pathological case,
    I make as many nodes toggle at once as possible. Like putting an
    alternating 0x0000 / 0xFFFF pattern on the internal busses,
    to make them burn up. Or running all the FP64 units with dense
    instructions (wall-to-wall code). There are lots of ways
    to burn a GPU (or other chips, for that matter).
    2) I may realize the GPU needs power management, such as
    active throttling, overload detection in the power
    converter, and so on. Perhaps I use something like FurMark
    to trigger this level of current, and make sure we can
    handle it. Maybe I turn down the clock on you if the power
    actually gets too high. The power converter design is a
    measure of what we let you get away with.
    3) Games come in slightly lower in power. Maybe I can set up
    a simulation test case, while the GPU is still under design,
    that models the node toggle rates seen when running a popular
    benchmark (Crysis). Then, running the power estimate software
    available to chip designers, the software will give a number
    accurate to 10% (from cell library characterization). But
    that comes with the large error bars surrounding the construction
    of the test case in the first place (what version of Crysis?).

    When all is said and done, let's say the number is 120W. Or
    maybe, as the GPU or card designer, we set the limit to 120W,
    throttling as needed.

    Now, if I go to the Xbitlabs site, where they do actual video
    card measurements, my 120W estimated card is being measured as
    70W. And guys like me, out on the Internet, are telling people
    to plan for a 70W load. Which isn't strictly accurate. The
    120W number might be a 3-sigma tail estimate, to cover the
    "worst case card". Maybe most of the cards draw around 70W, but
    a few stinkers go all the way to 120W. Since Xbitlabs measures
    only a card or two in their lab, their measurement is not
    statistically significant.
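    That "3-sigma tail" idea can be put into numbers. Taking the
    post's illustrative figures (a typical card measured at 70W, a
    rating of 120W) and assuming, purely for the sketch, a normal
    spread of card-to-card power draw:

```python
import math

def normal_cdf(x, mean, sigma):
    # Standard normal CDF via the error function (stdlib math only).
    return 0.5 * (1.0 + math.erf((x - mean) / (sigma * math.sqrt(2.0))))

typical_w = 70.0   # what a lab like Xbitlabs might measure on one card
rated_w = 120.0    # the designer's "worst case card" rating

# If 120 W is taken as a 3-sigma point above a 70 W mean:
sigma = (rated_w - typical_w) / 3.0               # ~16.7 W spread
frac_over = 1.0 - normal_cdf(rated_w, typical_w, sigma)
# Only about 0.13% of cards would exceed the 120 W rating, which is
# why a one- or two-card lab measurement looks nothing like it.
```

    The assumed distribution is the whole trick here: the rating and
    the measurement are both "right", they just describe different
    points on the same curve.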

    See what a mess this is ? And why nobody in their right mind
    wants to tell you the power ? :)

    OK, so if I tell you the power is 70W, and the card has one cable
    (the third of the four classes), I'm probably not too far off :)
    Chances are there's enough over-estimation in the selection
    of your power supply that you or I will never know the
    difference. Only the thermal result ("my computer case is
    too hot") remains as a potential issue.

    Only when a user comes here with a Shuttle that has a 200W
    power supply in it are we in serious trouble as estimators.
    For those, you *really* need to know your stuff. And the
    user has to be prepared to send a video card back, if the
    PSU shuts off :)
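    The small-PSU case can be sketched as a toy headroom check. All
    the component figures below are assumptions made up for the
    illustration, as is the 80% derating rule of thumb; none of them
    come from the post or from any datasheet.

```python
# Toy headroom check for a small-form-factor build with a 200 W PSU.
# Every component figure here is an assumed round number.

psu_rating_w = 200           # e.g. a Shuttle with a 200 W supply

load_w = {
    "cpu":         55,       # assumed low-power CPU
    "motherboard": 25,
    "drives":      15,
    "fans_misc":   10,
}

def fits(card_w, margin=0.8):
    """True if total draw stays under 80% of the PSU rating
    (a common rule-of-thumb derating, not a spec requirement)."""
    total = sum(load_w.values()) + card_w
    return total <= margin * psu_rating_w

# With these assumptions, a ~50 W class-2 card squeaks in,
# while a 120 W worst-case card does not.
```

    The point of the margin is exactly the "worst case card" problem
    above: sizing against the 70W measurement instead of the 120W
    rating is how a PSU ends up shutting off.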


    Xbitlabs has stopped measuring card power numbers.

    Enjoy this last summary, as an indication of the measured
    values of a few cards.


    Note the Geforce 210 drawing 8.7W while playing Crysis.
    That's how you get away without a fan for cooling.

    (No cables, no fan, Geforce 210, gutless)

    Of course, the Geforce 210 only runs Crysis at three frames
    per second, so I guess it's no surprise that it doesn't
    use much power doing so. Selecting the antialiasing levels
    that they did didn't help matters by any stretch of the
    imagination. A person who could only afford a 210 would
    have turned off the antialiasing.


    When Xbitlabs doesn't list your card, I go here. The power
    numbers are "3-sigma crap", so don't get too carried away.


    Max Power Draw: 30.5 W <--- Ummm, OK... Sure...

    See the size of the error bars involved? Wetting a
    finger and sticking it into the air will do you as much good.
    If the card really used 30.5W, that fanless heatsink would
    be scorchingly hot.
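    A back-of-envelope check shows why. Steady-state heatsink
    temperature is roughly ambient plus power times thermal
    resistance; the 3 degC/W figure below is an assumed round number
    for a small passive heatsink with little airflow, not a measured
    value for any real card.

```python
# Sanity check on a "30.5 W, fanless" claim using the basic
# steady-state relation T = T_ambient + P * theta. The thermal
# resistance is an assumed round number for a small passive sink.

ambient_c = 35.0    # warm case interior, degC
theta_sa = 3.0      # assumed sink-to-air thermal resistance, degC/W

def sink_temp(power_w, theta=theta_sa, ambient=ambient_c):
    """Steady-state heatsink temperature in degC."""
    return ambient + power_w * theta

# At a claimed 30.5 W, the sink would sit near 126 degC --
# scorching, which is why the 30.5 W number is hard to believe.
hot = sink_temp(30.5)
```

    At a believable fanless figure like 10W, the same sketch gives a
    sink around 65 degC, which is hot to the touch but survivable.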

    Paul, Jun 30, 2013

  3. Robert Miles

    Robert Miles Guest

    Nvidia, at least, gives a recommended minimum power supply
    rating for graphics cards based on their GPU chips, except
    for some of the lower-end cards. Sometimes they give power use
    ratings for those cards as well. However, they make them hard
    to find; you may have to read many of their web pages on those
    cards to find where they put the power ratings.


    Hint - start by looking up one of the recent high-end cards
    first, to get an idea of where they put that information;
    it is often near the end of the specifications section.
    For example, the GTX Titan card uses 250 watts for the card alone.

    I've read of some graphics cards rated at using 300 watts
    for the card alone (about the limit for many PC cases), but
    don't remember for which cards.

    A quick look over at the AMD/ATI site didn't find any
    similar power information.


    For the higher-power cards, try to get one where the card
    has a fan and a fan casing designed so that the fan blows the
    card's hot air out of the computer's case.
    Robert Miles, Jul 1, 2013