1. This forum section is a read-only archive which contains old newsgroup posts.

Lower Power Utilization for High End Video Card?

Discussion in 'ATI' started by W, Nov 25, 2012.

  1. W

    W Guest

    I have an older XP computer in a living room on which I installed an nVidia
    GeForce 8800 Ultra. The card performs well, but to my disbelief, with the
    monitor turned off and the computer doing nothing but displaying an inactive
    Windows desktop, the nVidia card is consuming about 160 watts of energy
    continuously. Since the system is only used to run a few virtual machines
    about 99% of the time, that is a lot of wasted energy. I want a card
    that can stop burning watts when it is in a low-use mode.

    Does anyone make a top-tier video card that can drop itself to a minimal
    power state when it is not being used heavily? I read somewhere that some
    newer AMD Eyefinity-generation cards could get power utilization in an
    unused mode down under 20 watts. What are the details on that?

    --
    W
     
    W, Nov 25, 2012
    #1

  2. Paul

    Paul Guest

    W wrote:
    > I have an older XP computer in a living room on which I installed an nVidia
    > GEForce 8800 Ultra. The card performs well, but to my disbelief with the
    > monitor turned off and the computer doing nothing but displaying an inactive
    > Windows desktop, the nVidia card is consuming about 160 watts of energy
    > continuously. Since the system is only used to run a few virtual machines
    > about 99% of the time, that is a lot of wasted energy. I want a card
    > that can stop burning watts when it is in a low use mode.
    >
    > Does anyone make a top tier video card that can power itself to a minimum
    > power utilization mode when the card is not being used heavily? I read
    > somewhere that some newer version of AMD Eyefinity could get power
    > utilization in an unused mode down under 20 watts. What are details on
    > that?
    >


    This is true of newer cards from either company.

    The ratio of 3D_max to idle is improving. Your card could be 70W
    at idle (measured at the card), and newer cards have actually
    improved on that.

    Xbitlabs.com used to do per-rail power measurement, using
    a specially modified motherboard, but they've stopped doing
    that, so we no longer have those measurements available
    for newer cards. All they do now is system power measurements,
    which are useless for determining the exact 3D_max to idle ratio.
    (If they had a "system power with no video present" measurement,
    their measurements would have some value.)

    All I can tell you is that a newer card will *likely* be lower
    at idle. The 8800 is still from the "bad" days.

    The link below is another of those sites that only does system
    power. HD 7970 "system idle" 113W, "system 3D Max" 391W. So the
    idle is better there. Your card is around 70W idle, 131W max,
    which means ratio-wise it doesn't do that well at idle.

    http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/27

    There was an era when silicon gates were relatively leaky.
    Intel's Prescott was an example of that: roughly 25% of its DC
    power was simply wasted as heat, and did nothing for you. While
    chips still leak, more development work has gone into gate
    structures that don't leak quite as badly as that. (The geometry
    of the gates shrunk, and the gates and silicon structures had to
    be redesigned to keep leakage from rising past Prescott-era
    levels.) The other improvement comes from clock gating, where
    desktop cards now behave more like mobile graphics.

    There's a good chance that no matter what card you buy,
    it'll do better than your 70W-idle 8800-family card.

    Paul
     
    Paul, Nov 25, 2012
    #2

  3. Tom

    Tom Guest

    Hey Paul... do you have a web site that may collect your learned answers? I
    certainly get a lot from your explanations.

    T2

    "Paul" wrote in message news:k8tl2k$9g8$...

    [snip]
     
    Tom, Nov 25, 2012
    #3
  4. W

    W Guest

    "Paul" <> wrote in message
    news:k8tl2k$9g8$...
    > W wrote:
    > [snip]
    >
    > This is another one of those sites that only does system power.
    > HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
    > is better there. Your card is around 70W idle, 131W max, which
    > means ratio-wise, it doesn't do that well at idle.
    >
    > http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/27
    >
    > [snip]


    It's not clear what the data you report means if it is a measurement of
    total power used by the system. You would have to subtract out the system's
    power use with no video card installed to get any kind of proxy for the
    power used by the video card alone.
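    To make that subtraction concrete, here is a minimal sketch of the baseline method. Both wall-meter readings below are invented numbers, purely for illustration:

```python
# Proxy for card-only power: total system draw measured at the wall,
# minus the same system measured with no discrete video card installed.
# Both readings below are hypothetical, for illustration only.

def card_power(system_watts: float, baseline_watts: float) -> float:
    """Estimate of the video card's draw, by subtraction."""
    return system_watts - baseline_watts

system_idle = 130.0    # W at the wall, card installed, desktop idle (assumed)
baseline_idle = 60.0   # W at the wall, card removed (assumed)
print(f"estimated card idle draw: {card_power(system_idle, baseline_idle):.0f} W")
```

    The estimate is only as good as the baseline: the "no card" run should keep everything else (drives, CPU load, PSU) identical.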

    This article:


    http://us.digitalversus.com/graphics-card/amd-radeon-hd-7850-1-gb-p14621/test.html

    in the section named "Power Use", suggests that the ATI 7850 can go into
    an idle mode that uses 3 watts. Effectively, the card turns itself off:

    "Better still, the excellent ZeroCore Power feature gives a 16% reduction in
    energy consumption at idle and allows you to turn the card's fan off. For
    this, the computer has to be configured so that it switches the screen off
    after a given period of time. As soon as the screen goes on standby, the
    card is almost entirely switched off and only consumes 3 watts of power,
    bringing the overall consumption of our test computer down to 74 watts."
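    Those quoted figures hang together; a quick arithmetic check (the pre-ZeroCore idle figure is inferred here, not stated in the article):

```python
# Sanity check on the DigitalVersus numbers: a 16% idle reduction that
# lands at 74 W total implies roughly 88 W before ZeroCore engages.
# The 88 W "before" figure is inferred, not quoted anywhere.

after_zerocore = 74.0   # W, test system total with ZeroCore active (quoted)
reduction = 0.16        # 16% idle-power reduction (quoted)

before_zerocore = after_zerocore / (1 - reduction)
print(f"implied idle before ZeroCore: {before_zerocore:.0f} W")
```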

    On my system, the nVidia 8800 Ultra is consuming 160 watts *just for the
    video card* while the system is in an idle state. 3 watts versus 160
    watts is a huge difference.

    --
    W
     
    W, Nov 26, 2012
    #4
  5. Paul

    Paul Guest

    Tom wrote:
    > Hey Paul... do you have a web site that may collect your learned
    > answers? I certainly get a lot from your explanations.
    >
    > T2


    Google Groups archives the contents of the news groups.

    Most of the info I gather is "out there". It's available
    on enthusiast sites, where occasionally someone from the
    factory might mention some of this stuff. In cases like
    Intel redesigning their silicon, there have been
    articles in the public domain about that. (Intel took
    a lot more chances during its evolution than AMD did.
    Intel turned their transistors "upside-down", for example,
    when they redid their smaller-geometry processes. AMD has
    one tenth the staff, and can't afford that level of
    research.)

    I have experience at a silicon fab, but that's back
    in the days when leakage current was precisely "zero".
    So my experience doesn't count for anything. My old fab
    is gone now, and a drug company uses the building.
    Anything silicon related has long since been thrown away.

    This is the article I was looking for earlier but
    couldn't find at the time. It has per-rail power
    measurements from 2010. Some of the cards have
    pretty low power, like the HD 5450 at 3.2 watts (idle)
    and the GeForce 210 at 3.9 watts (idle). The problem
    now is getting an article of this quality in the
    year 2012.

    http://www.xbitlabs.com/articles/graphics/display/gpu-power-consumption-2010_3.html#sect0

    The HD 5970 there is 44.4 watts (idle) and 240.7 watts (3D_max),
    so that's about a factor of 5 between the two.
    (I don't count OCCT, as it's one of a few synthetic tests
    that I wouldn't normally run here. In fact, some graphics
    drivers have features to detect things like OCCT or Furmark,
    and detune things so the card doesn't get damaged.)

    http://www.generation-gpu.fr/UserImgs/imgs/ATi/HD5870/OCCT.jpg
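    Those per-card numbers make the improving idle-to-max ratio easy to see; a quick comparison using the figures cited in this thread:

```python
# Idle-to-3D_max ratio for two cards, using the card-only measurements
# quoted in this thread (Xbitlabs data: 8800 non-Ultra and HD 5970).

cards = {
    "GeForce 8800 (non-Ultra)": (70.0, 131.0),   # (idle W, 3D_max W)
    "Radeon HD 5970":           (44.4, 240.7),
}

for name, (idle, peak) in cards.items():
    # The 8800 idles at over half its max draw; the HD 5970 at under a fifth.
    print(f"{name}: idle = {idle / peak:.0%} of 3D max")
```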

    So if a person can stand the crappy performance of a low-end
    card (for gaming), their idle power is exceptionally low.
    Cards like my old 9800 Pro, might be around 35 watts by
    comparison. Your room isn't going to get very
    warm, with a 3.2 watt card.

    Paul
     
    Paul, Nov 26, 2012
    #5
  6. Paul

    Paul Guest

    W wrote:
    > "Paul" <> wrote in message
    > news:k8tl2k$9g8$...
    >> [snip]
    >
    > It's not clear what the data you report means if it is a measurement of
    > total power used by the system. You would have to subtract out the system
    > power use when no video card is installed to get any kind of proxy for power
    > used by the video card alone?
    >
    > This article:
    >
    > http://us.digitalversus.com/graphics-card/amd-radeon-hd-7850-1-gb-p14621/test.html
    >
    > in the section named "Power Use" is suggesting that the ATI 7850 can go into
    > an idle mode that uses 3 watts. Effectively the card turns itself off:
    >
    > "Better still, the excellent ZeroCore Power feature gives a 16% reduction in
    > energy consumption at idle and allows you to turn the card's fan off. For
    > this, the computer has to be configured so that it switches the screen off
    > after a given period of time. As soon as the screen goes on standby, the
    > card is almost entirely switched off and only consumes 3 watts of power,
    > bringing the overall consumption of our test computer down to 74 watts."
    >
    > On my system, the nVidia 8800 Ultra is consuming 160 watts *just for the
    > power card* and when the system is in idle state. 3 watts versus 160
    > watts is a huge difference?

    The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.

    In these kinds of articles, as far as I know, the "Idle" power is with the
    desktop still visible and the user no longer pushing the mouse around.
    These are not system power numbers; these are video card only, measured with
    current shunts in 3.3V_slot, 12V_slot, 12V_PCIE#1 and 12V_PCIE#2 (if they exist).
    Xbitlabs have stopped doing it this way, because it looks like they got
    another motherboard and aren't interested in fitting the shunts.

    http://www.xbitlabs.com/articles/graphics/display/gpu-power-consumption-2010_3.html#sect0
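    The per-rail shunt method boils down to Ohm's law per rail: measure the voltage drop across a known shunt resistance, convert to current, multiply by the rail voltage, and sum over the rails feeding the card. A rough sketch; the shunt value and all millivolt readings here are invented for illustration:

```python
# Card power from per-rail current shunts: I = V_shunt / R_shunt,
# P = V_rail * I, summed over every rail feeding the card.
# R_SHUNT and the millivolt readings are made-up illustration values.

R_SHUNT = 0.005  # ohms; a typical low-value shunt (assumed)

def rail_power(rail_volts: float, shunt_mv: float) -> float:
    current = (shunt_mv / 1000.0) / R_SHUNT  # amps through the shunt
    return rail_volts * current              # watts delivered on this rail

# hypothetical idle readings: 3.3V_slot, 12V_slot, 12V_PCIE#1, 12V_PCIE#2
rails = [(3.3, 2.0), (12.0, 10.0), (12.0, 8.0), (12.0, 7.0)]
total = sum(rail_power(v, mv) for v, mv in rails)
print(f"card power: {total:.1f} W")
```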

    The idle power of the card varies with the card's processing power in those
    charts. The HD 5970 for example, is still 44.4W for the card. A low
    end card like the HD 5450 is 3.2W idle.

    Turning off the screen is good for servers, but isn't the best
    choice for a desktop. Mainly because a desktop is more
    interactive, and if you aren't using it, chances are you've
    used S3 sleep or S4 hibernate.

    Paul
     
    Paul, Nov 26, 2012
    #6
  7. W

    W Guest

    "Paul" <> wrote in message
    news:k8uq72$jfc$...
    > W wrote:
    >> [snip]
    >
    > The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.
    >
    > In these kinds of articles, as far as I know, the "Idle" power is with
    > desktop still visible and the user has stopped pushing the mouse around.
    > These are not system power numbers, these are video card only, measured with
    > current shunt in 3.3V_slot, 12V_slot, 12V_PCIE#1 and 12V_PCIE#2 (if they exist).
    > Xbitlabs have stopped doing it this way, because it looks like they got
    > another motherboard, and aren't interested in fitting the shunts.
    >
    > http://www.xbitlabs.com/articles/graphics/display/gpu-power-consumption-2010_3.html#sect0
    >
    > The idle power of the card varies with the card's processing power in those
    > charts. The HD 5970 for example, is still 44.4W for the card. A low
    > end card like the HD 5450 is 3.2W idle.


    If I believe the AMD web site, the Radeon ZeroCore Power technology will
    put the video card into a sleep state that uses less than 4 watts while the
    system is running with the screen off. Good or bad, my computer will act
    as a server and the system will not sleep. But the screen will be off
    99% of the time, and during that time I want to minimize the power draw.

    What is the most powerful AMD video card that fully implements the ZeroCore
    technology today?
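    For scale, the gap between a card idling at 160 W and one sleeping at ~4 W, on a machine whose screen is off ~99% of the time, is straightforward to estimate (the electricity rate below is an assumption, not from this thread):

```python
# Rough annual energy/cost gap between a 160 W idle card and a ~4 W
# ZeroCore-style sleep state. The $0.12/kWh rate is an assumed figure.

HOURS_PER_YEAR = 24 * 365
IDLE_FRACTION = 0.99          # screen off about 99% of the time
RATE_USD_PER_KWH = 0.12       # assumed electricity price

def annual_kwh(watts: float) -> float:
    return watts * HOURS_PER_YEAR * IDLE_FRACTION / 1000.0

saved = annual_kwh(160.0) - annual_kwh(4.0)
print(f"~{saved:.0f} kWh/year saved, roughly ${saved * RATE_USD_PER_KWH:.0f}")
```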


    > Turning off the screen is good for servers, but for a desktop
    > isn't the best choice. Mainly because a desktop is more
    > interactive, and if you aren't using it, chances are you've
    > used S3 sleep or S4 Hibernate.


    It is not that unusual for a computer to act like a server and run virtual
    machines in the background. In my case those run a home active directory
    and some other administrative servers.

    --
    W
     
    W, Nov 26, 2012
    #7
  8. "W" <> wrote in message
     news:...
     > [snip]
     >
     > If I believe the AMD web site, the RADEON ZeroCore power technology will
     > put the video card into a sleep state that uses less than 4 watts while
     > the system is running with the screen off. Good or bad, my computer will
     > act as a server and the system will not sleep. But the screen will be
     > resting 99% of the time and during that rest time I want to minimize the
     > power draw.
     >
     > What is the most powerful AMD video card that fully implements the
     > ZeroCore technology today?
     >
     > It is not that unusual for a computer to act like a server and run
     > virtual machines in the background. In my case those run a home active
     > directory and some other administrative servers.
     >
     > --
     > W

    On AMD's web page for the AMD Radeon HD 7970 GHz Edition:

    http://www.amd.com/us/products/desktop/graphics/7000/7970ghz/Pages/radeon-7970GHz.aspx#3

    AMD ZeroCore Power technology*
    . Ultra-low idle power when the system's display is off
    . Efficient low power mode for desktop work
    . Secondary GPUs in an AMD CrossFire™ technology configuration power down
    when unneeded

    * AMD PowerPlay™, AMD PowerTune and AMD ZeroCore Power are technologies
    offered by certain AMD Radeon™ products, which are designed to intelligently
    manage GPU power consumption in response to certain GPU load conditions.
    Not all products feature all technologies - check with your component or
    system manufacturer for specific model capabilities.

    It seems to be up to the add-in board partner whether or not they want to
    implement the feature.
     
    Homer Jay Simpson, Nov 26, 2012
    #8
  9. W

    W Guest

    "Homer Jay Simpson" <> wrote in message
    news:k907ua$29n$...
    > "W" <> wrote in message
    > news:...
    > > "Paul" <> wrote in message
    > > news:k8uq72$jfc$...
    > >> W wrote:
    > >> > "Paul" <> wrote in message
    > >> > news:k8tl2k$9g8$...
    > >> >> W wrote:
    > >> >>> I have an older XP computer in a living room on which I installed
    > >> >>> an nVidia GEForce 8800 Ultra. The card performs well, but to my
    > >> >>> disbelief, with the monitor turned off and the computer doing
    > >> >>> nothing but displaying an inactive Windows desktop, the nVidia
    > >> >>> card is consuming about 160 watts of energy continuously. Since
    > >> >>> the system is only used to run a few virtual machines about 99%
    > >> >>> of the time, that is a lot of wasted energy. I want a card that
    > >> >>> can stop burning watts when it is in a low-use mode.
    > >> >>>
    > >> >>> Does anyone make a top-tier video card that can power itself down
    > >> >>> to a minimum power utilization mode when the card is not being
    > >> >>> used heavily? I read somewhere that some newer version of AMD
    > >> >>> Eyefinity could get power utilization in an unused mode down
    > >> >>> under 20 watts. What are the details on that?
    > >> >>>
    > >> >> This is true of newer cards from either company.
    > >> >>
    > >> >> The ratio of 3D_max to Idle is improving. Your card could be 70W
    > >> >> at idle (measured at the card), and newer cards have actually
    > >> >> improved on that.
    > >> >>
    > >> >> Xbitlabs.com used to do per-rail power measurement, using
    > >> >> a specially modified motherboard, but they've stopped doing
    > >> >> that, and so we no longer have those measurements available
    > >> >> for newer cards. All they do now is system power measurements,
    > >> >> which are useless for determining the exact 3D_max to Idle ratio.
    > >> >> (If they had a "system power with no video present" measurement,
    > >> >> then, their measurements would have some value.)
    > >> >>
    > >> >> All I can tell you, is a newer card will *likely* be lower
    > >> >> at idle. The 8800 is still back in the "bad" days.
    > >> >>
    > >> >> This is another one of those sites that only does system power.
    > >> >> HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
    > >> >> is better there. Your card is around 70W idle, 131W max, which
    > >> >> means ratio-wise, it doesn't do that well at idle.
    > >> >>
    > >> >> http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/27
    > >> >>
    > >> >> There was an era, when silicon gates were relatively leaky.
    > >> >> Intel Prescott was an example of that, where 25% of DC power
    > >> >> was just wasted as heat, and did nothing for you. While chips
    > >> >> still leak, more development work has gone into making
    > >> >> structures for gates, which don't leak quite as bad as that.
    > >> >> (The geometry of the gates shrunk, and the gates and silicon
    > >> >> structures had to be redesigned to prevent leakage from them
    > >> >> rising worse than the Prescott era.) The other improvement
    > >> >> comes from clock gating - where desktop cards are now closer
    > >> >> to how mobile graphics work, in terms of clock gating.
    > >> >>
    > >> >> There's a good chance, that no matter what card you buy,
    > >> >> it'll do better than your 70W idle 8800 family card.
    > >> >
    > >> > It's not clear what the data you report means if it is a measurement
    > >> > of total power used by the system. You would have to subtract out
    > >> > the system power use when no video card is installed to get any kind
    > >> > of proxy for the power used by the video card alone.
    > >> >
    > >> > This article:
    > >> >
    > >> > http://us.digitalversus.com/graphics-card/amd-radeon-hd-7850-1-gb-p14621/test.html
    > >> >
    > >> > in the section named "Power Use" is suggesting that the ATI 7850 can
    > >> > go into an idle mode that uses 3 watts. Effectively the card turns
    > >> > itself off:
    > >> >
    > >> > "Better still, the excellent ZeroCore Power feature gives a 16%
    > >> > reduction in energy consumption at idle and allows you to turn the
    > >> > card's fan off. For this, the computer has to be configured so that
    > >> > it switches the screen off after a given period of time. As soon as
    > >> > the screen goes on standby, the card is almost entirely switched off
    > >> > and only consumes 3 watts of power, bringing the overall consumption
    > >> > of our test computer down to 74 watts."
    > >> >
    > >> > On my system, the nVidia 8800 Ultra is consuming 160 watts *just for
    > >> > the video card* while the system is in the idle state. 3 watts
    > >> > versus 160 watts is a huge difference.
    > >> >
    > [snip]
    >
    > On AMD's web page for the AMD Radeon HD 7970 GHz Edition:
    >
    > http://www.amd.com/us/products/desktop/graphics/7000/7970ghz/Pages/radeon-7970GHz.aspx#3
    >
    > AMD ZeroCore Power technology*
    > . Ultra-low idle power when the system's display is off
    > . Efficient low power mode for desktop work
    > . Secondary GPUs in an AMD CrossFire™ technology configuration power
    > down when unneeded
    >
    > * AMD PowerPlay™, AMD PowerTune and AMD ZeroCore Power are technologies
    > offered by certain AMD Radeon™ products, which are designed to
    > intelligently manage GPU power consumption in response to certain GPU
    > load conditions. Not all products feature all technologies - check with
    > your component or system manufacturer for specific model capabilities.
    >
    > It seems to be up to the add-in board partner whether or not they want
    > to implement the feature.


    Right. Which leads back to my question: what is the most powerful AMD
    video card that fully implements the ZeroCore technology today?

    I want a card that is in the top 10% of performance and that uses under 4
    watts when the video is idle.
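For scale, here is a quick back-of-the-envelope comparison of the two idle figures discussed in this thread (160 W for the 8800 Ultra versus a sub-4 W ZeroCore state) for a box that never sleeps. The $0.12/kWh electricity rate is an assumed figure for illustration only:

```python
# Rough yearly cost of a GPU's idle draw for an always-on machine.
# The electricity rate below is an assumption, not a figure from the
# thread; substitute your own utility's rate.
HOURS_PER_YEAR = 24 * 365
RATE_USD_PER_KWH = 0.12  # assumed rate for illustration

def yearly_cost(watts: float) -> float:
    """Cost in USD of a constant electrical draw of `watts` for a year."""
    kwh = watts * HOURS_PER_YEAR / 1000.0
    return kwh * RATE_USD_PER_KWH

print(f"160 W idle:  ${yearly_cost(160.0):7.2f}/yr")  # 8800 Ultra at idle
print(f"  4 W sleep: ${yearly_cost(4.0):7.2f}/yr")    # ZeroCore-style state
```

At that assumed rate the difference is well over a hundred dollars a year, which is why the idle figure matters far more than the 3D-load figure for a machine that idles 99% of the time.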

    --
    W
     
    W, Nov 27, 2012
    #9
  10. W

    Paul Guest

    W wrote:
    > "Homer Jay Simpson" <> wrote in message
    > news:k907ua$29n$...


    >> It seems to be up to the add-in board partner whether or not they want to
    >> implement the feature.

    >
    > Right. Which leads back to my question: what is the most powerful AMD
    > video card that fully implements the ZeroCore technology today?
    >
    > I want a card that is in the top 10% of performance and that uses under
    > 4 watts when the video is idle.


    Most of the designs out there use information from a reference
    implementation. Video card designers just don't run amok by themselves;
    they need lots of help.

    If you needed to turn off the core power, all you need is a core
    switching regulator with a "zero volts" VID setting, since current video
    cards send a VID code to the regulator. A zero setting would be
    translated by the regulator as a request to turn off. This was done
    years ago on CPU VCore regulators, which turn off when the VID lines are
    left in a floating state. So it would all depend on whether the
    regulators used (like Volterra) support a feature like that. The rest of
    the support comes from the design of the GPU itself (like separate power
    planes for the appropriate subsystems, as it would be profitable to
    maintain some state information while in the ZeroCore state - you need
    to drive the VID lines, for example).
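To make the "zero VID means off" idea concrete, here is a small sketch. The 6-bit table, the 0.80 V base, and the 12.5 mV step are all invented for this illustration; real regulators follow vendor VID specifications and differ in range and step size:

```python
from typing import Optional

# Hypothetical 6-bit VID decode for a core switching regulator.
# Code 0 is reserved to mean "turn the core rail off" - the
# ZeroCore-style request. The voltage ladder below is invented.

def vid_to_setpoint(vid: int) -> Optional[float]:
    """Translate a VID code into a core-voltage setpoint in volts.

    Returns None to mean "regulator off".
    """
    if not 0 <= vid <= 0x3F:
        raise ValueError("VID out of range")
    if vid == 0:
        return None                      # shut the core rail down
    return 0.80 + (vid - 1) * 0.0125     # linear voltage ladder

assert vid_to_setpoint(0) is None        # display off -> core rail off
print(f"running setpoint: {vid_to_setpoint(17):.4f} V")
```

The point is only that reserving one code as "rail off" lets the GPU request its own shutdown over the same VID pins it already drives while running.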

    No regular website is going to be measuring the ZeroCore condition.
    (And since Xbitlabs "got lazy", they'd have been the best technically
    equipped to do such work. But they don't have the motherboard any more.)
    I'd never heard of ZeroCore until you mentioned it. It requires that the
    chip be split into pieces, such that the PCI Express portion remains
    running while the core is powered down. (Otherwise, the user is going
    to see side effects from hot-insertion-like behavior.) On a
    non-ZeroCore card, I would expect two regulators: one for the core, one
    for the memory and memory interface. Perhaps the PCI Express portion
    can draw power from the same one as the memory? You'd probably want to
    maintain video card memory state (self-refresh) while in the ZeroCore
    condition, as otherwise there'd be a noticeable delay if the video
    memory had to be flushed to system memory.

    This sounds like a question that only someone in tech support
    at ATI or Nvidia could answer, and would likely require consultation
    with engineering.

    ********

    Using ZeroCore as a search term, I can see a user having problems with it.
    And the problems are visible with the 12.10 driver (that's like a
    month ago).

    http://devforums.amd.com/game/messageview.cfm?catid=440&threadid=161791

    "I called AMD and told him about my problem. He assured me they know
    about the ZeroCore problem and have been looking into it. The first
    thing he said to try is installing the 12.11 beta drivers. If the
    problem is still occurring, then he wanted me to run msconfig and
    choose Selective Startup, unchecking both the Load Services and Load
    Startup options. If ZeroCore works, then it means that either a startup
    service or startup application is causing ZeroCore to fail. He gave me
    a workaround: just turn off the monitor-sleep function, since that is
    the functionality that turns ZeroCore on. If that's disabled, ZeroCore
    doesn't turn on, so it will just run at 20% until the whole system goes
    into hibernate mode. I have been doing that and just turning off the
    monitor with the power button, since he said that ZeroCore is activated
    when the monitor is told to go to sleep by the O.S."

    That would be selecting S1 sleep state, as far as I know.

    But at least I got a link to an article on when it was introduced.
    It confirms my basic ideas on how you'd implement it (make an
    island out of core, leave some peripheral stuff powered).

    http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/11

    HD 7970 was introduced a year ago (2011-12-22), according to this.
    You'd expect it as a feature on any ATI card more modern than that
    (unless a card is introduced using older silicon of course).

    http://www.gpureview.com/videocards.php

    The only practical way to watch ZeroCore is with an external power
    meter, as expecting the card to answer probes while in the ZeroCore
    state is expecting a lot. In lots of low power states on computers,
    like say C6, the mere act of probing the device upsets the power
    state and gives the wrong answer. It would take careful engineering
    of the ZeroCore feature to ensure you could actively monitor the
    thing while it's drawing only 3 watts. Using external monitoring
    removes all uncertainty. Hearing the fan spin does *not* mean it is
    broken. Even at 3 watts dissipation, the fan might need to run
    occasionally, and it would be stupid to turn off the fan entirely
    while in ZeroCore. The cooling system should be ready for action at
    any time, as temperatures permit. You don't want the GPU to overheat
    in any circumstance. Hearing the fan does suggest something is still
    drawing power, though.
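The fan behavior described above can be sketched as simple hysteresis control: the fan stays off while the card slowly warms, spins up at an upper threshold, and shuts off again at a lower one. All thresholds and the per-step thermal model here are invented for illustration:

```python
# Toy hysteresis fan controller: even a ~3 W card warms up slowly, so
# the fan kicks in now and then rather than staying off forever.
# Thresholds and heating/cooling rates are invented for illustration.
FAN_ON_C = 55.0   # spin the fan up at this temperature
FAN_OFF_C = 45.0  # spin it back down at this one

def step(temp, fan_on, heat=0.2, cool=0.5):
    """Advance one time step; returns (new_temp, new_fan_state)."""
    if fan_on and temp <= FAN_OFF_C:
        fan_on = False
    elif not fan_on and temp >= FAN_ON_C:
        fan_on = True
    temp += -cool if fan_on else heat   # slow heating vs. forced cooling
    return temp, fan_on

temp, fan, on_steps = 40.0, False, 0
for _ in range(1000):
    temp, fan = step(temp, fan)
    on_steps += fan
print(f"fan ran {on_steps / 10:.0f}% of the time")  # occasional, not never
```

Under this toy model the fan runs a fraction of the time even though the dissipation is tiny, which matches the point that an occasionally audible fan in ZeroCore is not, by itself, a sign the feature is broken.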

    Paul
     
    Paul, Nov 27, 2012
    #10
