
CPU Efficiency

Discussion in 'Intel' started by Tim, Jan 28, 2005.

  1. Tim

    Tim Guest

    If a 3GHz P4 draws 90 watts from the power supply, what percentage of that
    90 watts is actually used for processing data and not winding up in the form
    of heat? Theoretically, if a CPU were superconducting it would have 100%
    efficiency, dissipating no heat. But I'm guessing that a 3GHz P4 uses as
    much power creating heat as it does crunching data. Do CPUs become less power
    efficient as their speeds increase? TIA
     
    Tim, Jan 28, 2005
    #1

  2. Less than 1% is actually used to crunch data, almost negligible. All of
    the power, effectively, is wasted as heat.

    DS
     
    David Schwartz, Jan 28, 2005
    #2

  3. Tim

    Tim Guest


    Thanks for your reply. Has this always been the case? Back in the days of
    20MHz processors, was 99% of the power still wasted as heat?
     
    Tim, Jan 28, 2005
    #3
  4. Tim

    DER Guest

    Yep!
     
    DER, Jan 28, 2005
    #4
  5. Yes. From a thermodynamic viewpoint, semiconductors are amazingly
    inefficient, unless you are using them as resistance heaters, in which case
    they are nearly 100% efficient.

    Look at it this way: theoretically, how much energy does it take to
    perform a single bit operation? Technically it just requires the presence or
    absence of a single particle, say a photon. How much energy does it take to
    stop one photon? However, semiconductors use large numbers of bulky
    electrons. Worse, semiconductors go through intermediate states between 'on'
    and 'off' wherein they are basically heaters. This is one reason power goes
    up as clock speed increases: comparatively more time is spent in between on
    and off.

    DS
     
    David Schwartz, Jan 28, 2005
    #5
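
    As a rough illustration of why power rises with clock speed: CMOS dynamic
    power follows roughly P = a*C*V^2*f (activity factor times switched
    capacitance times voltage squared times frequency). A minimal Python
    sketch; the activity factor, capacitance, and voltage below are assumed
    illustrative values, not measured P4 parameters.

        # Approximate CMOS dynamic power: P = alpha * C * V^2 * f
        # All parameter values are illustrative assumptions, not P4 data.
        alpha = 0.1           # fraction of gates switching each cycle (assumed)
        C = 150e-9            # effective switched capacitance in farads (assumed)
        V = 1.4               # core voltage in volts (assumed)
        for f_ghz in (1.0, 2.0, 3.0):
            p = alpha * C * V ** 2 * (f_ghz * 1e9)   # watts
            print(f"{f_ghz:.1f} GHz -> ~{p:.0f} W of dynamic switching power")
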
  6. Tim

    Tim Guest

    Thanks for the explanation. Are there alternate substances available, with
    better thermodynamic properties, for making processors? If so, I would
    imagine that cost is an issue.
     
    Tim, Jan 28, 2005
    #6
  7. Nobody has made more efficient computation practical yet, with the
    possible exception of massively parallel computations using DNA, which is
    only suitable for certain types of problems (like searches). There really
    isn't much point.

    DS
     
    David Schwartz, Jan 29, 2005
    #7
  8. Tim

    Tim Guest

    I think there will eventually be a market for a more efficient alternative.
    For example, portables could have a longer battery life and batteries could
    be far less massive.
     
    Tim, Jan 29, 2005
    #8
  9. Tim

    alexi Guest

    I wonder how you arrived at 1%. IMO, the question is incorrectly
    formulated. I would say that all 100% of the power (if we neglect leakage
    for a moment) is used to process data AND winds up in the form of heat.
    All this power is used to switch states in loops of logical equations
    and memory elements.

    Now, the real question is: is this too much, or what? If you compare
    the Pentium 4 with vacuum tube computers, I would say that the P4 is pretty
    efficient, especially if the power is normalized per flip-flop. Until
    the 180nm node, there was a proportional increase in the efficiency of
    individual gates and flip-flops, with about zero energy required to hold
    logical states. However, further device down-scaling eventually hit the
    natural atomic-size limitations, and it becomes increasingly less
    effective to hold states - leakage kicks in even if no processing
    is performed. I am sure that a superconducting CPU would have its
    share of problems; say, it might dissipate nothing by itself but
    require huge electrical currents to change logical states,
    and therefore huge energy would be dissipated in the power supplies.

    The other question is whether the processing in the P4 is efficient
    or not in terms of how many gate transitions it takes to perform
    some logical operation. Given its highly pipelined design,
    it can be said that other architectures can do better. For example,
    if you define "efficiency" by the number of logical operations per
    unit of time (which is measured by ordinary benchmarks, BTW), then
    the Pentium-M (Dothan) looks about 3x more efficient than the P4 on some
    classes of workloads.

    - aap
     
    alexi, Feb 2, 2005
    #9
  10. I guess it just depends upon terms. How is it fair to describe power
    dissipated resistively as heat as "used to switch states"? And the amount of
    energy required to switch states is part of the efficiency of the design --
    a more efficient design would require less energy to switch states.

    The question is, what's the minimum amount of energy one could use to
    switch states? What's the minimum number of states that need to be switched?
    How does a given implementation compare to this perfect ideal?

    This is the same way the efficiency of other devices is measured.
    The issue is, what's the theoretical limit? How close does current
    technology come to that? That's how efficient it is, thermodynamically
    speaking.

    DS
     
    David Schwartz, Feb 2, 2005
    #10
  11. It **ALL** becomes heat. There's no other place for the energy to go.
    Unlike a motor or light source, there's no energy output, except the
    drive of the pins to the chipset, which is balanced by the energy added
    by the chipset driving the pins of the CPU.

    Energy goes in as watts and comes out as pin drive (milliwatts) and
    calories/sec of heat.
     
    Bill Davidsen, Feb 3, 2005
    #11
  12. So you're saying it is theoretically possible to perform computations
    without consuming any energy at all?

    DS
     
    David Schwartz, Feb 3, 2005
    #12
  13. Tim

    Brad Houser Guest

    If it is possible, which I doubt, that is not what Bill is saying. The point
    is that, unlike a device that transforms energy into different useful forms
    (e.g., electricity to light or electricity to kinetic energy), chips convert
    electricity into heat and computations. The computations are not a form of
    energy any more than these words are.

    Brad Houser
    <not speaking for Intel>
     
    Brad Houser, Feb 4, 2005
    #13
  14. The point is, the processor decreases entropy, and it always takes
    energy to do this. So it is meaningful to talk about the efficiency of a
    processor (how little energy does it use to reduce the entropy) rather than
    just saying the efficiency is 0.

    DS
     
    David Schwartz, Feb 5, 2005
    #14
  15. Tim

    alexi Guest

    I am not quite sure about all this. First, processors deal with
    "informational entropy", which is no more than a similar thermodynamic
    abstraction applied to the statistics of information. Second, processors do
    not create or destroy information, they process it, so the change in
    entropy must be about zero. Third, the corresponding "informational
    energy" can't create physical work and therefore does not interfere
    with electron scattering on phonons in crystalline lattices.

    From a thermodynamical point of view, a processor is an interconnected
    fabric of bi-stable potential wells, and is very far from thermal
    equilibrium. Having a bi-stable well is a necessary requirement to
    hold bits of information, and the height of the potential barrier is
    an important parameter for the reliability of computation. To change
    states and perform computations, you need to deform the electric field that
    forms the barrier in such a way that an electron avalanche occurs. The
    avalanche always consumes energy. To reduce the required energy, one needs
    to reduce the "effective mass" of the substance that carries the
    information, and/or reduce the barrier between states. In both cases the
    reliability of logical computations might be compromised, unless the whole
    processor is chilled to near absolute zero. Actually, I never
    thought about processors in the above highly generalized terms,
    so the above thoughts are very unpolished and might contain a few
    inaccuracies.

    - aap
     
    alexi, Feb 6, 2005
    #15
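
    The trade-off sketched above (a lower barrier means less switching energy
    but a worse error rate) is usually captured by the Boltzmann factor: the
    likelihood of a spontaneous, thermally driven flip over a barrier of
    height Eb scales roughly as exp(-Eb/kT). A minimal sketch; the barrier
    heights are arbitrary illustrative values.

        # Thermally activated bit flips: relative likelihood ~ exp(-Eb / kT)
        import math

        for barrier_kt in (10, 20, 40, 60):   # barrier height in units of kT (assumed)
            p = math.exp(-barrier_kt)
            print(f"barrier = {barrier_kt} kT -> relative flip likelihood ~ {p:.1e}")
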
  16. Tim

    amuskratt Guest

    I couldn't help but jump in on this subject: information has neither
    mass nor energy, so of course ALL the energy drawn in will have to be
    dissipated somehow. The only PHYSICAL work being done will be spinning
    the disks of a hard drive and/or a CD-ROM/DVD. The rest will come out
    as heat.

    The question REALLY is: Which cpu architecture, past or present, is
    most energy-efficient in terms of using the least amount of energy per
    cpu instruction?

    To get a handle on this, we're going to need a new benchmark.

    Now a "Watt" is a metric unit for work done per unit time which is
    equal to "Joule per second" where "Joule" is the metric unit for work.
    No radical concepts here, we're talking college freshman physics here.

    Now, one of the ways, way back when, to distinguish the relative
    performance of different CPUs was to look at how many MIPS (millions of
    instructions per second) each is capable of doing. I say "way back
    when" because, from what little attention I paid to this then, I
    believe there was the issue of CISC vs. RISC and such that muddied the
    waters for getting an accurate measure of TRUE computational
    performance. I'm not up-to-date on current measurement standards.

    Now, for a naive measure, by dividing MIPS by watts, the units of time
    would cancel out and what would we get? Millions of instructions per
    joule? The "mipj" or "mij"?

    Maybe we should look at floating-point performance? The "megaflopj"?
    Reads almost like a word that you might expect to find in an
    Eastern European language.

    This is starting to read like it may be some sort of joke but there is
    a serious hole here in our being able to understand the relative
    advantage of different cpu architectures.
     
    amuskratt, Feb 6, 2005
    #16
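
    The unit bookkeeping proposed above is easy to check in a few lines:
    MIPS divided by watts is (10^6 instructions / s) / (J / s), i.e. millions
    of instructions per joule. A minimal sketch; the MIPS and wattage figures
    are placeholder assumptions, not benchmark results.

        # Instructions per joule = (instructions per second) / (joules per second)
        # The MIPS and watt values below are placeholders, not measurements.
        cpus = {
            "CPU A": {"mips": 9000.0, "watts": 90.0},
            "CPU B": {"mips": 7000.0, "watts": 27.0},
        }
        for name, d in cpus.items():
            mipj = d["mips"] / d["watts"]   # millions of instructions per joule
            print(f"{name}: ~{mipj:.0f} million instructions per joule")
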
  17. That is not the point. Even a net change in entropy of zero has a minimum
    energy cost if it passes through intermediate states of unequal entropy.

    For example, if I have a room with 100 ping pong balls evenly
    distributed in it, then I put them all in a pile, then redistribute them
    evenly, the net entropy change is zero. However, because I passed through a
    state of lower entropy, the transition cannot be accomplished with zero
    energy.

    Processors, similarly, must pass through intermediate states in their
    calculations. In fact, the process of "calculating" is itself these
    intermediate states.

    I'm not sure why you think that's relevant. Even a one or zero has a
    minimal physical implementation (perhaps the presence or absence of a
    photon, perhaps something else, but there must be some minimal
    implementation). If you argue that this is zero, you get immediate
    paradoxes. For example, if no energy is required to transition a gate from
    open to closed, you immediately get that entropy can be decreased
    (Maxwell's Demon).

    This is how current processors are implemented. The question is how
    efficient current processors are. To determine this, we have to compare
    current processors to theoretically ideal processors, not theoretically
    ideal processors that use similar technology. So the question comes down to
    the minimum energy necessary to perform such calculations, which is not
    zero.

    DS
     
    David Schwartz, Feb 6, 2005
    #17
  18. Tim

    alexi Guest

    What is this supposed to mean?

    So, you are changing the point. First you argued that processors
    "decrease entropy". Now you are arguing that processors move their
    state ordering up and down, and that's why we need energy. Which
    statement of yours was correct, given that you are so sure about your
    entropy approach?

    There is no paradox. The "ones" and "zeros" are mathematical, logical
    abstractions. They do not "require" any energy to deal with them.
    The physical implementation of bits is a different issue. You are mixing
    two different abstract entropies linked by an unspecified "implementation".
    I asked you in the other post how you arrived at the 1% that goes to
    "pure computation"; you dodged the question. To add meaning to your
    high-entropy monikers, you need to compare the configuration energy of
    ordered bits in modern computers (only about 10^10 bits) with all possible
    configurations of all other atomic states in a processor. Even then you
    will have a lot of conceptual difficulties, since entropy concepts are
    not quite applicable to open systems which are far from equilibrium;
    a processor is not a collection of randomly colliding Boltzmann
    molecules.
    I already addressed this question. The "theoretically ideal" processor
    does not require any physical energy. However, any _implementation_
    of a digital processor will require energy to operate. Now it is
    up to you how you design the implementation, given the thermal
    bath around your structures and the requirement to maintain a digital
    representation of the information. Your "minimal implementation" will
    heavily depend on those factors.

    I am sure there must be some academic or philosophical studies on
    this purely scholastic subject. Do you have any references?

    - aap
     
    alexi, Feb 7, 2005
    #18
  19. This is your key argument and it is one I disagree with. If your
    argument were correct, we would immediately be led to a contradiction in
    that conservation of energy would be refuted. Research any good article on
    Maxwell's Demon. It must take at least some finite amount of energy to
    switch a state from a one to a zero in any conceivable physical
    implementation. (If memory serves me, 2.4 * 10^-6 ergs.)

    DS
     
    David Schwartz, Feb 7, 2005
    #19
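
    The usual reference point for that minimum is Landauer's bound: erasing
    one bit costs at least kT*ln(2). At room temperature that works out to
    roughly 3e-21 joules per bit. A minimal sketch of the arithmetic:

        # Landauer bound: minimum energy to erase one bit is k * T * ln(2)
        import math

        k = 1.380649e-23      # Boltzmann constant, J/K
        T = 300.0             # room temperature, kelvin
        e_bit = k * T * math.log(2)
        print(f"Landauer limit at {T:.0f} K: {e_bit:.2e} J per bit erased")
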
  20. By the way, a friend of mine who read this post emailed me to say that
    Bruce Schneier had an article or book in which he went through this
    calculation to show that there wasn't enough energy in the universe to break
    a particular encryption algorithm on any future processor. So it has
    real-world applications. ;)

    DS
     
    David Schwartz, Feb 7, 2005
    #20
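
    A sketch of the kind of calculation being referred to (it appears in
    Schneier's Applied Cryptography): multiply the Landauer bound by the
    number of states a brute-force counter would have to step through. The
    3.2 K heat-sink temperature and the figure for the Sun's annual energy
    output are order-of-magnitude assumptions.

        # Energy just to cycle a counter through all 2**256 states at the Landauer limit
        import math

        k = 1.380649e-23                  # Boltzmann constant, J/K
        T = 3.2                           # assume the cosmic background as the heat sink, K
        e_bit = k * T * math.log(2)       # minimum energy per irreversible bit operation
        total = e_bit * 2 ** 256          # ignores the work of actually testing each key
        sun_per_year = 3.8e26 * 3.15e7    # solar luminosity (W) * seconds per year, ~1.2e34 J
        print(f"counter alone: {total:.1e} J, vs {sun_per_year:.1e} J from the Sun per year")
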