
OT: It appears that there are fewer apps that can benefit from more speed.

Discussion in 'Overclocking' started by Ken Maltby, Feb 21, 2005.

  1. Ken Maltby

    Ken Maltby Guest

    How many applications function as well as they can, and are using
    only a fraction of the available system speed? I mean can we
    expect any noticeable improvement in the performance of, say a
    wordprocessor, or Quicken? I can only type so fast and I don't
    want things to pop-up any faster in Quicken.

    Once we can do real-time video encoding with ease, what do
    we need more speed for? I know this is where all the Geek
    equivalents of "Tim Taylor" ( More Power, Arr, Arrr, Arg!)
    hang out, but maybe there is more to be gained from other
    factors like multi-tasking/distributed processing techniques
    than just upping the CPU speed.

    Don't know if this is Trolling, or just raising a point for
    discussion.

    Luck;
    Ken
     
    Ken Maltby, Feb 21, 2005
    #1

  2. Larry Gagnon

    Larry Gagnon Guest

    Exactly! Let us be honest here - other than hard-core gaming and very
    intensive scientific modelling today's home computers are VERY fast for
    the USUAL tasks assigned to them. I would hazard a guess that 90% of home
    computer users are probably using only 5-10% of their CPU most of the
    time.

    There are also many other ways we can increase our computer speed other
    than spending money on upgrades:

    1) find and use less intensive software apps. As we all know 95% of users
    use only 10% of newer software's capabilities. This suggests we consider
    using software that does the job, rather than cluttering up disk space and
    hogging memory. For example use WordPad or AbiWord for WP rather than MS
    Word. And AVG rather than Norton for anti-virus; there are many other
    examples.

    2) learn to use keyboard shortcuts. Ctrl-S is a lot faster than taking your
    hand off the keyboard, moving mouse pointer to File, Save, click, move
    hand back onto keyboard. Sheesshhhh!

    3) start using Linux instead of M$ Windoze!

    4) ditch MS Internet Explorer and MS Outlook in favour of Firefox and
    Thunderbird in order to reduce spyware and virus infiltration.

    So yes, I agree with the gist of your post - there are many on this
    newsgroup obsessed with more computing horsepower, which may in fact, be
    unnecessary.

    Larry Gagnon, A+ certified tech.
     
    Larry Gagnon, Feb 21, 2005
    #2

  3. My machine here (Dual MP2600) is massively overpowered for when I'm not
    rendering or running scientific stuff. For example, typing this message here
    is giving me a CPU usage of around 1%. Spell-checking in Word '97 is near
    instantaneous, and flicking between Word and OE to do spell-checking
    (spell-check in OE is broken, been broken for the past year and a bit, can't
    be bothered to fix it :) ) gets it close to 3%. In fact, the only time it
    gets above 5% is when I do stuff with Delphi or Visual Studio (which isn't
    that much of a "home" activity for most people). Even IE struggles to get
    above 5%, though I have got JavaScript/VBScript, ActiveX, Java and images
    disabled. With the extra bloat of newer versions of Office, I'm sure it'd
    break the 5% mark, but this is why I still use '97 ...

    Incidentally, the most CPU-intensive applications are P2P applications
    (local network, often ~4MB/sec), which can get it close to 30%. I'm guessing
    it's just terrible networking code, since I've written networking code that
    will happily do twice the bandwidth using about a tenth of the CPU usage.
    The other resource-sucker is personal firewalls on Windows. Tunnelling X
    through PuTTY on a Duron 1600 @ 2GHz will bring the machine to its knees
    with all personal firewalls I've tried. Without the firewall, CPU usage sits
    at between 20% and 30%, but adding the firewall pegs it at 100% and the
    session is almost unusable (usability depending on the firewall installed,
    old versions of TPF do the best). I don't have a personal firewall on my
    duallie, as I run everything through a mobilised Duron 1600 Pebble Linux box
    (which will happily NAT at 10MB/sec at 400MHz with CPU power to spare ...).
    Guess that's what you get for moving the firewall to kernel mode :)
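    As a hedged illustration of the low-CPU networking claim above (this is
    my own sketch, not the poster's actual code): one common reason for
    "terrible networking code" is per-syscall overhead, so moving the same
    bytes in fewer, larger chunks cuts CPU use. The buffer size, chunk count,
    and use of a local socketpair here are all illustrative assumptions.

    ```c
    /* Sketch: move 1MB over a local socketpair in 16KB chunks.  The
       fewer read()/write() calls per byte, the less CPU per MB/sec. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main(void)
    {
        int sv[2];
        if (socketpair(AF_UNIX, SOCK_STREAM, 0, sv) != 0) {
            perror("socketpair");
            return 1;
        }

        static char buf[16 * 1024];          /* one 16KB chunk            */
        memset(buf, 'x', sizeof buf);

        long total = 0;
        for (int i = 0; i < 64; i++) {       /* 64 x 16KB = 1MB total     */
            size_t off = 0;
            while (off < sizeof buf) {       /* handle partial writes     */
                ssize_t n = write(sv[0], buf + off, sizeof buf - off);
                if (n < 0) { perror("write"); return 1; }
                off += (size_t)n;
            }
            size_t got = 0;
            while (got < sizeof buf) {       /* drain before next chunk   */
                ssize_t n = read(sv[1], buf + got, sizeof buf - got);
                if (n <= 0) { perror("read"); return 1; }
                got += (size_t)n;
            }
            total += (long)got;
        }
        printf("moved %ld bytes\n", total);
        close(sv[0]);
        close(sv[1]);
        return 0;
    }
    ```

    Shrink the chunk to a few hundred bytes and the byte count stays the
    same while the syscall count (and CPU time) goes way up.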

    Of course, when I AM running something CPU intensive, it can never go fast
    enough :)

    The other PC that I use a lot, a P3 700 running Linux, definitely feels
    sluggish (slight delays dragging windows and selecting text, etc). I'm pretty
    sure this is due to {a} it having a fairly ancient graphics card (Permedia 2
    based) that's either just sluggish or poorly supported and {b}having only
    256MB of RAM as opposed to the CPU, but it's definitely one machine that
    could do with a bit more oomph.

    Apart from the aforementioned problems with X tunnelling, I survived for
    two months on the Duron 1600 machine. Sure, there were times (excluding the
    rendering/scientific stuff) when I noticed that it was slower than my
    duallie (querying the MSDN library through MSVC in particular) but it wasn't
    anything that would make me want to go out and buy a new motherboard/CPU
    for. I think the usually accepted point where most "current" applications
    will run happily is around the 1GHz mark, as long as you have sufficient
    RAM.


    Yes and no. Some algorithms cannot be parallelised at all, some can only be
    done with a lot of effort. For example, you can parallelise physics
    simulations (eg: for games), but the amount of effort required is
    substantial. This is why it hasn't been done yet, and I'd guess won't be
    done for another few years. In all cases, multithreading an application
    requires a fair bit of careful planning and coding. I personally love
    multithreaded programming (especially when combined with asynchronous I/O -
    most other programmers I have met would run away screaming when faced with
    this combination), but it's a lot harder to debug. The bigger problem is
    that most developers don't have dual CPU machines (hyperthreading doesn't
    count). This means that you cannot fully test your multithreaded
    application, so most developers don't use multithreading for this reason.
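    To illustrate the careful-planning point above (a hedged sketch of my
    own, not anything from the thread): two threads bumping a shared counter.
    With the mutex the result is deterministic; remove it and the final count
    becomes nondeterministic on a real dual-CPU box like the duallie above,
    which is exactly the sort of bug a single-CPU dev machine rarely exposes.

    ```c
    /* Two threads incrementing a shared counter under a mutex. */
    #include <pthread.h>
    #include <stdio.h>

    #define BUMPS 100000

    static long counter = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    static void *worker(void *arg)
    {
        (void)arg;
        for (int i = 0; i < BUMPS; i++) {
            pthread_mutex_lock(&lock);   /* drop this lock/unlock pair  */
            counter++;                   /* and increments can be lost  */
            pthread_mutex_unlock(&lock); /* when both CPUs race here    */
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, worker, NULL);
        pthread_create(&t2, NULL, worker, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("counter = %ld\n", counter);
        return 0;
    }
    ```

    On a single CPU the unlocked version still comes out right most runs,
    which is why testing on one CPU proves so little.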

    [...]
     
    Michael Brown, Feb 21, 2005
    #3
  4. Ed Medlin

    Ed Medlin Guest

    It is a good point. Most of the folks here in acho have been around a long
    time and this is nothing more than a hobby for the majority. We do have the
    OC'ing fanatic once in a while. I have been subscribing to this group for
    over 10yrs, I don't even recall exactly. Back then, we could actually make a
    world of difference in performance with a little tweaking here and there. We
    know quite well that today's increases are not as drastic, but why waste the
    extra MHz if it is there to use, as long as you still have a stable system?
    So, all things considered, we just have a hobby of tweaking around, getting
    the most we can out of a given system and nothing more than that.

    Ed Medlin
     
    Ed Medlin, Feb 21, 2005
    #4
  5. Coup

    Coup Guest

    Actually we'll probably need machines roughly 3x the power of the
    high end of current desktop models for speech recognition to finally
    start becoming practical so we can finally start to end our primary
    reliance on the roughly 200 year old keyboard interface...
     
    Coup, Feb 22, 2005
    #5
  6. Ken Maltby

    Ken Maltby Guest

    I would think you could approach that with a dedicated (RISP?)
    RISC appliance. That would plug in where the keyboard does now.
    Why tie up the CPU for a set support process? Or it might just
    mean a Honking Big APU on your sound card.

    Luck;
    Ken
     
    Ken Maltby, Feb 22, 2005
    #6
  7. Coup wrote:
    [...]
    Keyboards will probably be a major form of input for a very long time to
    come in some areas. For example, it's horrible to program using voice
    recognition, regardless of the quality. Try saying

    if (sscanf(ssn->recvbuf, "%d %d %[^\n]", &type, &success, ssn->msgbuf) != 3)
    {
        return ERROR_PROTOCOL;
    }

    with correct capitalisation and indentation ... Not to mention some of the
    far more twisted expressions that are possible in C, let alone something
    like PERL. Admittedly, augmenting games with voice recognition could be
    quite entertaining, especially for spectators :)
     
    Michael Brown, Feb 22, 2005
    #7
  8. Zombie Wolf

    Zombie Wolf Guest

    Well, of course, we are going to need a LOT more speed when we get into
    virtual reality type games / training programs. And, this is the driving
    force behind it all... After all, a good virtual reality game, that was like
    being inside a movie, would make literally billions, not to mention the
    other applications for it. They are looking ahead to the big bucks to be
    gained from this...
     
    Zombie Wolf, Feb 23, 2005
    #8
  9. Ken Maltby

    Ken Maltby Guest

    Wouldn't this also be a case where almost all of the processing
    that needs to be faster is in the purview of the dedicated
    video processing? The overall system throughput may be
    at issue, but there are other factors that can provide more such
    improvement than upping the CPU speed.

    Don't get me wrong here, I have the same "Need for Speed"
    as the next guy, but I'm beginning to wonder if we might be
    better off if we separated out some more processing from the
    CPU. Some of the low demand processes could be handled
    like the old separate dedicated "Arithmetical Subprocessor".
    ( Talk about giving away my age.) Or like today's integrated
    APU (Audio Processing Unit).

    These would be largely Hardware and/or Firmware
    implementations of both some of what is currently handled by
    the OS and of some sort of standardized treatment of the
    lower level interface within applications. Applications
    would be able to use a higher level interface with these
    dedicated subprocessors. They would also be able to do
    their own unique processing on the results returned from
    the subprocessor using a minimal CPU involvement.

    Just an idea;
    Ken
     
    Ken Maltby, Feb 23, 2005
    #9
