
Ultimate in over-the-top cell speculation. Intel manufactures Cell. Microsoft withers.

Discussion in 'Intel' started by Robert Myers, Mar 25, 2005.

  1. Robert Myers

    Robert Myers Guest

    Greetings!

    http://hardware.itmanagersjournal.com/hardware/05/03/03/0226235.shtml?tid=78

    The article quotes at length one Jim Trounson, who is part of a group
    that is developing a PCI-X card for Cell, or so they say.

    Best science fiction of 2005 already awarded?

    <quote>


    Cell Industries predicts that Intel will be building Cell with
    cooperation from IBM within a year.

    Cell, software, and Microsoft's demise

    For the anticipated finale, and the end of Microsoft dominance as we
    know it, Trounson forecast that IBM will not give Microsoft hardware
    to work with, and will cash in on its support for open source and
    Linux.

    <snip>

    Cell Industries forecasts that as Intel begins producing Cell chips,
    Microsoft will try to port its operating system to the new processor.
    However, Linux will have a significant head start and Microsoft will
    in turn "fall apart."

    "When hardware is commercially available, Windows will take two to
    three years to get the first version going," Trounson said. "IBM
    already has Linux running on the Cell [at that point]."

    Adding that Cell chips will be in short supply for years, Trounson
    acknowledged that the prediction represents the unprecedented.

    "The world has never seen a step change in technology like what is
    about to occur," Trounson said.

    </quote>

    ....and then I woke up.

    RM
     

  2. Yousuf Khan

    Yousuf Khan Guest

    Re: Ultimate in over-the-top cell speculation. Intel manufactures Cell. Microsoft withers.

    Robert Myers wrote:
    > Greetings!
    >
    > http://hardware.itmanagersjournal.com/hardware/05/03/03/0226235.shtml?tid=78
    >
    > The article quotes at length one Jim Trounson, who is part of a group
    > that is developing a PCI-X card for Cell, or so they say.


    Crackpots can come from all industries. :)

    > Cell Industries predicts that Intel will be building Cell with
    > cooperation from IBM within a year.


    He would've been more believable if he said AMD is going to start
    building Cell, since after all AMD and IBM have been synchronizing their
    process technologies recently. So has Chartered.

    > Cell Industries forecasts that as Intel begins producing Cell chips,
    > Microsoft will try to port its operating system to the new processor.
    > However, Linux will have a significant head start and Microsoft will
    > in turn "fall apart."


    Sort of like how Microsoft fell apart after falling two years behind
    Linux in the x86-64 arena, I guess?

    > Adding that Cell chips will be in short supply for years, Trounson
    > acknowledged that the prediction represents the unprecedented.


    I see he's already got his fallback in case his predictions inevitably
    don't come true: Cell chips will be in short supply; that's why it didn't
    take off.

    > "The world has never seen a step change in technology like what is
    > about to occur," Trounson said.


    Not since, ... oh Itanium, and then later Transmeta.

    Yousuf Khan
     

  3. On Fri, 25 Mar 2005 10:28:46 -0500, Robert Myers <>
    wrote:

    >Greetings!
    >
    >http://hardware.itmanagersjournal.com/hardware/05/03/03/0226235.shtml?tid=78
    >
    >The article quotes at length one Jim Trounson, who is part of a group
    >that is developing a PCI-X card for Cell, or so they say.


    Uhh, does he mean a PCI-E card? Why the hell would anybody be interested
    in a PCI-X card for a future system? It would be err, good to get that bit
    right before proceeding further.

    >Best science fiction of 2005 already awarded?
    >
    ><quote>
    >
    >
    >Cell Industries predicts that Intel will be building Cell with
    >cooperation from IBM within a year.


    .... and pigs will fly! I gotta see this one.

    >Cell, software, and Microsoft's demise
    >
    >For the anticipated finale, and the end of Microsoft dominance as we
    >know it, Trounson forecast that IBM will not give Microsoft hardware
    >to work with, and will cash in on its support for open source and
    >Linux.


    B-b-b-but his *own* model is founded on open hardware specs. How could
    anybody stop M$ from getting their hands on it?

    >Cell Industries forecasts that as Intel begins producing Cell chips,
    >Microsoft will try to port its operating system to the new processor.
    >However, Linux will have a significant head start and Microsoft will
    >in turn "fall apart."
    >
    >"When hardware is commercially available, Windows will take two to
    >three years to get the first version going," Trounson said. "IBM
    >already has Linux running on the Cell [at that point]."
    >
    >Adding that Cell chips will be in short supply for years, Trounson
    >acknowledged that the prediction represents the unprecedented.
    >
    >"The world has never seen a step change in technology like what is
    >about to occur," Trounson said.
    >
    ></quote>


    One "little" flaw I see - there is talk of:

    Who is going to pay for the hardware and software for development? IBM has
    not been good at giving anything away, even to developers and certainly not
    speculatively. That was the main reason for the failure of OS/2. I've
    also mentioned in the past that we, and others, coughed up $$ for Risc/6K
    and Alpha... all for nothing... money down the drain - we won't do that
    again. OTOH I have to confess I do not understand the open source business
    "model".<shrug>

    Off-hand, other things: 1) I don't see the XDR memory sub-system being
    amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
    CPU is kinda slim... without reworking the memory interface to get to 8GB
    per CPU; 2) 32-bit FPU is not going to fly as a general purpose computer.

    >...and then I woke up.


    I hope this guy has a spare grungy garage for his efforts - seems like that
    is part of the template for success he is aiming to emulate... cf. Dell,
    Apple, et al. :)

    --
    Rgds, George Macdonald
     
  4. Robert Myers

    Robert Myers Guest

    On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
    <fammacd=!SPAM^> wrote:

    >On Fri, 25 Mar 2005 10:28:46 -0500, Robert Myers <>
    >wrote:
    >
    >>Greetings!
    >>
    >>http://hardware.itmanagersjournal.com/hardware/05/03/03/0226235.shtml?tid=78
    >>
    >>The article quotes at length one Jim Trounson, who is part of a group
    >>that is developing a PCI-X card for Cell, or so they say.

    >
    >Uhh, does he mean a PCI-E card? Why the hell would anybody be interested
    >in a PCI-X card for a future system? It would be err, good to get that bit
    >right before proceeding further.
    >

    Especially since such a card is almost certainly going to be
    I/O-bound.

    >>Best science fiction of 2005 already awarded?
    >>
    >><quote>
    >>
    >>
    >>Cell Industries predicts that Intel will be building Cell with
    >>cooperation from IBM within a year.

    >
    >... and pigs will fly! I gotta see this one.
    >
    >>Cell, software, and Microsoft's demise
    >>
    >>For the anticipated finale, and the end of Microsoft dominance as we
    >>know it, Trounson forecast that IBM will not give Microsoft hardware
    >>to work with, and will cash in on its support for open source and
    >>Linux.

    >
    >B-b-b-but his *own* model is founded on open hardware specs. How could
    >anybody stop M$ from getting their hands on it?
    >

    I guess he's assuming that M$ can't go buy a PS3 for some reason.

    >>Cell Industries forecasts that as Intel begins producing Cell chips,
    >>Microsoft will try to port its operating system to the new processor.
    >>However, Linux will have a significant head start and Microsoft will
    >>in turn "fall apart."
    >>
    >>"When hardware is commercially available, Windows will take two to
    >>three years to get the first version going," Trounson said. "IBM
    >>already has Linux running on the Cell [at that point]."
    >>
    >>Adding that Cell chips will be in short supply for years, Trounson
    >>acknowledged that the prediction represents the unprecedented.
    >>
    >>"The world has never seen a step change in technology like what is
    >>about to occur," Trounson said.
    >>
    >></quote>

    >
    >One "little" flaw I see - there is talk of:
    >
    >
    >
    >Who is going to pay for the hardware and software for development? IBM has
    >not been good at giving anything away, even to developers and certainly not
    >speculatively. That was the main reason for the failure of OS/2. I've
    >also mentioned in the past that we, and others, coughed up $$ for Risc/6K
    >and Alpha... all for nothing... money down the drain - we won't do that
    >again. OTOH I have to confess I do not understand the open source business
    >"model".<shrug>
    >

    Umm, I guess you don't. :).

    That's why SCO is taking aim at IBM. Without IBM pumping its own
    serious money into Linux, Linux would be nowhere near where it is now,
    and IBM _is_ giving stuff away. In return, it has a nice growing
    Linux server business (and a pesky lawsuit, to be sure).

    I don't see anything wrong with the idea of IBM funding relevant
    development, but I think it very unlikely that IBM will go after
    anything that would wind up in a PC...unless, of course, IBM had
    something _really_ devious in mind in selling off its PC business.

    >Off-hand, other things: 1) I don't see the XDR memory sub-system being
    >amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
    >CPU is kinda slim... without reworking the memory interface to get to 8GB
    >per CPU;


    Have you looked at the I/O bandwidth?

    http://research.scea.com/research/html/CellGDC05/07.html

    Four cell processors=2GB. Probably no more NUMA than Opteron.

    >2) 32-bit FPU is not going to fly as a general purpose computer.
    >

    SPE's can do IEEE-compliant double precision. Just ten times more
    slowly.

    >>...and then I woke up.

    >
    >I hope this guy has a spare grungy garage for his efforts - seems like that
    >is part of the template for success he is aiming to emulate... cf. Dell,
    >Apple, et al. :)


    I don't think Trounson is _necessarily_ wrong about how important Cell
    might be, but that clunker about PCI-X is hard to get past, never mind
    the wild speculation about Intel. Maybe he just had too much coffee
    and too little sleep and never figured anyone would be so desperate as
    to write a web article off his email.

    RM
     
  5. On Fri, 25 Mar 2005 19:52:59 -0500, Robert Myers <>
    wrote:

    >On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
    ><fammacd=!SPAM^> wrote:


    >>B-b-b-but his *own* model is founded on open hardware specs. How could
    >>anybody stop M$ from getting their hands on it?
    >>

    >I guess he's assuming that M$ can't go buy a PS3 for some reason.


    You mean like they obviously couldn't go and buy Apple systems to practice
    on for XBox 2?:)

    >>One "little" flaw I see - there is talk of:
    >>
    >>
    >>
    >>Who is going to pay for the hardware and software for development? IBM has
    >>not been good at giving anything away, even to developers and certainly not
    >>speculatively. That was the main reason for the failure of OS/2. I've
    >>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
    >>and Alpha... all for nothing... money down the drain - we won't do that
    >>again. OTOH I have to confess I do not understand the open source business
    >>"model".<shrug>
    >>

    >Umm, I guess you don't. :).


    No, I just don't see how programmers are supposed to pay the rent, unless
    maybe they've been anointed by one of the self-appointed OS-gurus.

    >That's why SCO is taking aim at IBM. Without IBM pumping its own
    >serious money into Linux, Linux would be nowhere near where it is now,
    >and IBM _is_ giving stuff away. In return, it has a nice growing
    >Linux server business (and a pesky lawsuit, to be sure).
    >
    >I don't see anything wrong with the idea of IBM funding relevant
    >development, but I think it very unlikely that IBM will go after
    >anything that would wind up in a PC...unless, of course, IBM had
    >something _really_ devious in mind in selling off its PC business.


    Giving stuff away and giving it to the right people are two different
    scenarios. If you've ever been on the good end of an IBM give-away, you'll
    know that it is not a comfortable position. As for the PC, it is not going
    away any time soon, so there'd better be some vision of how Cell fits into
    that slot... Apple's second chance??:)

    >>Off-hand, other things: 1) I don't see the XDR memory sub-system being
    >>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
    >>CPU is kinda slim... without reworking the memory interface to get to 8GB
    >>per CPU;

    >
    >Have you looked at the I/O bandwidth?
    >
    >http://research.scea.com/research/html/CellGDC05/07.html
    >
    >Four cell processors=2GB. Probably no more NUMA than Opteron.


    Well it would seem that the inter-CPU communications/coherency is less well
    defined for the moment and there's a *big* difference between the current
    256MB/CPU of Cell and Opteron's 16GB/CPU.

    >>2) 32-bit FPU is not going to fly as a general purpose computer.
    >>

    >SPE's can do IEEE-compliant double precision. Just ten times more
    >slowly.
    >
    >>>...and then I woke up.

    >>
    >>I hope this guy has a spare grungy garage for his efforts - seems like that
    >>is part of the template for success he is aiming to emulate... cf. Dell,
    >>Apple, et al. :)

    >
    >I don't think Trounson is _necessarily_ wrong about how important Cell
    >might be, but that clunker about PCI-X is hard to get past, never mind
    >the wild speculation about Intel. Maybe he just had too much coffee
    >and too little sleep and never figured anyone would be so desperate as
    >to write a web article off his email.


    It's hard to fathom what *might* be sitting in a lab right now or what NDAs
    might be in place, but as it stands, it appears that there's a lot of work
    to do to bring it into use in a general purpose computer.

    --
    Rgds, George Macdonald
     
  6. Robert Myers

    Robert Myers Guest

    On Sat, 26 Mar 2005 05:15:27 -0500, George Macdonald
    <fammacd=!SPAM^> wrote:

    >On Fri, 25 Mar 2005 19:52:59 -0500, Robert Myers <>
    >wrote:
    >
    >>On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
    >><fammacd=!SPAM^> wrote:

    >


    <snip>

    >
    >>>One "little" flaw I see - there is talk of:
    >>>
    >>>
    >>>
    >>>Who is going to pay for the hardware and software for development? IBM has
    >>>not been good at giving anything away, even to developers and certainly not
    >>>speculatively. That was the main reason for the failure of OS/2. I've
    >>>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
    >>>and Alpha... all for nothing... money down the drain - we won't do that
    >>>again. OTOH I have to confess I do not understand the open source business
    >>>"model".<shrug>
    >>>

    >>Umm, I guess you don't. :).

    >
    >No, I just don't see how programmers are supposed to pay the rent, unless
    >maybe they've been anointed by one of the self-appointed OS-gurus.
    >

    This is a big subject, and I won't insult you by taking a weak flyer
    at it. The google

    economics "open source"

    would be a good start.

    >>That's why SCO is taking aim at IBM. Without IBM pumping its own
    >>serious money into Linux, Linux would be nowhere near where it is now,
    >>and IBM _is_ giving stuff away. In return, it has a nice growing
    >>Linux server business (and a pesky lawsuit, to be sure).
    >>
    >>I don't see anything wrong with the idea of IBM funding relevant
    >>development, but I think it very unlikely that IBM will go after
    >>anything that would wind up in a PC...unless, of course, IBM had
    >>something _really_ devious in mind in selling off its PC business.

    >
    >Giving stuff away and giving it to the right people are two different
    >scenarios. If you've ever been on the good end of an IBM give-away, you'll
    >know that it is not a comfortable position. As for the PC, it is not going
    >away any time soon, so there'd better be some vision of how Cell fits into
    >that slot... Apple's second chance??:)
    >

    The more I look at Cell, the more I am convinced I don't understand
    how it will be used. Or rather, I can imagine ways in which it can be
    used, but I'm not sure those are the only ways. The more I look at
    the architecture, the more I like it, and I see lots of possibilities.

    It's easiest to imagine the SPE's processing a bunch of content or
    doing number crunching as a stream processor, but I can also imagine
    using all those SPE's to overcome the natural limitations of the
    in-order PPC: Spin off a task speculatively (or on less than perfect
    information), execute in local memory, and commit only when whatever
    predicate conditions are satisfied (or throw the result away).
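
    Something like this toy sketch, say (plain C; spe_spin_off and the
    predicate are stand-ins of my own, not any real Cell API):

        /* Speculate, then commit or discard. No actual SPE runtime here;
           spe_spin_off() stands in for work done in an SPE's local store
           while the PPC core runs ahead. */
        #include <stdio.h>

        static int speculative_result;

        static void spe_spin_off(int guess_input)
        {
            speculative_result = guess_input * 2;   /* the speculated work */
        }

        int main(void)
        {
            int input = 21;
            spe_spin_off(input);               /* start before it's known safe */
            int predicate_ok = (input == 21);  /* predicate resolved later     */
            if (predicate_ok)
                printf("commit: %d\n", speculative_result);  /* keep it */
            else
                printf("discard\n");                         /* toss it */
            return 0;
        }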

    The SPE's can also be isolated (I think) from the world of everyday
    interrupts, and I think that might offer some serious advantages for
    the processor.

    But the question, of course, is not, are there interesting things one
    might try, but will any of those things actually be made to work and
    what do you get as a payoff. It seems reasonably certain you could
    make Cell function as a PC processor if you wanted to. The question
    is: why would you want to?

    David Wang is worried about the software model. That doesn't worry me
    so much. What does worry me is that Sony is in such turmoil and has
    never been able to make the "profit is in the content" model really
    pay off for them (and, as far as I can tell, only Apple, in a field
    of many entrants, has succeeded at that game). A weakened and
    distracted Sony with a sagging stock price and turmoil at the top is
    going to turn aside one of the biggest tidal waves in the history of
    technology (x86)?

    >>>Off-hand, other things: 1) I don't see the XDR memory sub-system being
    >>>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
    >>>CPU is kinda slim... without reworking the memory interface to get to 8GB
    >>>per CPU;

    >>
    >>Have you looked at the I/O bandwidth?
    >>
    >>http://research.scea.com/research/html/CellGDC05/07.html
    >>
    >>Four cell processors=2GB. Probably no more NUMA than Opteron.

    >
    >Well it would seem that the inter-CPU communications/coherency is less well
    >defined for the moment and there's a *big* difference between the current
    >256MB/CPU of Cell and Opteron's 16GB/CPU.
    >

    Maybe an issue if you want to use it for in-memory databases or a
    server, but not so much so for computationally-intensive work.

    >>>2) 32-bit FPU is not going to fly as a general purpose computer.
    >>>

    >>SPE's can do IEEE-compliant double precision. Just ten times more
    >>slowly.
    >>
    >>>>...and then I woke up.
    >>>
    >>>I hope this guy has a spare grungy garage for his efforts - seems like that
    >>>is part of the template for success he is aiming to emulate... cf. Dell,
    >>>Apple, et al. :)

    >>

    And I think he's got the wrong product. If IBM isn't working on a
    Blue Gene style card already, I'll be amazed.

    >>I don't think Trounson is _necessarily_ wrong about how important Cell
    >>might be, but that clunker about PCI-X is hard to get past, never mind
    >>the wild speculation about Intel. Maybe he just had too much coffee
    >>and too little sleep and never figured anyone would be so desperate as
    >>to write a web article off his email.

    >
    >It's hard to fathom what *might* be sitting in a lab right now or what NDAs
    >might be in place, but as it stands, it appears that there's a lot of work
    >to do to bring it into use in a general purpose computer.


    The "front-end" is a PowerPC. Multi-threaded and in-order, but a
    PowerPC, nevertheless. The compiler exists. I'll bet there is even
    significant experience getting it to work with DSP coprocessors.

    There is always the cautionary tale of Itanium (which could wind up
    looking more than a little bit like Cell). Intel was much better
    positioned than Sony, it had much greater resources, and how far has
    it gotten?

    RM
     
  7. "Robert Myers" <> wrote in message
    news:...
    >
    > Spin off a task speculatively (or on less than perfect
    > information), execute in local memory, and commit only when whatever
    > predicate conditions are satisfied (or throw the result away).


    Wow! Hand-tuned assembly language whose carefully crafted results get
    thrown out. That looks like a very efficient way to develop modern
    software! ;-)
     
  8. Robert Myers

    Robert Myers Guest

    On Sat, 26 Mar 2005 13:58:14 GMT, "Felger Carbon" <>
    wrote:

    >"Robert Myers" <> wrote in message
    >news:...
    >>
    >> Spin off a task speculatively (or on less than perfect
    >> information), execute in local memory, and commit only when whatever
    >> predicate conditions are satisfied (or throw the result away).

    >
    >Wow! Hand-tuned assembly language whose carefully crafted results get
    >thrown out. That looks like a very efficient way to develop modern
    >software! ;-)
    >

    I wasn't expecting it to be produced as hand-tuned assembly. You
    forget my involvement with Itanium. Everything will be possible with
    a compiler...one day.

    Itanium compilers are already a fair bit of the way down this road.
    You identify a task you can't be sure is safe because of data
    ambiguity. You set a predicate condition, execute the task, and check
    the predicate.

    With multiple execution units sitting on a bus connected to the CPU,
    you don't have to wring your hands so much over the costs of spinning
    off an execution path without full information. It should be no
    harder than Itanium predicated execution and maybe much easier.
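
    For instance, a rough C analogue of the pattern (an illustration
    only, not actual Itanium compiler output):

        /* Speculate past a possibly-conflicting store, then check the
           "predicate" (no aliasing) before committing the result. */
        #include <stdio.h>

        int main(void)
        {
            int a[4] = {1, 2, 3, 4};
            int *p = &a[0], *q = &a[2];   /* may or may not alias       */

            int speculated = *q + 10;     /* hoisted load and compute   */
            *p = 99;                      /* possibly-conflicting store */

            if (p != q)                   /* predicate: no aliasing     */
                a[3] = speculated;        /* commit speculative result  */
            else
                a[3] = *q + 10;           /* redo with the right data   */

            printf("%d\n", a[3]);
            return 0;
        }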

    RM
     
  9. On Sat, 26 Mar 2005 07:29:33 -0500, Robert Myers <>
    wrote:

    >On Sat, 26 Mar 2005 05:15:27 -0500, George Macdonald
    ><fammacd=!SPAM^> wrote:
    >
    >>On Fri, 25 Mar 2005 19:52:59 -0500, Robert Myers <>
    >>wrote:
    >>
    >>>On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
    >>><fammacd=!SPAM^> wrote:

    >>

    >
    ><snip>
    >
    >>
    >>>>One "little" flaw I see - there is talk of:
    >>>>
    >>>>
    >>>>
    >>>>Who is going to pay for the hardware and software for development? IBM has
    >>>>not been good at giving anything away, even to developers and certainly not
    >>>>speculatively. That was the main reason for the failure of OS/2. I've
    >>>>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
    >>>>and Alpha... all for nothing... money down the drain - we won't do that
    >>>>again. OTOH I have to confess I do not understand the open source business
    >>>>"model".<shrug>
    >>>>
    >>>Umm, I guess you don't. :).

    >>
    >>No, I just don't see how programmers are supposed to pay the rent, unless
    >>maybe they've been anointed by one of the self-appointed OS-gurus.
    >>

    >This is a big subject, and I won't insult you by taking a weak flyer
    >at it. The google
    >
    >economics "open source"
    >
    >would be a good start.


    Oh I've already read a bit on it and it just doesn't make sense to me. One
    case in point: an often mentioned OS factoid has "geeks" playing in their
    "spare time" to create software; if, as usually presented, they are also
    professional programmers "during the day", they are very likely breaking
    their employment agreement. Add in the fact that many (most) professional
    programmers work *some* overtime for their employers and often at odd
    hours, under pressure, the whole concept of OS is a fantasy to me.

    >>>That's why SCO is taking aim at IBM. Without IBM pumping its own
    >>>serious money into Linux, Linux would be nowhere near where it is now,
    >>>and IBM _is_ giving stuff away. In return, it has a nice growing
    >>>Linux server business (and a pesky lawsuit, to be sure).
    >>>
    >>>I don't see anything wrong with the idea of IBM funding relevant
    >>>development, but I think it very unlikely that IBM will go after
    >>>anything that would wind up in a PC...unless, of course, IBM had
    >>>something _really_ devious in mind in selling off its PC business.

    >>
    >>Giving stuff away and giving it to the right people are two different
    >>scenarios. If you've ever been on the good end of an IBM give-away, you'll
    >>know that it is not a comfortable position. As for the PC, it is not going
    >>away any time soon, so there'd better be some vision of how Cell fits into
    >>that slot... Apple's second chance??:)
    >>

    >The more I look at Cell, the more I am convinced I don't understand
    >how it will be used. Or rather, I can imagine ways in which it can be
    >used, but I'm not sure those are the only ways. The more I look at
    >the architecture, the more I like it, and I see lots of possibilities.
    >
    >It's easiest to imagine the SPE's processing a bunch of content or
    >doing number crunching as a stream processor, but I can also imagine
    >using all those SPE's to overcome the natural limitations of the
    >in-order PPC: Spin off a task speculatively (or on less than perfect
    >information), execute in local memory, and commit only when whatever
    >predicate conditions are satisfied (or throw the result away).
    >
    >The SPE's can also be isolated (I think) from the world of everyday
    >interrupts, and I think that might offer some serious advantages for
    >the processor.
    >
    >But the question, of course, is not, are there interesting things one
    >might try, but will any of those things actually be made to work and
    >what do you get as a payoff. It seems reasonably certain you could
    >make Cell function as a PC processor if you wanted to. The question
    >is: why would you want to?


    So my question is: what else (useful) will you do with it?... make ASPs out
    of it? I don't think so - even IT can't make its politics work there. If
    you can build game boxes and super computers with it, why not PCs? As
    Apple's next (or next/next) CPU it may not be that far fetched - obviously
    they already have the PPC part down.

    >David Wang is worried about the software model. That doesn't worry me
    >so much. What does worry me is that Sony is in such turmoil and has
    >never been able to make the "profit is in the content" model really
    >pay off for them (and, as far as I can tell, only Apple, in a field
    >of many entrants, has succeeded at that game). A weakened and
    >distracted Sony with a sagging stock price and turmoil at the top is
    >going to turn aside one of the biggest tidal waves in the history of
    >technology (x86)?


    I agree with David - the software environment is necessarily horribly
    complex and AFAICT at this stage, needs programmers of a calibre which is
    not commonly found... near genius even. Mr. Trounson's runtime compiler is
    a *very* old idea, which has had no takers till now.

    >>>>Off-hand, other things: 1) I don't see the XDR memory sub-system being
    >>>>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
    >>>>CPU is kinda slim... without reworking the memory interface to get to 8GB
    >>>>per CPU;
    >>>
    >>>Have you looked at the I/O bandwidth?
    >>>
    >>>http://research.scea.com/research/html/CellGDC05/07.html
    >>>
    >>>Four cell processors=2GB. Probably no more NUMA than Opteron.

    >>
    >>Well it would seem that the inter-CPU communications/coherency is less well
    >>defined for the moment and there's a *big* difference between the current
    >>256MB/CPU of Cell and Opteron's 16GB/CPU.
    >>

    >Maybe an issue if you want to use it for in-memory databases or a
    >server, but not so much so for computationally-intensive work.


    They're not even in the same ballpark. We already hear talk of (PC) game
    developers raving about the >4GB address space of x86-64 and what they're
    going to do with it; I guess Sony is not anticipating doing similar things
    for PS3 players??

    >>>I don't think Trounson is _necessarily_ wrong about how important Cell
    >>>might be, but that clunker about PCI-X is hard to get past, never mind
    >>>the wild speculation about Intel. Maybe he just had too much coffee
    >>>and too little sleep and never figured anyone would be so desperate as
    >>>to write a web article off his email.

    >>
    >>It's hard to fathom what *might* be sitting in a lab right now or what NDAs
    >>might be in place, but as it stands, it appears that there's a lot of work
    >>to do to bring it into use in a general purpose computer.

    >
    >The "front-end" is a PowerPC. Multi-threaded and in-order, but a
    >PowerPC, nevertheless. The compiler exists. I'll bet there is even
    >significant experience getting it to work with DSP coprocessors.


    It still looks like a steep slope to me... starting with the memory
    interface. Dave has outlined how to do it, to get to 4GB with 512Mb chips,
    but until it's actually done we don't really know.

    >There is always the cautionary tale of Itanium (which could wind up
    >looking more than a little bit like Cell). Intel was much better
    >positioned than Sony, it had much greater resources, and how far has
    >it gotten?


    So you're not tempted to have a little flutter on RMBS? The pump 'n'
    dumpers seem to have gone cold on it with the Infineon deal - are they not
    paying attention?:)

    --
    Rgds, George Macdonald
     
  10. Robert Myers

    Robert Myers Guest

    On Sat, 26 Mar 2005 20:04:26 -0500, George Macdonald
    <fammacd=!SPAM^> wrote:

    >On Sat, 26 Mar 2005 07:29:33 -0500, Robert Myers <>
    >wrote:
    >
    >>On Sat, 26 Mar 2005 05:15:27 -0500, George Macdonald
    >><fammacd=!SPAM^> wrote:
    >>
    >>>On Fri, 25 Mar 2005 19:52:59 -0500, Robert Myers <>
    >>>wrote:
    >>>
    >>>>On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
    >>>><fammacd=!SPAM^> wrote:
    >>>

    >>
    >><snip>
    >>
    >>>
    >>>>>One "little" flaw I see - there is talk of:
    >>>>>
    >>>>>
    >>>>>
    >>>>>Who is going to pay for the hardware and software for development? IBM has
    >>>>>not been good at giving anything away, even to developers and certainly not
    >>>>>speculatively. That was the main reason for the failure of OS/2. I've
    >>>>>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
    >>>>>and Alpha... all for nothing... money down the drain - we won't do that
    >>>>>again. OTOH I have to confess I do not understand the open source business
    >>>>>"model".<shrug>
    >>>>>
    >>>>Umm, I guess you don't. :).
    >>>
    >>>No, I just don't see how programmers are supposed to pay the rent, unless
    >>>maybe they've been anointed by one of the self-appointed OS-gurus.
    >>>

    >>This is a big subject, and I won't insult you by taking a weak flyer
    >>at it. The google
    >>
    >>economics "open source"
    >>
    >>would be a good start.

    >
    >Oh I've already read a bit on it and it just doesn't make sense to me. One
    >case in point: an often mentioned OS factoid has "geeks" playing in their
    >"spare time" to create software; if, as usually presented, they are also
    >professional programmers "during the day", they are very likely breaking
    >their employment agreement. Add in the fact that many (most) professional
    >programmers work *some* overtime for their employers and often at odd
    >hours, under pressure, the whole concept of OS is a fantasy to me.
    >

    You've obviously been reading the output of the Alexis de Tocqueville
    Institute. I wonder how much code is really written that way. Open
    Source has been awfully professionalized.

    There are so many different business models: The money is in _______.

    (a) Hardware.
    (b) Software.
    (c) Services.
    (d) Content.

    Give away whatever isn't the source of revenue to tap into whatever
    is. Or, as in the case of open source software, use controversial dual
    licensing to give away software to establish it as a standard so you
    can sell it.

    <snip>

    >>>Giving stuff away and giving it to the right people are two different
    >>>scenarios. If you've ever been on the good end of an IBM give-away, you'll
    >>>know that it is not a comfortable position. As for the PC, it is not going
    >>>away any time soon, so there'd better be some vision of how Cell fits into
    >>>that slot... Apple's second chance??:)
    >>>


    <snip>

    >>
    >>But the question, of course, is not, are there interesting things one
    >>might try, but will any of those things actually be made to work and
    >>what do you get as a payoff. It seems reasonably certain you could
    >>make Cell function as a PC processor if you wanted to. The question
    >>is: why would you want to?

    >
    >So my question is: what else (useful) will you do with it?... make ASPs out
    >of it? I don't think so - even IT can't make its politics work there. If
    >you can build game boxes and super computers with it, why not PCs? As
    >Apple's next (or next/next) CPU it may not be that far fetched - obviously
    >they already have the PPC part down.
    >

    Well, but _why_? That's what we have yet to see. Only if it turns
    out that you can give the user a completely different experience, or
    if Apple and IBM can't come to terms on continuing the current
    relationship.

    The other model is that a digital home entertainment center displaces
    the PC. As far as the PC functions are concerned, it's probably more
    of a thin client than a PC. Apple and Sony could do that in
    partnership. I doubt either can do it alone.

    >>David Wang is worried about the software model. That doesn't worry me
    >>so much. What does worry me is that Sony is in such turmoil and has
    >>never been able to make the "profit is in the content" model really
    >>pay off for them (and, as far as I can tell, only Apple, in a field
    >>of many entrants, has succeeded at that game). A weakened and
    >>distracted Sony with a sagging stock price and turmoil at the top is
    >>going to turn aside one of the biggest tidal waves in the history of
    >>technology (x86)?

    >
    >I agree with David - the software environment is necessarily horribly
    >complex and AFAICT at this stage, needs programmers of a calibre which is
    >not commonly found... near genius even. Mr. Trounson's runtime compiler is
    >a *very* old idea, which has had no takers till now.
    >

    "The software is going to be the problem" would have been a pretty
    safe bet through much of the history of computing.

    Sony claims the SPE's can be programmed in C, but the Open Source
    model implicitly assumes that gcc (or equivalent) is the universal
    translator, and it's hard to imagine gcc ever being up to the task of
    taking advantage of SPE's without explicit programmer intervention.
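
    By "explicit programmer intervention" I mean code of roughly this
    shape - the programmer, not gcc, decides how work gets tiled through
    an SPE's local store (a sketch only; dma_get/dma_put are invented
    names, with memcpy standing in for real local-store DMA):

        #include <string.h>

        #define CHUNK 4096

        /* Stand-ins for local-store DMA transfers. */
        static void dma_get(float *d, const float *s, long n)
        { memcpy(d, s, n * sizeof *d); }
        static void dma_put(float *d, const float *s, long n)
        { memcpy(d, s, n * sizeof *d); }

        void scale(float *big, long n, float k)
        {
            float local[CHUNK];              /* the "local store" buffer */
            for (long i = 0; i < n; i += CHUNK) {
                long m = (n - i < CHUNK) ? n - i : CHUNK;
                dma_get(local, big + i, m);  /* pull a tile in           */
                for (long j = 0; j < m; j++)
                    local[j] *= k;           /* compute entirely locally */
                dma_put(big + i, local, m);  /* push the tile back out   */
            }
        }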

    I'm not sure that the real problem with Cell isn't that it is coming
    along at the wrong time. Too much is already in place, and too much
    would have to be reinvented to get out of Cell even a fraction of the
    potential that might be there. Suppose Cell were the central hardware
    for a Project MAC? Given a blank piece of paper, people can be
    awfully inventive.

    >>>>>Off-hand, other things: 1) I don't see the XDR memory sub-system being
    >>>>>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
    >>>>>CPU is kinda slim... without reworking the memory interface to get to 8GB
    >>>>>per CPU;
    >>>>
    >>>>Have you looked at the I/O bandwidth?
    >>>>
    >>>>http://research.scea.com/research/html/CellGDC05/07.html
    >>>>
    >>>>Four cell processors=2GB. Probably no more NUMA than Opteron.
    >>>
    >>>Well it would seem that the inter-CPU communications/coherency is less well
    >>>defined for the moment and there's a *big* difference between the current
    >>>256MB/CPU of Cell and Opteron's 16GB/CPU.
    >>>

    >>Maybe an issue if you want to use it for in-memory databases or a
    >>server, but not so much so for computationally-intensive work.

    >
    >They're not even in the same ballpark. We already hear talk of (PC) game
    >developers raving about the >4GB address space of x86-64 and what they're
    >going to do with it; I guess Sony is not anticipating doing similar things
    >for PS3 players??
    >

    I can easily believe that games will eventually entail very large
    amounts of state. If the memory interface has to be reworked, it has
    to be reworked.

    <snip>

    >>There is always the cautionary tale of Itanium (which could wind up
    >>looking more than a little bit like Cell). Intel was much better
    >>positioned than Sony, it had much greater resources, and how far has
    >>it gotten?

    >
    >So you're not tempted to have a little flutter on RMBS? The pump 'n'
    >dumpers seem to have gone cold on it with the Infineon deal - are they not
    >paying attention?:)


    I suspect the markets have already discounted RMBS benefitting from
    Playstation 3, which, after all, is just a followon to Playstation 2.

    RM
     
  11. On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <>
    wrote:

    >On Sat, 26 Mar 2005 20:04:26 -0500, George Macdonald
    ><fammacd=!SPAM^> wrote:
    >
    >>
    >>Oh I've already read a bit on it and it just doesn't make sense to me. One
    >>case in point: an often mentioned OS factoid has "geeks" playing in their
    >>"spare time" to create software; if, as usually presented, they are also
    >>professional programmers "during the day", they are very likely breaking
    >>their employment agreement. Add in the fact that many (most) professional
    >>programmers work *some* overtime for their employers and often at odd
    >>hours, under pressure, the whole concept of OS is a fantasy to me.
    >>

    >You've obviously been reading the output of the Alexis de Tocqueville
    >Institute. I wonder how much code is really written that way. Open
    >Source has been awfully professionalized.


    No, it's a fairly regularly mentioned scenario to describe how OS works -
    do the search yourself. How can you say something is "professionalized"
    when the program design and coding has to be given away?

    >There are so many different business models: The money is in _______.
    >
    >(a) Hardware.
    >(b) Software.
    >(c) Services.
    >(d) Content.
    >
    >Give away whatever isn't the source of revenue to tap into whatever
    >is. Or, as in the case of open source software, use controversial dual
    >licensing to give away software to establish it as a standard so you
    >can sell it.


    So the (b) above is not a source of revenue any longer then! So let's say
    I come up with a novel, revolutionary algorithm, e.g. practical solver for
    the traveling salesman problem with true optimal solutions; I then design
    the method for implementation and code it all up. Now I'm supposed to give
    it away because it uses libraries which are OS?

    No, I can see where OS *might* be useful when the algorithms & methods used
    for a particular sub-system are commonly known and all that's needed is
    "yet another" version of the same old widget. Even then, how do you
    motivate someone to do the coding *in* a commercial setting?... IOW not
    some student or graduate who wants to impress?

    ><snip>


    >>>
    >>>But the question, of course, is not, are there interesting things one
    >>>might try, but will any of those things actually be made to work and
    >>>what do you get as a payoff. It seems reasonably certain you could
    >>>make Cell function as a PC processor if you wanted to. The question
    >>>is: why would you want to?

    >>
    >>So my question is: what else (useful) will you do with it?... make ASPs out
    >>of it? I don't think so - even IT can't make its politics work there. If
    >>you can build game boxes and super computers with it, why not PCs? As
    >>Apple's next (or next/next) CPU it may not be that far fetched - obviously
    >>they already have the PPC part down.
    >>

    >Well, but _why_? That's what we have yet to see. Only if it turns
    >out that you can give the user a completely different experience, or
    >if Apple and IBM can't come to terms on continuing the current
    >relationship.


    Why?... the usual quest for better & faster widgets to sell.

    >The other model is that a digital home entertainment center displaces
    >the PC. As far as the PC functions are concerned, it's probably more
    >of a thin client than a PC. Apple and Sony could do that in
    >partnership. I doubt either can do it alone.


    I don't think either needs the other and I don't see why such a powerful
    engine is limited to a thin client. What is it going to connect to for the
    "work"?... not the Internet.

    --
    Rgds, George Macdonald
     
  12. Robert Myers

    Robert Myers Guest

    On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald
    <fammacd=!SPAM^> wrote:

    >On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <>
    >wrote:
    >
    >>On Sat, 26 Mar 2005 20:04:26 -0500, George Macdonald
    >><fammacd=!SPAM^> wrote:
    >>
    >>>
    >>>Oh I've already read a bit on it and it just doesn't make sense to me. One
    >>>case in point: an often mentioned OS factoid has "geeks" playing in their
    >>>"spare time" to create software; if, as usually presented, they are also
    >>>professional programmers "during the day", they are very likely breaking
    >>>their employment agreement. Add in the fact that many (most) professional
    >>>programmers work *some* overtime for their employers and often at odd
    >>>hours, under pressure, the whole concept of OS is a fantasy to me.
    >>>

    >>You've obviously been reading the output of the Alexis de Tocqueville
    >>Institute. I wonder how much code is really written that way. Open
    >>Source has been awfully professionalized.

    >
    >No, it's a fairly regularly mentioned scenario to describe how OS works -
    >do the search yourself. How can you say something is "professionalized"
    >when the program design and coding has to be given away?
    >
    >>There are so many different business models: The money is in _______.
    >>
    >>(a) Hardware.
    >>(b) Software.
    >>(c) Services.
    >>(d) Content.
    >>
    >>Give away whatever isn't the source of revenue to tap into whatever
    >>is. Or, as in the case of open source software, use controversial dual
    >>licensing to give away software to establish it as a standard so you
    >>can sell it.

    >
    >So the (b) above is not a source of revenue any longer then!
    >

    RedHat certainly thinks (b) can be a source of revenue, but Wall
    Street seems increasingly skeptical:

    http://www.forbes.com/markets/2005/03/24/cx_el_0324weekmarkets.html

    <quote>

    Red Hat (nasdaq: RHAT - news - people ) will report fiscal
    fourth-quarter earnings on Thursday. The Street is expecting earnings
    of 6 cents per share on revenue of $56 million. Earlier this month
    Standard & Poor's Equity Research downgraded to "sell" from "hold" and
    cut the target price, citing expectations for further pricing pressure
    for Linux software and services, which "could negatively impact
    shares" in the near term.

    </quote>

    >>So let's say
    >>I come up with a novel, revolutionary algorithm, e.g. practical solver for
    >>the traveling salesman problem with true optimal solutions; I then design
    >>the method for implementation and code it all up. Now I'm supposed to give
    >>it away because it uses libraries which are OS?


    Highly-specialized software is staying closed source mostly, isn't it?

    >No, I can see where OS *might* be useful when the algorithms & methods used
    >for a particular sub-system are commonly known and all that's needed is
    >"yet another" version of the same old widget. Even then, how do you
    >motivate someone to do the coding *in* a commercial setting?... IOW not
    >some student or graduate who wants to impress?
    >

    Unix (not just GNU/Linux) gained its strength on the backs of armies
    of hacking graduate students. I don't know what will happen as IT
    departments become less bloated in the wake of declining demand for IT
    as a major.

    >><snip>

    >
    >>>>
    >>>>But the question, of course, is not, are there interesting things one
    >>>>might try, but will any of those things actually be made to work and
    >>>>what do you get as a payoff. It seems reasonably certain you could
    >>>>make Cell function as a PC processor if you wanted to. The question
    >>>>is: why would you want to?
    >>>
    >>>So my question is: what else (useful) will you do with it?... make ASPs out
    >>>of it? I don't think so - even IT can't make its politics work there. If
    >>>you can build game boxes and super computers with it, why not PCs? As
    >>>Apple's next (or next/next) CPU it may not be that far fetched - obviously
    >>>they already have the PPC part down.
    >>>

    >>Well, but _why_? That's what we have yet to see. Only if it turns
    >>out that you can give the user a completely different experience, or
    >>if Apple and IBM can't come to terms on continuing the current
    >>relationship.

    >
    >Why?... the usual quest for better & faster widgets to sell.
    >


    At the price of having to rewrite everything?

    Cell looks to me like the realization of many hardware fantasies, and
    a pretty slick one at that. Now what do we do with it?

    I mean, I can think of *lots* of things to do with Cell. I just don't
    know how many of them are going to get done in a way that will have
    any kind of market impact. Cell looks like a natural dataflow
    processor to me, but how many dataflow programmers are there out
    there?

    In the mid-nineties, people (not just the email that started this
    thread) would be saying that Cell would slay the twin dragons of
    Wintel. People would be fantasizing about who was going to make how
    much money doing it. Gates/Ballmer would be whipping the staff into a
    hysterical frenzy, and Microsoft would be announcing unbelievable
    vaporware. I guess the champagne bottle has just been sitting open
    for too long.

    >>The other model is that a digital home entertainment center displaces
    >>the PC. As far as the PC functions are concerned, it's probably more
    >>of a thin client than a PC. Apple and Sony could do that in
    >>partnership. I doubt either can do it alone.

    >
    >I don't think either needs the other and I don't see why such a powerful
    >engine is limited to a thin client. What is it going to connect to for the
    >"work"?... not the Internet.


    Everybody seems to be talking about what a powerful processor of media
    streams Cell will be. That, other than games for Playstation 3, seems
    to be the only guaranteed application. How much on-the-fly processing
    can media streams absorb, anyway? On-the-fly realization for
    multi-player games? How would I know?

    As to why I'm back on the thin-client bandwagon (never got off it,
    really), it's one easy way to finesse the "That's a really nice chip,
    now where's the software?" problem. Do whatever you find convenient
    locally, do whatever you find inconvenient remotely.

    Don't know how your remote desktops work or if you even use them, but
    I can definitely tell that a remote desktop is remote, even at
    100Mbps, using any of the standard tools available to me. I'm
    assuming that with better on-the-fly processing, one could do much
    better, and one will have to do much better to make a thin client over
    the internet acceptable.

    RM
     
  13. keith

    keith Guest

    On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald wrote:

    > On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <>
    > wrote:


    >>You've obviously been reading the output of the Alexis de Tocqueville
    >>Institute. I wonder how much code is really written that way. Open
    >>Source has been awfully professionalized.

    >
    > No, it's a fairly regularly mentioned scenario to describe how OS works -
    > do the search yourself. How can you say something is "professionalized"
    > when the program design and coding has to be given away?
    >
    >>There are so many different business models: The money is in _______.
    >>
    >>(a) Hardware.
    >>(b) Software.
    >>(c) Services.
    >>(d) Content.
    >>
    >>Give away whatever isn't the source of revenue to tap into whatever
    >>is. Or, as in the case of open source software, use controversial dual
    >>licensing to give away software to establish it as a standard so you
    >>can sell it.

    >
    > So the (b) above is not a source of revenue any longer then!


    For some models, no. For others, certainly. It's a matter of what *you*
    decide are your razors and what are your blades.

    > So let's say
    > I come up with a novel, revolutionary algorithm, e.g. practical solver
    > for the traveling salesman problem with true optimal solutions; I then
    > design the method for implementation and code it all up. Now I'm
    > supposed to give it away because it uses libraries which are OS?


    There is no requirement to do this. You can keep *your* code private. If
    that's what you're selling, it even makes sense. ;-)

    > No, I can see where OS *might* be useful when the algorithms & methods
    > used for a particular sub-system are commonly known and all that's
    > needed is "yet another" version of the same old widget. Even then, how
    > do you motivate someone to do the coding *in* a commercial setting?...
    > IOW not some student or graduate who wants to impress?


    OS <> Applications <> algorithms.

    <snip>

    --
    Keith
     
  14. On Mon, 28 Mar 2005 15:59:35 -0500, Robert Myers <>
    wrote:

    >On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald
    ><fammacd=!SPAM^> wrote:
    >
    >>On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <>
    >>wrote:
    >>


    >>>You've obviously been reading the output of the Alexis de Tocqueville
    >>>Institute. I wonder how much code is really written that way. Open
    >>>Source has been awfully professionalized.

    >>
    >>No, it's a fairly regularly mentioned scenario to describe how OS works -
    >>do the search yourself. How can you say something is "professionalized"
    >>when the program design and coding has to be given away?
    >>
    >>>There are so many different business models: The money is in _______.
    >>>
    >>>(a) Hardware.
    >>>(b) Software.
    >>>(c) Services.
    >>>(d) Content.
    >>>
    >>>Give away whatever isn't the source of revenue to tap into whatever
    >>>is. Or, as in the case of open source sotware, use controversial dual
    >>>licensing to give away software to establish it as a standard so you
    >>>can sell it.

    >>
    >>So the (b) above is not a source of revenue any longer then!
    >>

    >RedHat certainly thinks (b) can be a source of revenue, but Wall
    >Street seems increasingly skeptical:


    I thought we were talking about earning $$ from developing software -
    paying analysts and programmers to design and write code. When you pay
    RedHat it's to cover all the ancillaries, like advertising, packaging,
    admin., etc., plus, AIUI, some form of support.

    >http://www.forbes.com/markets/2005/03/24/cx_el_0324weekmarkets.html
    >
    ><quote>
    >
    >Red Hat (nasdaq: RHAT - news - people ) will report fiscal
    >fourth-quarter earnings on Thursday. The Street is expecting earnings
    >of 6 cents per share on revenue of $56 million. Earlier this month
    >Standard & Poor's Equity Research downgraded to "sell" from "hold" and
    >cut the target price, citing expectations for further pricing pressure
    >for Linux software and services, which "could negatively impact
    >shares" in the near term.
    >
    ></quote>
    >
    >>>So let's say
    >>>I come up with a novel, revolutionary algorithm, e.g. practical solver for
    >>>the traveling salesman problem with true optimal solutions; I then design
    >>>the method for implementation and code it all up. Now I'm supposed to give
    >>>it away because it uses libraries which are OS?

    >
    >Highly-specialized software is staying closed source mostly, isn't it?


    There's a huge (dynamic) fuzzy area there - today's technology is
    tomorrow's commodity of course but maybe you're right: software is about to
    enter a new era where it leaves behind the whoring-model... "ya got it, ya
    sell it... ya still got it".;-)

    >>No, I can see where OS *might* be useful when the algorithms & methods used
    >>for a particular sub-system are commonly known and all that's needed is
    >>"yet another" version of the same old widget. Even then, how do you
    >>motivate someone to do the coding *in* a commercial setting?... IOW not
    >>some student or graduate who wants to impress?
    >>

    >Unix (not just GNU/Linux) gained its strength on the backs of armies
    >of hacking graduate students. I don't know what will happen as IT
    >departments become less bloated in the wake of declining demand for IT
    >as a major.


    Ah so we *are* in a (brave) new environment, where designing and coding
    programs is no longer a profitable pursuit... unless you have a novel
    algorithmic twist?

    >>><snip>


    >>>Well, but _why_? That's what we have yet to see. Only if it turns
    >>>out that you can give the user a completely different experience, or
    >>>if Apple and IBM can't come to terms on continuing the current
    >>>relationship.

    >>
    >>Why?... the usual quest for better & faster widgets to sell.
    >>

    >
    >At the price of having to rewrite everything?


    Ya mean like Itanium?:) I'd gotten the impression that the mundane stuff
    would just run on the PPC core and then... for newer creative stuff you
    could get more adventurous with the SPEs - no? IOW whatever fits in the
    porta-"C" category, and much of that is not performance-critical, just do
    it - the real bonus is in the rest.

    >Cell looks to me like the realization of many hardware fantasies, and
    >a pretty slick one at that. Now what do we do with it?
    >
    >I mean, I can think of *lots* of things to do with Cell. I just don't
    >know how many of them are going to get done in a way that will have
    >any kind of market impact. Cell looks like a natural dataflow
    >processor to me, but how many dataflow programmers are there out
    >there?
    >
    >In the mid-nineties, people (not just the email that started this
    >thread) would be saying that Cell would slay the twin dragons of
    >Wintel. People would be fantasizing about who was going to make how
    >much money doing it. Gates/Ballmer would be whipping the staff into a
    >hysterical frenzy, and Microsoft would be announcing unbelievable
    >vaporware. I guess the champagne bottle has just been sitting open
    >for too long.


    After Alpha, and err, Itanium, plus MIPS & Risc-6K/Power in the Windows
    arena, it gets harder to get excited.... sobriety?:)

    >>>The other model is that a digital home entertainment center displaces
    >>>the PC. As far as the PC functions are concerned, it's probably more
    >>>of a thin client than a PC. Apple and Sony could do that in
    >>>partnership. I doubt either can do it alone.

    >>
    >>I don't think either needs the other and I don't see why such a powerful
    >>engine is limited to a thin client. What is it going to connect to for the
    >>"work"?... not the Internet.

    >
    >Everybody seems to be talking about what a powerful processor of media
    >streams Cell will be. That, other than games for Playstation 3, seems
    >to be the only guaranteed application. How much on-the-fly processing
    >can media streams absorb, anyway? On-the-fly realization for
    >multi-player games? How would I know?
    >
    >As to why I'm back on the thin-client bandwagon (never got off it,
    >really), it's one easy way to finesse the "That's a really nice chip,
    >now where's the software?" problem. Do whatever you find convenient
    >locally, do whatever you find inconvenient remotely.
    >
    >Don't know how your remote desktops work or if you even use them, but
    >I can definitely tell that a remote desktop is remote, even at
    >100 Mbps, using any of the standard tools available to me. I'm
    >assuming that with better on-the-fly processing, one could do much
    >better, and one will have to do much better to make a thin client over
    >the Internet acceptable.
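
    To put rough numbers on "remote, even at 100 Mbps": a back-of-the-envelope
    sketch in C, with the resolution, color depth, and frame rate all assumed
    purely for illustration.

        /* Why an uncompressed remote desktop feels remote on a 100 Mbit/s
           link. Every figure below is an illustrative assumption. */
        #include <stdio.h>

        int main(void)
        {
            double w = 1280.0, h = 1024.0;  /* assumed desktop resolution   */
            double bpp = 24.0, fps = 30.0;  /* assumed color depth and rate */
            double raw = w * h * bpp * fps; /* raw stream, bits per second  */

            printf("raw stream: %.0f Mbit/s\n", raw / 1e6);   /* ~944      */
            printf("compression needed on a 100 Mbit/s link: ~%.0fx\n",
                   raw / 100e6);           /* ~9x, before counting latency */
            return 0;
        }

    And even with that much compression, the round-trip latency is still
    there - which is exactly why better on-the-fly processing would help.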


    When Larry E. first proposed his thin client "idea" [I know there were
    others but L.E. had the *big* $$ and *big* motivation] I came up with the
    term SQL*Nuts... kinda like the way his err, personal assistant was known
    as SQL*Slut. I haven't changed my mind. Now we've had IT-folk dreaming of
    the return of the glass houses but it still doesn't seem to be going
    anywhere fast. As I've said before: people hate public transportation.

    --
    Rgds, George Macdonald
     
  15. On Mon, 28 Mar 2005 22:28:26 -0500, keith <> wrote:

    >On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald wrote:
    >
    >> On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <>
    >> wrote:

    >
    >>>You've obviously been reading the output of the Alexis de Tocqueville
    >>>Institute. I wonder how much code is really written that way. Open
    >>>Source has been awfully professionalized.

    >>
    >> No, it's a fairly regularly mentioned scenario to describe how OS works -
    >> do the search yourself. How can you say something is "professionalized"
    >> when the program design and coding has to be given away?
    >>
    >>>There are so many different business models: The money is in _______.
    >>>
    >>>(a) Hardware.
    >>>(b) Software.
    >>>(c) Services.
    >>>(d) Content.
    >>>
    >>>Give away whatever isn't the source of revenue to tap into whatever
    >>>is. Or, as in the case of open source software, use controversial dual
    >>>licensing to give away software to establish it as a standard so you
    >>>can sell it.

    >>
    >> So the (b) above is not a source of revenue any longer then!

    >
    >For some models, no. For others, certainly. It's a matter of what *you*
    >decide are your razors and blades.


    Whatever is err, patentable?;-)

    >> So let's say
    >> I come up with a novel, revolutionary algorithm, e.g. practical solver
    >> for the traveling salesman problem with true optimal solutions; I then
    >> design the method for implementation and code it all up. Now I'm
    >> supposed to give it away because it uses libraries which are OS?

    >
    >There is no requirement to do this. You can keep *your* code private. If
    >that's what you're selling, it even makes sense. ;-)


    I'd rather pay for the OS, compiler and libraries and compete, unfettered
    by GPL-like impositions, on an even field.

    >> No, I can see where OS *might* be useful when the algorithms & methods
    >> used for a particular sub-system are commonly known and all that's
    >> needed is "yet another" version of the same old widget. Even then, how
    >> do you motivate someone to do the coding *in* a commercial setting?...
    >> IOW not some student or graduate who wants to impress?

    >
    >OS <> Applications <> algorithms.


    Of course, but there are obvious inter-dependencies. It also depends what
    is meant by an OS, which is generally assumed to include a certain
    repertoire of utility "apps". There are algorithms within algorithms and
    nothing "works" without them. BTW I am vehemently opposed to patenting of
    algorithms - we've seen enough damage there.

    --
    Rgds, George Macdonald
     
  17. Robert Myers

    Robert Myers Guest

    On Tue, 29 Mar 2005 15:58:54 -0500, George Macdonald
    <fammacd=!SPAM^> wrote:

    >On Mon, 28 Mar 2005 15:59:35 -0500, Robert Myers <>
    >wrote:
    >
    >>On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald
    >><fammacd=!SPAM^> wrote:
    >>
    >>>On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <>
    >>>wrote:
    >>>

    >
    >>>>You've obviously been reading the output of the Alexis de Tocqueville
    >>>>Institute. I wonder how much code is really written that way. Open
    >>>>Source has been awfully professionalized.
    >>>
    >>>No, it's a fairly regularly mentioned scenario to describe how OS works -
    >>>do the search yourself. How can you say something is "professionalized"
    >>>when the program design and coding has to be given away?
    >>>
    >>>>There are so many different business models: The money is in _______.
    >>>>
    >>>>(a) Hardware.
    >>>>(b) Software.
    >>>>(c) Services.
    >>>>(d) Content.
    >>>>
    >>>>Give away whatever isn't the source of revenue to tap into whatever
    >>>>is. Or, as in the case of open source software, use controversial dual
    >>>>licensing to give away software to establish it as a standard so you
    >>>>can sell it.
    >>>
    >>>So the (b) above is not a source of revenue any longer then!
    >>>

    >>RedHat certainly thinks (b) can be a source of revenue, but Wall
    >>Street seems increasingly skeptical:

    >
    >I thought we were talking about earning $$ from developing software -
    >paying analysts and programmers to design and write code. When you pay
    >RedHat it's to cover all the ancillaries, like advertising, packaging,
    >admin., etc., plus, AIUI, some form of support.
    >
    >>http://www.forbes.com/markets/2005/03/24/cx_el_0324weekmarkets.html
    >>
    >><quote>
    >>
    >>Red Hat (nasdaq: RHAT - news - people ) will report fiscal
    >>fourth-quarter earnings on Thursday. The Street is expecting earnings
    >>of 6 cents per share on revenue of $56 million. Earlier this month
    >>Standard & Poor's Equity Research downgraded to "sell" from "hold" and
    >>cut the target price, citing expectations for further pricing pressure
    >>for Linux software and services, which "could negatively impact
    >>shares" in the near term.
    >>
    >></quote>
    >>
    >>>>So let's say
    >>>>I come up with a novel, revolutionary algorithm, e.g. practical solver for
    >>>>the traveling salesman problem with true optimal solutions; I then design
    >>>>the method for implementation and code it all up. Now I'm supposed to give
    >>>>it away because it uses libraries which are OS?

    >>
    >>Highly-specialized software is staying closed source mostly, isn't it?

    >
    >There's a huge (dynamic) fuzzy area there - today's technology is
    >tomorrow's commodity of course but maybe you're right: software is about to
    >enter a new era where it leaves behind the whoring-model... "ya got it, ya
    >sell it... ya still got it".;-)
    >

    One of the very few things Edward Teller said that I agreed with was
    that the things that really make a difference in national security
    don't need to be classified because you can't write down, transmit, or
    easily steal the secrets, anyway. The prizes of World War II were the
    actual rocket scientists, not their blueprints or even prototypes.

    Players more or less _have_ to contribute to these communal efforts,
    and their assets are the people who really understand what's going on.
    Take your eye off the ball for a short period, and you're quickly out
    of the game.

    You don't want RedHat's actual packaged software? No problem. But if
    it breaks, you're on your own or at the mercy of community resources.
    That's neither free software nor commercial software, but RedHat _is_
    making money off software.

    From an end user's point of view, I don't know that the biggest
    concern works much differently either way. Unless your favorite
    software is kept up to date so that it can live happily with the
    latest kernel, you could be out of luck. Have it happen to you just
    once, spend some time digging through mailing lists trying to figure out
    how the kernel headers changed, and you realize what a problem it is.
    Wouldn't happen with commercial software? Look at your prized Watcom
    compiler.

    There is so much room for creativity that I don't really see that the
    GPL is all that much of a hindrance to making money. This is
    _America_, George.

    >>>No, I can see where OS *might* be useful when the algorithms & methods used
    >>>for a particular sub-system are commonly known and all that's needed is
    >>>"yet another" version of the same old widget. Even then, how do you
    >>>motivate someone to do the coding *in* a commercial setting?... IOW not
    >>>some student or graduate who wants to impress?
    >>>

    >>Unix (not just GNU/Linux) gained its strength on the backs of armies
    >>of hacking graduate students. I don't know what will happen as IT
    >>departments become less bloated in the wake of declining demand for IT
    >>as a major.

    >
    >Ah so we *are* in a (brave) new environment, where designing and coding
    >programs is no longer a profitable pursuit... unless you have a novel
    >algorithmic twist?
    >

    I think having an identified target market with money is more
    important than having a novel algorithmic twist.

    >>>><snip>

    >
    >>>>Well, but _why_? That's what we have yet to see. Only if it turns
    >>>>out that you can give the user a completely different experience, or
    >>>>if Apple and IBM can't come to terms on continuing the current
    >>>>relationship.
    >>>
    >>>Why?... the usual quest for better & faster widgets to sell.
    >>>

    >>
    >>At the price of having to rewrite everything?

    >
    >Ya mean like Itanium?:) I'd gotten the impression that the mundane stuff
    >would just run on the PPC core and then... for newer creative stuff you
    >could get more adventurous with the SPEs - no? IOW whatever fits in the
    >porta-"C" category, and much of that is not performance-critical, just do
    >it - the real bonus is in the rest.
    >

    I don't think so. The PowerPC part of Cell is really crippled
    relative to a G5. You really have to be able to exploit the SPEs to
    make Cell competitive, and I don't think any compiler anywhere is
    going to compile C or C++ to effective Cell software because the
    programming model is so different.

    Instead of letting the PowerPC do actual work, you let it create a
    thread and pass it off to an SPE. Then, if an SPE pipeline stalls on
    the task, you don't care so much because it's only 1 of 8, whereas
    the PPC has only two paths, both of them in-order.

    The natural programming model is something like Kahn networks or
    Synchronous Dataflow. Lots of work done, but applications would have
    to be rewritten at the source code level.
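
    Just to give the flavor - and emphatically not Cell SDK code, since
    there's no public SDK to point at - here is a minimal Kahn-style sketch
    in ordinary C and POSIX threads. The worker threads merely stand in for
    SPE kernels and the bounded channels for DMA'd streams; every name and
    number in it is an assumption for illustration.

        /* Minimal Kahn-network flavor in plain C + POSIX threads.
           Compile with: cc -pthread kahn.c */
        #include <pthread.h>
        #include <stdio.h>

        #define QSIZE  64   /* big enough that this little demo can't deadlock */
        #define NITEMS 16

        typedef struct {    /* bounded FIFO channel */
            int buf[QSIZE];
            int head, tail, count;
            pthread_mutex_t lock;
            pthread_cond_t  not_full, not_empty;
        } channel;

        static void chan_init(channel *c)
        {
            c->head = c->tail = c->count = 0;
            pthread_mutex_init(&c->lock, NULL);
            pthread_cond_init(&c->not_full, NULL);
            pthread_cond_init(&c->not_empty, NULL);
        }

        static void chan_put(channel *c, int v)   /* blocks while full */
        {
            pthread_mutex_lock(&c->lock);
            while (c->count == QSIZE)
                pthread_cond_wait(&c->not_full, &c->lock);
            c->buf[c->tail] = v;
            c->tail = (c->tail + 1) % QSIZE;
            c->count++;
            pthread_cond_signal(&c->not_empty);
            pthread_mutex_unlock(&c->lock);
        }

        static int chan_get(channel *c)           /* blocks while empty */
        {
            int v;
            pthread_mutex_lock(&c->lock);
            while (c->count == 0)
                pthread_cond_wait(&c->not_empty, &c->lock);
            v = c->buf[c->head];
            c->head = (c->head + 1) % QSIZE;
            c->count--;
            pthread_cond_signal(&c->not_full);
            pthread_mutex_unlock(&c->lock);
            return v;
        }

        static channel in_ch, mid_ch, out_ch;

        /* two stand-ins for SPE kernels: read a token, compute, write */
        static void *scale(void *arg)
        {
            int i;
            (void)arg;
            for (i = 0; i < NITEMS; i++)
                chan_put(&mid_ch, chan_get(&in_ch) * 2);
            return NULL;
        }

        static void *offset(void *arg)
        {
            int i;
            (void)arg;
            for (i = 0; i < NITEMS; i++)
                chan_put(&out_ch, chan_get(&mid_ch) + 1);
            return NULL;
        }

        int main(void)
        {
            pthread_t t1, t2;
            int i;

            chan_init(&in_ch); chan_init(&mid_ch); chan_init(&out_ch);

            /* the "PPC" spawns the kernels, then just feeds and drains */
            pthread_create(&t1, NULL, scale, NULL);
            pthread_create(&t2, NULL, offset, NULL);
            for (i = 0; i < NITEMS; i++)
                chan_put(&in_ch, i);
            for (i = 0; i < NITEMS; i++)
                printf("%d\n", chan_get(&out_ch));   /* 1, 3, 5, ... */

            pthread_join(t1, NULL);
            pthread_join(t2, NULL);
            return 0;
        }

    The point is the shape of it: the kernels touch nothing but their
    channels and block on their inputs, while the control thread only
    feeds and drains streams. That's the part a C compiler won't invent
    for you.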

    >>Cell looks to me like the realization of many hardware fantasies, and
    >>a pretty slick one at that. Now what do we do with it?
    >>
    >>I mean, I can think of *lots* of things to do with Cell. I just don't
    >>know how many of them are going to get done in a way that will have
    >>any kind of market impact. Cell looks like a natural dataflow
    >>processor to me, but how many dataflow programmers are out there?
    >>
    >>In the mid-nineties, people (not just the email that started this
    >>thread) would be saying that Cell would slay the twin dragons of
    >>Wintel. People would be fantasizing about who was going to make how
    >>much money doing it. Gates/Ballmer would be whipping the staff into a
    >>hysterical frenzy, and Microsoft would be announcing unbelievable
    >>vaporware. I guess the champagne bottle has just been sitting open
    >>for too long.

    >
    >After Alpha, and err, Itanium, plus MIPS & RS/6000/POWER in the Windows
    >arena, it gets harder to get excited.... sobriety?:)
    >

    But I'm not sure it isn't going to happen this time. We _are_ moving
    from single-processor to multi-processor execution. That train is
    leaving the station, with or without Cell. Now that I've seen Cell,
    though, I really like the possibilities.

    RM
     
  18. keith

    keith Guest

    On Tue, 29 Mar 2005 15:58:55 -0500, George Macdonald wrote:

    > On Mon, 28 Mar 2005 22:28:26 -0500, keith <> wrote:
    >
    >>On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald wrote:
    >>
    >>> On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <>
    >>> wrote:

    >>
    >>>>You've obviously been reading the output of the Alexis de Tocqueville
    >>>>Institute. I wonder how much code is really written that way. Open
    >>>>Source has been awfully professionalized.
    >>>
    >>> No, it's a fairly regularly mentioned scenario to describe how OS works -
    >>> do the search yourself. How can you say something is "professionalized"
    >>> when the program design and coding has to be given away?
    >>>
    >>>>There are so many different business models: The money is in _______.
    >>>>
    >>>>(a) Hardware.
    >>>>(b) Software.
    >>>>(c) Services.
    >>>>(d) Content.
    >>>>
    >>>>Give away whatever isn't the source of revenue to tap into whatever
    >>>>is. Or, as in the case of open source software, use controversial dual
    >>>>licensing to give away software to establish it as a standard so you
    >>>>can sell it.
    >>>
    >>> So the (b) above is not a source of revenue any longer then!

    >>
    >>For some models, no. For others, certainly. It's a matter of what *you*
    >>decide are your razors and blades.

    >
    > Whatever is err, patentable?;-)


    You forget that IBM turned over 500ish patents to the open-source
    community. You're not looking beyond the razors. You've just flunked
    Gillette marketing 101. ;-)

    >>> So let's say
    >>> I come up with a novel, revolutionary algorithm, e.g. practical solver
    >>> for the traveling salesman problem with true optimal solutions; I then
    >>> design the method for implementation and code it all up. Now I'm
    >>> supposed to give it away because it uses libraries which are OS?

    >>
    >>There is no requirement to do this. You can keep *your* code private. If
    >>that's what you're selling, it even makes sense. ;-)

    >
    > I'd rather pay for the OS, compiler and libraries and compete, unfettered
    > by GPL-like impositions, on an even field.


    You are not "fettered" by having used GPL tools. You may indeed sell your
    tools as OCO. IIRC, you may not package that code as part of yours. I'm
    not a frappin' programmer <spit>, but that's my understanding.

    Your understanding of employer relationships is a little out of date too.
    Many are encouraged to participate in OSS, within obvious
    conflict-of-interest barriers.

    >>> No, I can see where OS *might* be useful when the algorithms & methods
    >>> used for a particular sub-system are commonly known and all that's
    >>> needed is "yet another" version of the same old widget. Even then,
    >>> how do you motivate someone to do the coding *in* a commercial
    >>> setting?... IOW not some student or graduate who wants to impress?

    >>
    >>OS <> Applications <> algorithms.

    >
    > Of course, but there are obvious inter-dependencies. It also depends
    > what is meant by an OS, which is generally assumed to include a certain
    > repertoire of utility "apps". There are algorithms within algorithms
    > and nothing "works" without them. BTW I am vehemently opposed to
    > patenting of algorithms - we've seen enough damage there.


    I'm not sure I agree. I'm not sure I understand the difference between an
    algorithm and a process. Ok, I do work in the patent arena, but I do shy
    away from anything with software in it. Processes aren't software though,
    but it could easily be argued that they are algorithms. I'm not smart
    enough to know the difference. You?

    --
    Keith
     
  19. "Robert Myers" <> wrote in message
    news:...
    >

    snip
    > >

    > One of the very few things Edward Teller said that I agreed with was
    > that the things that really make a difference in national security
    > don't need to be classified because you can't write down, transmit, or
    > easily steal the secrets, anyway. The prizes of World War II were the
    > actual rocket scientists, not their blueprints or even prototypes.


    And after a while you could buy atom bomb kits in Pakistani supermarkets
    under the AQ Khan brand.
    Get a few grad students to put them together. Still as dangerous as
    they were back when you needed exotic scientists.

    Or maybe that cute little suitcase size nuke, the W31 as I recall, that
    the Chinese ended up cloning.

    I think if Teller really said that he was mistaken.

    It took Shockley et al to make the first transistor. Not any more.
    It took a genius at IBM to make the first high-temp superconductor. Now
    high school kids can make them.

    If I had the secret formula I could make Coke. I wouldn't need exotic
    training or skills.

    snip

    > I don't think so. The PowerPC part of Cell is really crippled
    > relative to a G5. You really have to be able to exploit the SPEs to
    > make Cell competitive, and I don't think any compiler anywhere is
    > going to compile C or C++ to effective Cell software because the
    > programming model is so different.
    >
    > Instead of letting the PowerPC do actual work, you let it create a
    > thread and pass it off to an SPE. Then, if an SPE pipeline stalls on
    > the task, you don't care so much because it's only 1 of 8, whereas
    > the PPC has only two paths, both of them in-order.
    >
    > The natural programming model is something like Kahn networks or
    > Synchronous Dataflow. Lots of work done, but applications would have
    > to be rewritten at the source code level.
    >

    snip
    > But I'm not sure it isn't going to happen this time. We _are_ moving
    > from single-processor to multi-processor execution. That train is
    > leaving the station, with or without Cell. Now that I've seen Cell,
    > though, I really like the possibilities.
    >
    > RM


    I would say that there are folks, perhaps the ones at Sony, who think
    that in the long run, or maybe even the medium run, Wintel will go
    the way of the dinosaur or maybe the vector supercomputer. :)

    del
     
  20. Robert Myers

    Robert Myers Guest

    On Wed, 30 Mar 2005 02:57:14 GMT, "Delbert Cecchi"
    <> wrote:

    >
    >"Robert Myers" <> wrote in message
    >news:...
    >>

    >snip
    >> >

    >> One of the very few things Edward Teller said that I agreed with was
    >> that the things that really make a difference in national security
    >> don't need to be classified because you can't write down, transmit, or
    >> easily steal the secrets, anyway. The prizes of World War II were the
    >> actual rocket scientists, not their blueprints or even prototypes.

    >
    >And after a while you could buy atom bomb kits in Pakistani supermarkets
    >under the AQ Khan brand.
    >Get a few grad students to put them together. Still as dangerous as
    >they were back when you needed exotic scientists.
    >
    >Or maybe that cute little suitcase size nuke, the W31 as I recall, that
    >the Chinese ended up cloning.
    >
    >I think if Teller really said that he was mistaken.
    >
    >It took Shockley et al to make the first transistor. Not any more.
    >It took a genius at IBM to make the first high-temp superconductor. Now
    >high school kids can make them.
    >
    >If I had the secret formula I could make Coke. I wouldn't need exotic
    >training or skills.
    >

    So, as we have discovered, if one country does the proof of principle,
    and only the vaguest outlines of how it's done can be discovered, a
    determined adversary can often duplicate the results, even under very
    challenging circumstances. Keeping things secret doesn't do much
    good.

    An example of what Teller was talking about (and I can't find the
    exact quote, but you can easily find quotes of him advocating drastic
    changes to the country's secrecy policies) was the inadvertent
    shipment of machines to make precision ball bearings to the Soviet
    Union at the height of the cold war. That slip allowed them to MIRV
    their warheads, a major escalation of the arms race. The Soviets
    didn't know how to make ball bearings? Apparently not.

    >snip
    >
    >> I don't think so. The PowerPC part of Cell is really crippled
    >> relative to a G5. You really have to be able to exploit the SPEs to
    >> make Cell competitive, and I don't think any compiler anywhere is
    >> going to compile C or C++ to effective Cell software because the
    >> programming model is so different.
    >>
    >> Instead of letting the PowerPC do actual work, you let it create a
    >> thread and pass it off to an SPE. Then, if an SPE pipeline stalls on
    >> the task, you don't care so much because it's only 1 of 8, whereas
    >> the PPC has only two paths, both of them in-order.
    >>
    >> The natural programming model is something like Kahn networks or
    >> Synchronous Dataflow. Lots of work done, but applications would have
    >> to be rewritten at the source code level.
    >>

    >snip
    >> But I'm not sure it isn't going to happen this time. We _are_ moving
    >> from single-processor to multi-processor execution. That train is
    >> leaving the station, with or without Cell. Now that I've seen Cell,
    >> though, I really like the possibilities.
    >>

    >
    >I would say that there are folks, perhaps the ones at Sony, who think
    >that in the long run, or maybe even the medium run, Wintel will go
    >the way of the dinosaur or maybe the vector supercomputer. :)
    >

    Cell has both the interconnect bandwidth and the execution paths to
    make a worthy successor to vector supercomputers.
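
    A rough sanity check on that claim, with every number below an
    assumption (an assumed 3.2 GHz clock, the announced eight SPEs, and
    4-wide single-precision SIMD with fused multiply-add per SPE):

        /* Back-of-the-envelope peak throughput for Cell.
           All inputs are assumptions, not measurements. */
        #include <stdio.h>

        int main(void)
        {
            double ghz   = 3.2;  /* assumed clock                       */
            double spes  = 8.0;  /* announced SPE count                 */
            double flops = 8.0;  /* 4-wide SP SIMD x fused multiply-add */

            /* 3.2 * 8 * 8 = 204.8 GFLOPS single precision */
            printf("peak SP: %.1f GFLOPS\n", ghz * spes * flops);
            return 0;
        }

    That's peak, not sustained, but on paper it's squarely in
    vector-supercomputer territory.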

    As to the actual prospects? Who wouldn't be cautious at this point?
    The age imbalance (with some exceptions :) ) in who is showing
    interest and excitement and who is huffily standoffish is striking.

    RM
     
