
Skybuck's review of GTC 2013 key note.

Discussion in 'Nvidia' started by Skybuck Flying, Mar 20, 2013.

    Hello,

    The GTC 2013 keynote wasn't as exciting as in previous years; it was
    actually quite boring, except for a few nice things and even one slightly
    exciting thing.

    Many items were discussed which had already been presented last year, like
    path tracing/ray tracing, car sales, remote visualization and so forth;
    that part of the presentation was a waste of time.
    (The mentions of Titan, the software modem and the software camera had
    also already been presented elsewhere ;))

    So I will skip over the re-hash of old things and mention the new stuff:

    1. WaveWorks, it was called: ocean simulation software. How useful this is
    remains to be seen; maybe a nice U-boat game. Other than that it is
    perhaps quite boring, but still fun to look at.

    2. FaceWorks. This was interesting: a slightly better visualization of a
    human face. The guy was kinda butt ugly lol; I would have preferred a
    beautiful girl to look at to replace Dawn... but ok... at least his face
    had nice pores/pimples/stubble and such to show off the detail...

    This was quite cool to look at and quite amazing... a thing to come for the
    future... nice to see this kind of visualization reach the next level.
    Mostly based on research done by a university.

    3. Volta. Codename for a GPU with stacked 3D memory chips. This kinda
    reminds me of my own idea, which I probably wrote about on Usenet. This was
    the most exciting part of the presentation. My gut feeling almost says it's
    too good to be true.

    Is it a smoke screen? Is it a fantasy? Does Nvidia expect to go bankrupt
    in the next few years, so they can afford to produce these kinds of
    nonsense stories lol ?! ;) :) Or is it a real product to come... only time
    can tell ;) :)

    I sure believe in it... but it's too good to be true... so the sad part is
    it will only come around 2015, 2016 or so... I also wonder if it will solve
    the GPU <-> memory latency problem... I hope it does; that would be a major
    innovation and have far-reaching consequences for the software/computer
    industry... it could be nothing short of a revolution... how much more
    performance it will give is anybody's guess... he did mention
    reading/pushing an entire Blu-ray in 1/50 of a second... so 1 terabyte/sec
    or so...
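    As a quick sanity check on that figure (a minimal sketch; the 25 GB
    single-layer Blu-ray capacity is my assumption, the keynote did not spell
    it out):

    ```python
    # Rough sanity check of the "Blu-ray in 1/50 of a second" bandwidth claim.
    # Assumption (not stated in the keynote): one single-layer Blu-ray ~ 25 GB.

    BLU_RAY_GB = 25          # assumed single-layer Blu-ray capacity, in GB
    DISCS_PER_SECOND = 50    # "an entire Blu-ray in 1/50 of a second"

    bandwidth_gb_per_s = BLU_RAY_GB * DISCS_PER_SECOND
    print(f"{bandwidth_gb_per_s} GB/s = {bandwidth_gb_per_s / 1000} TB/s")
    # -> 1250 GB/s = 1.25 TB/s
    ```

    So a 25 GB disc, 50 times per second, indeed lands in the ballpark of the
    "1 terabyte/sec or so" figure.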

    Volta is the thing I am most looking forward to.

    4. There was another mention of an ARM+GPU combination called Kayla... a
    very nice, efficient, low power-per-watt thing that can still run like
    today's super GPUs or something... sounds cool. Kayla is the girlfriend
    chip of Logan... another chip like it... Logan confuses me a little bit...
    it's probably Tegra 4 or 5 or something.

    5. There are some architectures in between, like Maxwell... I don't care
    about that architecture... I would rather just skip that one... but ok...
    Maxwell is apparently important for unified memory... I wonder if that will
    slow things down or not... maybe it's an essential step, so maybe it
    cannot be skipped.

    There were some other mentions like SolidWorks and the Nvidia Grid...
    etc/blablablablabla... but for a programmer that's not really that
    exciting... it's more for commercial/business people... looking to buy new
    computers/gadgets/computing solutions...

    The Grid thing did look interesting... 16 workstations in a box... but I
    wonder if it will melt lol ;) :) What about data backup? This device seems
    ill-conceived and a major risk to companies... it will probably disappear
    from the market real fast ;) :)

    I always hope for Delphi/more languages to be supported by the CUDA
    compiler; unfortunately there was no word on that. Delphi/Pascal is a
    perfect language for learning how to program, so it would go well
    hand-in-hand with newbies learning programming and parallel programming at
    the same time. Delphi/Pascal could also be used for prototyping
    software... I can't help but feel that leaving out Pascal support is a big
    mistake ;) :) Even Adobe Photoshop started in Pascal, as the source code
    release revealed! ;) :)

    Anyway, enough about my plea for Delphi/Pascal support... who knows...
    maybe Embarcadero is secretly working on it... or will start secretly
    working on it ;)

    I can't help but feel that this GTC conference is about "stealing credit"
    for other people's work. Nvidia's CEO is presenting all kinds of great
    work by others and trying to rub that off on Nvidia...

    Yes, to some extent Nvidia makes it all possible thanks to their hardware,
    but no, it's not Nvidia's software; it's those companies doing the hard
    software work that prevents these chips/devices from being expensive
    doorstops ;) :)

    I personally would much rather have GTC be about how Nvidia helps those
    software companies and programmers around the world achieve more
    performance, more productivity, more new programming features, and solve
    real bottlenecks.

    If Nvidia cannot solve those bottlenecks and cannot truly deliver more
    performance, either in total or per watt, then maybe it's better not to
    organize a GTC until there is something substantial to share... this year
    it was mostly just boring and a waste of time...

    What's cooking for next year? More car sales software? More face software?
    More ray tracing software? More Hollywood software? More CAD/CAM software?
    More genome software? I for one have already seen it in past year(s) and
    this year.

    So I hope next year not again... unless there are substantial improvements
    to be reported, or something truly new or updated...

    Well, at least the CEO was fair at the start... hinting at "updates",
    which kinda gave the viewer a hint that there weren't that many new/great
    things to report... just a status update.

    Well, fair enough... not every year can be a big breakthrough or a big
    success, but here's hoping for the future ;) :)

    A funny last point of critique though... he did lie a bit about the Titan
    supercomputer... it's not in operation; the gold connectors failed.

    Perhaps that's not Nvidia's fault but Cray's fault... but I am pretty sure
    the guys in the audience know about that... no need to lie about it ;) :)

    Perhaps a joke about it would have been much better ;) these kinds of
    things can happen... so nothing to be ashamed about, methinks ;)

    Perhaps more attention to Nsight and its new features might have been
    nice... though some screenshots had already been seen... I think I would
    have liked that a bit more... then again, it could also be boring.

    My advice for Nvidia is quite simple: invest more money in compiler
    development for different programming languages; even a few thousand
    dollars can go a long way as bounties.

    With their big budgets it's amazing that the only languages they seem to
    support are CUDA C and some C++.

    This leaves out a whole range of programmers, which sucks pretty badly.

    I think programmers all around the world would be much more excited about
    more language support.

    I do understand Nvidia wanting to invest all their money into hardware and
    stuff like that... but that would be somewhat foolish... why would anybody
    write a compiler for their chips? Writing a compiler is hard work... very
    few people/companies are going to do that for free...

    Some might do it if there is money to be made, but very few people can
    actually do it successfully; nobody knows if it will sell, so the
    incentive to invest in it is near zero. It's unknown, it's a gamble, which
    is all the more reason for Nvidia to make an "exploration investment", as
    I like to think of those kinds of investments. The return on investment is
    zero, maybe even negative, in a spreadsheet; nobody knows what it will
    return. These are the hardest and most daring investments to make. Very
    few people on planet Earth dare to make such investments. Those who do
    dare could end up insanely rich, as Bill Gates and Steve Ballmer have
    proven. Did they know they would have such a big return on investment...
    not really... they could only dream of it ;) :)

    Bye,
    Skybuck.
     
