
Is it possible to mod a GeForce 8800 into a Quadro?

Discussion in 'Nvidia' started by g. bon, Mar 10, 2008.

  1. g. bon

    g. bon Guest

    Is there any hardware or software method to transform a GeForce 8800
    (GTS, 512 MB) into a Quadro?
    Where can we find the method to do it?

    thanks,

    GB
     
    g. bon, Mar 10, 2008
    #1

  2. DaveW

    DaveW Guest

    Wrong. The hardware and GPU are different. No can do.
     
    DaveW, Mar 10, 2008
    #2

  3. Augustus

    Augustus Guest

    Wrong again, Dave W. I guess that's why the Quadro FX 4600 and 5600 use the
    G80 90nm GPU which, wonder of wonders, is the exact same G80 GPU as the
    8800GTX and 8800 Ultra. It can be done via softmod, but it's messy and does
    not work as well as previous softmods of older cards. I would not recommend
    it. Tons of threads out there, but this one is typical:

    http://forums.guru3d.com/showthread.php?t=220942&highlight=8800+quadro
     
    Augustus, Mar 10, 2008
    #3
  4. deimos

    deimos Guest

    NVIDIA's chip designation is based on the same cores (G80, G84, etc.),
    but they seem to be adding "GL" onto them to differentiate Quadros,
    and I doubt this is merely marketing. Older cards (GF6 series and
    before) were easily modded to Quadros with RivaTuner, but newer ones
    are a total pain in the ass, and professional applications like
    SolidWorks may not detect them as a Quadro.
     
    deimos, Mar 11, 2008
    #4
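The RivaTuner-era softmods mentioned above worked by changing the PCI device ID the driver reads, after which the driver enables its Quadro code paths. A minimal sketch of that ID-based distinction, assuming a hand-picked ID table and a made-up `lspci -nn` output line (both illustrative, not exhaustive or verified against real hardware):

```python
import re

# Illustrative subset of NVIDIA PCI device IDs (vendor 10de); the
# name/ID pairing here is assumed for the sketch -- consult the pci.ids
# database for authoritative data.
DEVICE_IDS = {
    "0191": "GeForce 8800 GTX",
    "019e": "Quadro FX 4600",
}

def parse_lspci_line(line):
    """Extract the (vendor, device) hex ID pair from one `lspci -nn` line."""
    m = re.search(r"\[([0-9a-f]{4}):([0-9a-f]{4})\]", line)
    return m.groups() if m else None

# Hypothetical lspci output for a G80 board.
sample = "01:00.0 VGA compatible controller: NVIDIA Corporation G80 [10de:0191]"
vendor, device = parse_lspci_line(sample)
print(DEVICE_IDS.get(device, "unknown device"))  # GeForce 8800 GTX
```

A softmod flips exactly this ID, which is why the driver, and everything above it, then treats the card as a Quadro; the "total pain" on newer cards is that changing the ID became much harder.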
  5. Augustus

    Augustus Guest

    It's more the case that the GL extensions and functions are enabled by
    the Quadro BIOS and Quadro drivers than that the G80 architecture is
    physically different. The GPU, shaders and memory have different clock
    settings. Memory size is larger, as is the PCB. The Nvidia CUDA
    Programming Guide 1.1 has most of this stuff in it.
     
    Augustus, Mar 11, 2008
    #5
  6. * DaveW:
    Still no clue what you're talking about, wanker? FYI: the GPUs of Quadro
    and Geforce are identical.

    How about a nice cup of shut the **** up?

    Benjamin
     
    Benjamin Gawert, Mar 11, 2008
    #6
  7. * deimos:
    The GPUs are 100% identical, that's a fact. The difference lies in the
    functionality that is activated by the driver.
    The newer softmods are messy because changes in the hardware made it
    more difficult to change the GPU ID from GeForce to Quadro.

    Benjamin
     
    Benjamin Gawert, Mar 11, 2008
    #7
  8. deimos

    deimos Guest

    Do you have a source on this? I used to think the same, but I've seen
    anecdotal evidence that the cores have different transistor counts,
    and certainly there are different surface components (accounting for
    the 12-bit color output precision and other media-production-centric
    features). Many professional applications like the ones I work with
    actually have some code for Quadro-specific detection that goes beyond
    just the driver or device ID, so there must be an API method of
    detecting Quadro features other than just DXCaps or GL extensions.
     
    deimos, Mar 12, 2008
    #8
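In practice, even "Quadro-specific" detection usually still goes through the driver: for OpenGL applications, the customary check is the GL_RENDERER string the driver reports. A sketch of such a check, assuming illustrative renderer strings in NVIDIA's usual "name/bus/extras" shape (not captured from real hardware):

```python
def is_quadro(renderer: str) -> bool:
    """Heuristic Quadro check on the driver-reported GL_RENDERER string."""
    return "quadro" in renderer.lower()

# Illustrative renderer strings, as an application might receive them
# from glGetString(GL_RENDERER).
print(is_quadro("Quadro FX 4600/PCI/SSE2"))    # True
print(is_quadro("GeForce 8800 GTS/PCI/SSE2"))  # False
```

This is also why a softmod stands or falls with the driver: if the driver accepts the changed ID and reports a Quadro renderer string, applications doing this kind of check follow along.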
  9. * deimos:
    Yes, but none that is publicly available.
    That "evidence" is, as you said already, anecdotal.
    Which applications are those, and how exactly do they try to detect a
    Quadro beyond driver information or device ID?
    Nope, there isn't.

    Benjamin
     
    Benjamin Gawert, Mar 12, 2008
    #9
  10. Mr.E Solved!

    Mr.E Solved! Guest

    No, but you are right that it is the GL extension recognition that
    makes up the bulk of the performance difference betwixt the two cards.

    The chippies are the same: both CUDA GPUs. It's the drivers that are
    different, with different capabilities and ways of using the
    circuitry.

    Also, a minor but interesting point: Quadros are severely
    stress-tested before being put into the channel. The blue screens and
    nv4_disp.dll errors that GeForce users see with too much frequency
    (for any reason) you just don't get with Quadros; they are designed to
    be flawless 24/7, like a CPU.
     
    Mr.E Solved!, Mar 12, 2008
    #10
