
Autodetecting number of video outputs from NVIDIA driver.

Discussion in 'Nvidia' started by Zem, Oct 6, 2005.

  1. Zem

    Zem Guest

    Hi,

    I am writing an application running on Linux that tries to auto-detect
    some of the capabilities of the NVIDIA cards installed in my
    system. Specifically, I would like to be able to query different Quadro
    cards to get the number of video outputs (aka display connectors) found
    on each card. I have looked at the output of /proc/pci/nvidia/cards/0,
    but this doesn't give me any information other than the GPU. I was
    looking into using the device IDs provided by NVIDIA on their website
    and building a lookup table, but this is the least optimal way of doing
    it. Is there any way to query the Linux NVIDIA driver (1.0-7676) using
    NVIDIA's SDK to get this information, or are there any utilities
    provided by NVIDIA for doing this?

    Thanks!
     
    Zem, Oct 6, 2005
    #1
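
    A minimal sketch of the device-ID lookup approach described above,
    assuming a 2.6 kernel with sysfs mounted. The table entry is a
    hypothetical placeholder and would need to be filled in from NVIDIA's
    published device ID list:

        /* Enumerate NVIDIA GPUs via sysfs and map each PCI device ID to an
         * output count using a hand-built table. */
        #include <dirent.h>
        #include <stdio.h>

        struct gpu_outputs { unsigned int device_id; int outputs; };

        /* Hypothetical entry, for illustration only; populate from NVIDIA's
         * published device ID list. */
        static const struct gpu_outputs table[] = {
            { 0x009d, 2 },  /* placeholder: a dual-output Quadro */
        };

        /* Read a single hex value (e.g. "0x10de") from a sysfs file. */
        static int read_hex(const char *path, unsigned int *val)
        {
            FILE *f = fopen(path, "r");
            if (!f)
                return -1;
            int ok = (fscanf(f, "%x", val) == 1);
            fclose(f);
            return ok ? 0 : -1;
        }

        int main(void)
        {
            DIR *d = opendir("/sys/bus/pci/devices");
            if (!d) {
                perror("opendir");
                return 1;
            }
            struct dirent *e;
            while ((e = readdir(d)) != NULL) {
                if (e->d_name[0] == '.')
                    continue;
                char path[512];
                unsigned int vendor, device;
                snprintf(path, sizeof path, "/sys/bus/pci/devices/%s/vendor", e->d_name);
                if (read_hex(path, &vendor) || vendor != 0x10de)  /* NVIDIA */
                    continue;
                snprintf(path, sizeof path, "/sys/bus/pci/devices/%s/device", e->d_name);
                if (read_hex(path, &device))
                    continue;
                int outputs = -1;  /* unknown: not in the table */
                for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
                    if (table[i].device_id == device)
                        outputs = table[i].outputs;
                printf("%s: device 0x%04x, outputs %d\n", e->d_name, device, outputs);
            }
            closedir(d);
            return 0;
        }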

  2. Conor

    Conor Guest

    Shouldn't the information on the number of displays be contained in the
    xorg.conf/XF86Config file?
     
    Conor, Oct 6, 2005
    #2

  3. Zem

    Zem Guest

    I should've made my first posting clearer. The application I'm
    writing is trying to detect the number of display connectors on a card
    so that I can generate a valid xorg.conf file. This detection has to
    take place before the X server is configured and running. I was looking
    into the NV-CONTROL extension that NVIDIA's nvidia-settings utility
    uses, but this also requires that the X server be running. Anyone have
    some ideas on how to query NVIDIA cards before the X server has been
    configured and started? Thanks!
     
    Zem, Oct 7, 2005
    #3
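
    For completeness, once an X server is up, the NV-CONTROL extension that
    nvidia-settings uses can be queried directly. A minimal sketch, assuming
    libXNVCtrl is installed (header paths vary by distribution; link with
    -lXNVCtrl -lXext -lX11). Note that NV_CTRL_CONNECTED_DISPLAYS reports
    displays the driver detected as connected, not the number of physical
    connectors on the card:

        #include <stdio.h>
        #include <X11/Xlib.h>
        #include <NVCtrl/NVCtrl.h>
        #include <NVCtrl/NVCtrlLib.h>

        int main(void)
        {
            Display *dpy = XOpenDisplay(NULL);
            int event_base, error_base, mask;

            if (!dpy || !XNVCTRLQueryExtension(dpy, &event_base, &error_base)) {
                fprintf(stderr, "NV-CONTROL extension not available\n");
                return 1;
            }
            /* Bitmask of display devices the driver sees on X screen 0;
             * count the set bits to get the number of connected displays. */
            if (XNVCTRLQueryAttribute(dpy, 0, 0, NV_CTRL_CONNECTED_DISPLAYS, &mask)) {
                unsigned int bits = (unsigned int)mask;
                int count = 0;
                for (; bits; bits >>= 1)
                    count += bits & 1;
                printf("connected displays: %d\n", count);
            }
            XCloseDisplay(dpy);
            return 0;
        }
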
  4. Chuck

    Chuck Guest

    Don't know about current cards, but in the past this was next to
    impossible, because the card manufacturers did not always make the card
    firmware exactly match the hardware. You had to run tests that would
    error out if the hardware was not as expected.
    I suppose the problem might be even worse these days, with all the
    hardware variations from one vendor to another, even when the same basic
    graphics chipset is used and the card is a "reference design" card with
    parts left off.
     
    Chuck, Oct 8, 2005
    #4
  5. Zem

    Zem Guest

    Makes sense: just because a GPU can support dual output doesn't mean
    that the card manufacturer will implement the card as a dual-output
    card. I guess I'll go back to creating a lookup table that specifies
    the number of outputs each GPU can handle. It doesn't make the design
    bullet-proof, but it'll do for now.
     
    Zem, Oct 11, 2005
    #5
