Gigabyte GV-N96TSL-1G

Discussion in 'Gigabyte' started by Bill, Dec 17, 2009.

  1. Bill

    Bill Guest

    I was considering the fanless Gigabyte GV-N96TSL-1G graphics card on a
    Gigabyte GA-P55A-UD4P motherboard. This board supports the newer USB 3.0
    and SATA 6Gb/s features; however, if either of these features is used,
    the first PCI Express slot drops from x16 to x8. My question is whether
    this reduction (to x8) would be expected to affect the graphics
    performance of the system. I was thinking that since it is a relatively
    "slow" card, it might not--but I really have no idea. Thank you for
    sharing whatever thoughts you may have concerning this.

    Bill
     
    Bill, Dec 17, 2009
    #1

  2. Bill

    Bill Guest

    I should have also mentioned that I was planning to install an Intel
    Core i7-860 CPU in the system (with no overclocking).
     
    Bill, Dec 17, 2009
    #2

  3. Paul

    Paul Guest

    If the slot runs at x8 PCI Express Rev2, that is 8*500MB/sec or 4GB/sec.
    That is roughly twice what you'd get with AGP 8x.

    The older generation PCI Express Rev1.1 x16 slot would have given
    you 16*250MB/sec or 4GB/sec as well. So x8 operation in Rev2 mode
    is still pretty good, and comparable to x16 in Rev1.1 mode.
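
    As a quick sanity check, here is that arithmetic in Python (the
    per-lane rates are the nominal payload figures, and the AGP 8x
    number is its nominal 2.1GB/sec peak):

        # Nominal one-direction link bandwidths (not measurements).
        def link_gb_s(lanes, mb_per_lane):
            """Aggregate bandwidth in GB/sec for a PCI Express link."""
            return lanes * mb_per_lane / 1000.0

        print(link_gb_s(8, 500))     # PCIe Rev2 x8    -> 4.0 GB/sec
        print(link_gb_s(16, 250))    # PCIe Rev1.1 x16 -> 4.0 GB/sec
        print(link_gb_s(16, 500))    # PCIe Rev2 x16   -> 8.0 GB/sec

        agp_8x = 2.1                 # GB/sec, nominal AGP 8x peak
        print(link_gb_s(8, 500) / agp_8x)  # ~1.9x AGP 8x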

    If it were to have an impact, which I doubt, it would be at the 5%
    to 10% level while gaming.

    Tomshardware did some tests years ago, where they used cellophane
    tape to mask off varying numbers of PCI Express lanes. You can use
    those results to understand the shape of the performance curve. The
    effects are worst for one particular kind of benchmark, and not
    nearly as pronounced in real games.

    (SPECviewperf suffers when PCI Express is slowed down)
    http://www.tomshardware.com/reviews/sli-coming,927-9.html

    Such a set of test cases would need to be repeated for the
    more powerful processor and GPU combinations available today.
    I can't guess at what the performance curve would be. The impact
    should be pretty small, but only a real benchmark series, like
    the one in the Tomshardware article, could say for sure.
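
    To get a feel for why the impact stays small, here is a toy
    frame-time model (all numbers invented purely for illustration):
    if the GPU spends most of each frame rendering, and only a modest
    amount of data crosses the bus per frame, halving the bus
    bandwidth adds very little to the frame time.

        # Toy frame-time model (numbers invented for illustration).
        # A GB/sec is the same as an MB/ms, so transfer time in ms is
        # MB-per-frame divided by bus bandwidth in GB/sec.
        def fps(render_ms, mb_per_frame, bus_gb_s):
            transfer_ms = mb_per_frame / bus_gb_s
            return 1000.0 / (render_ms + transfer_ms)

        # Say the GPU needs 16ms per frame, and the game streams 8MB
        # of textures/geometry across the bus each frame:
        print(fps(16, 8, 8.0))  # x16 Rev2 (8GB/sec): ~58.8 FPS
        print(fps(16, 8, 4.0))  # x8 Rev2 (4GB/sec): ~55.6 FPS, ~5% lower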

    *******

    On the architecture front, the reason your question is intriguing
    is this:

    Why would the "Northbridge" PCI Express interfaces have any
    relationship to what is done on the "Southbridge"?

    I downloaded the manual for your motherboard, and I do see the section
    in question. I don't doubt there is an issue there.

    ftp://download.gigabyte.ru/manual/mb_manual_ga-p55a-ud4(p)_e.pdf

    I also have a copy of the P55 ("Southbridge") spec 322169.pdf, and
    what is interesting in there is that the chip seems to have
    integrated clock generation. That might not be the only way to do
    it; it may be possible to use an external clock generator. My
    guess is that they're using the integrated clock generation,
    because that saves money.

    The P55 spec is 892 pages long, and I'm not going to read the whole
    thing. Even if I were paid to do it, there wouldn't be enough hours
    in a day to read the whole thing, look for every "*" or "Note" in
    the document, and figure out what evil they're up to. I was not
    able to find a reference to a register controlling clock generation,
    due to the limits of the Adobe Acrobat version 9 PDF reader
    (piece of crap). I wish Intel would use an older version of
    PDF compatibility, so I could use an older version of Acrobat.

    The P55 has two PCI Express Rev2 compliant clock outputs (150 ps
    jitter spec). I can see one output going to PCI Express slot 1.
    The second clock output would go to the PCI Express switch chip,
    which routes the remaining x8 of bandwidth to either the first
    or second video card slot. Maybe the switch chip provides more
    outputs? We don't even know which chip is used.

    Great, we have PCI Express Rev2 video slots, and PCI Express Rev1
    Southbridge PCI Express interfaces.

    Now, when Gigabyte wants to run the add-on peripheral chips at
    PCI Express Rev2 compliant speeds, it needs the low-jitter clocks
    for that. Where the hell are those clocks coming from? Perhaps
    it is the lack of good quality clock signals that causes this
    limitation, and the interaction between Northbridge (video) and
    Southbridge (peripheral) PCI Express interfaces. I very much
    doubt that the PCI Express switch chip is being used to
    supply both video and peripherals at the same time - the Gigabyte
    architecture diagram in the manual seems to rule that out.
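
    For reference, the bandwidth arithmetic shows why those add-on
    chips want Rev2 lane speeds in the first place (nominal figures
    only):

        # Nominal payload rates in MB/sec. With 8b/10b encoding, a
        # 5Gb/sec or 6Gb/sec serial link carries a tenth of that in bytes.
        pcie1_lane = 250     # PCI Express Rev1.x, one lane
        pcie2_lane = 500     # PCI Express Rev2, one lane

        usb3 = 5000 / 10     # USB 3.0:     ~500 MB/sec
        sata6 = 6000 / 10    # SATA 6Gb/s:  ~600 MB/sec

        print(usb3 <= pcie1_lane)   # False: a Rev1 x1 lane bottlenecks USB 3.0
        print(usb3 <= pcie2_lane)   # True:  a Rev2 x1 lane keeps up
        print(sata6 <= pcie2_lane)  # False, though real traffic rarely peaks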

    Very peculiar... and sucky.

    I wonder how much more it would have cost to use an external
    clockgen, or whether that is even possible?

    I don't know why this interaction exists, but it could be
    because of Intel's half-baked built-in clock generator.
    Anyone who has worked out clock distribution architectures
    on a PCB knows that additional clock outputs are golden,
    and allow amazing things to be done. Cheap out on them,
    and some poor PCB designer will be sweating gumdrops,
    trying to make the design work. At the moment, I don't
    even know how Gigabyte managed to do what they've done.

    There are devices that allow buffering and fanning out additional
    clock signals. But once you use such a device, you degrade the
    clock quality. That is why it isn't a trivial problem to solve.
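
    As a rough illustration (the numbers here are assumed, not taken
    from any datasheet): uncorrelated random jitter contributions
    combine roughly as a root-sum-of-squares, so even a modest fanout
    buffer can eat the margin against a 150 ps spec.

        # Rough jitter budget sketch (assumed numbers, not from a datasheet).
        import math

        def total_jitter_ps(*contributions_ps):
            # Uncorrelated random jitter combines as root-sum-of-squares.
            return math.sqrt(sum(j ** 2 for j in contributions_ps))

        source = 120.0  # ps, hypothetical clock generator output jitter
        fanout = 100.0  # ps, hypothetical additive jitter of a buffer

        print(total_jitter_ps(source))          # 120 ps: inside a 150 ps spec
        print(total_jitter_ps(source, fanout))  # ~156 ps: now outside it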

    *******

    I think in the Tomshardware article, you can see it takes
    a pretty serious degradation of the video slot bandwidth
    before it ruins your video performance. In your case,
    I wouldn't lose any sleep over it. However, if I bought
    a $600 video card and this shaved even a few percentage
    points off it, I'd be pissed - because I want to get my
    $600 worth of performance.

    Paul
     
    Paul, Dec 17, 2009
    #3
  4. Bill

    Bill Guest


    Thank you for your very detailed reply, and the link to the
    Tomshardware article. As you put it, I'm not going to lose any
    sleep over the x8 issue. From what I have read, the reason the
    board works the way it does is because of the P55 chip.
    Evidently there isn't any way to overcome the shortcoming I
    mentioned (dropping to x8) because of that chip--if one wants
    more, one needs to spend a little more and go to the X58
    chipset... I don't do any serious gaming. An occasional adventure
    game, Google Sketchup. It appears I should be okay with the
    components I mentioned. I was sort of waiting to see how the Intel
    X25-M SSD/TRIM issues played out, and I haven't heard much
    lately--which I guess is a good thing.

    Peace,
    Bill
     
    Bill, Dec 17, 2009
    #4
