
HDMI quality on video cards?

Discussion in 'Nvidia' started by Rats, Apr 10, 2007.

  1. Rats

    Rats Guest

    I am looking at building a dedicated media centre to play my XviDs.
    I'll either be using a projector or a 42"+ LCD/plasma screen for the
    display.
    Can someone suggest the best method for feeding the signal to the
    projector or screen? I am thinking along the lines of HDMI as this is
    the newest technology. Can anyone vouch for the quality? Would the
    image appear as crisp as it would, say, on an LCD monitor?

    Rats, Apr 10, 2007

  2. Paul

    Paul Guest

    Perhaps a forum thread like this can help:

    "Clarifications on HDMI cables"

    The answer could be quite complex, depending on how much gear you own.
    The math statement at the bottom of this thread looks to be a
    reasonable one, from a theoretical point of view:

    "HDMI > Component > S-video > etc."

    "What's the Best Way to Connect All these Home Theater Components?"

    Paul, Apr 10, 2007

  3. Rats

    Rats Guest

    Oops. I think I may not have phrased my question properly. I simply
    want to know the best way of outputting a signal from a video card (in
    this case I am inclined towards NVIDIA, hence my post in this NG) to my
    projector/plasma screen.

    In the past I've used the old S-Video out and the quality's been poor
    at best. I've seen slightly better results with SCART out connections
    on TV cards, and still better quality with a direct VGA plug into the
    screen.
    I have not yet witnessed the output from DVI or HDMI and was wondering
    how it compared to its predecessors.
    Rats, Apr 10, 2007
  4. Paul

    Paul Guest

    HDMI and DVI are digital. If there is a problem transmitting the signal,
    the result is "snow" (sparkles) on the screen. Otherwise the transmission
    method is lossless: the byte value calculated by the GPU is exactly the
    same when it is clocked into the digital circuitry in your projector. So
    the quality is not affected by the transmission method (up to the point
    where the signal is no longer strong enough to be clocked by the receiver
    chip). That is unlike the analog methods, which all lose something along
    the way from video card to receiver.
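    A toy sketch of why the digital link is bit-exact (this models the
    principle only, not any real TMDS hardware; the function names and the
    0.5 decision threshold are made up for illustration): noise picked up
    on the wire is thrown away when the receiver re-thresholds each bit,
    while an analog receiver has no way to know what the original value was.

    ```python
    import random

    random.seed(42)

    def transmit_analog(levels, noise=0.05):
        # Each analog sample picks up additive noise that is never
        # removed downstream -- the degradation is permanent.
        return [v + random.uniform(-noise, noise) for v in levels]

    def transmit_digital(bits, noise=0.05):
        # Digital bits pick up the same noise on the wire...
        noisy = [b + random.uniform(-noise, noise) for b in bits]
        # ...but the receiver re-thresholds them, recovering the exact
        # original values as long as noise stays inside the decision margin.
        return [1 if v > 0.5 else 0 for v in noisy]

    pixel_bits = [1, 0, 1, 1, 0, 0, 1, 0]            # one byte of pixel data
    assert transmit_digital(pixel_bits) == pixel_bits  # digital: bit-exact
    received = transmit_analog([float(b) for b in pixel_bits])
    assert received != [float(b) for b in pixel_bits]  # analog: values drift
    ```

    Once the noise exceeds the decision margin, digital fails abruptly
    (the "snow" Paul describes) rather than degrading gracefully.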

    When you use a VGA connector, especially at high resolution, the sharpness
    of the image can be compromised. A video card with too much EMI filtering
    just before the connector might only give a sharp picture at one of the
    lower resolutions. A really long cable might soften the image as well.

    S-Video has a bandwidth of about 4 MHz, which means text won't be readable,
    even at 640x480.
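    Rough back-of-envelope arithmetic behind that claim, assuming an
    NTSC-style scan line with about 52.6 microseconds of active picture
    time (the exact figures vary by standard, so treat this as an
    approximation): 4 MHz of luma bandwidth simply cannot carry 640
    distinct pixels per line.

    ```python
    # How much horizontal detail can ~4 MHz of luma bandwidth carry
    # on one NTSC-style scan line? (Numbers are approximate.)
    bandwidth_hz = 4e6        # S-Video luma bandwidth, roughly
    active_line_s = 52.6e-6   # visible portion of one NTSC scan line

    # One full cycle of the highest transmissible frequency encodes one
    # light/dark pair, i.e. two "pixels" of alternating detail.
    cycles_per_line = bandwidth_hz * active_line_s
    resolvable_pixels = 2 * cycles_per_line
    print(round(resolvable_pixels))   # ~421 -- well short of 640
    ```

    So fine text rendered at 640 pixels across gets smeared together,
    which matches the unreadable results Rats saw over S-Video.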

    So at least in my own case, S-Video would be my last choice. If VGA or
    HDMI/DVI are options with the projector, I'd try those before using
    S-Video. But with all the devices you have at your disposal, and the
    limited number of inputs on the projector, you'll have to pick and
    choose which device gets which port.

    Paul, Apr 10, 2007
  5. Rats

    Rats Guest

    Brilliant. Thanks Paul.
    Rats, Apr 10, 2007
