
if 32-bit color is, in fact, 24-bit then why does w2k have an option for 24-bit color?

Discussion in 'Nvidia' started by Doug, Jul 26, 2005.

  1. Doug

    Doug Guest

    My Mom's system has an old Matrox Mystique 4MB video card. At 1280x1024 she
    can't get 32-bit color, only 24-bit color (the color quality selection in
    Display Properties). But if 32-bit color is actually 24-bit color, then there
    should be no difference?
     
    Doug, Jul 26, 2005
    #1

  2. Doug

    stevem Guest

    There will be no difference in the colour rendition, but 24-bit is slower
    than 32-bit. This is because a memory 'fetch' is done in word-sized (i.e. 32-bit)
    chunks. When you've fetched the first 32-bit word, if you're running in
    32-bit mode, then that's that; the required 24-bit value is there,
    right-aligned in the word (i.e. the first 8 bits are zero). Subsequent
    word-fetches bring in the next values, etc. In the case of 24-bit colour,
    the fetch is still 32-bit, but the 24-bit value you want is left-aligned in
    the word, so the logic has to shift the value right, dropping off the bottom
    eight bits. Instant problem; those eight bits you've just 'dumped' are in
    fact the first 8 bits of the NEXT 24-bit value, so you've got to store them
    somewhere! Then, you bring in the next 32-bit word; the first 16 bits are
    the right-most 16 bits of the next pixel (plus the 8 bits you just stored
    earlier), and the next 16 bits are the first 16 bits of the third pixel, and
    so on. That's why 32-bit is far preferable to 24-bit.
    Regards,
    Steve.
     
    stevem, Jul 26, 2005
    #2
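
A rough sketch of the unpacking stevem describes above, assuming a framebuffer
viewed as raw bytes with each pixel stored low byte first (the layout and
function names are illustrative, not taken from any particular driver):

    #include <stdint.h>
    #include <stddef.h>

    /* 32-bit mode: every pixel sits alone in an aligned 32-bit word, so one
       fetch yields the whole 24-bit RGB value (the top byte is just padding). */
    static uint32_t read_pixel_32bpp(const uint32_t *fb, size_t i)
    {
        return fb[i] & 0x00FFFFFFu;        /* one aligned fetch, no shuffling */
    }

    /* 24-bit mode: pixels are packed three bytes apiece, so a pixel can
       straddle two 32-bit words and its bytes must be reassembled by hand. */
    static uint32_t read_pixel_24bpp(const uint8_t *fb, size_t i)
    {
        const uint8_t *p = fb + i * 3;     /* byte offset, not word offset */
        return (uint32_t)p[0]              /* blue  */
             | (uint32_t)p[1] << 8         /* green */
             | (uint32_t)p[2] << 16;       /* red   */
    }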

  3. Doug

    First of One Guest

    Often it's just a matter of semantics with the drivers. With my Radeon
    9800Pro driver, the 24-bit option does not exist, only 16- and 32-bit.
     
    First of One, Jul 27, 2005
    #3
  4. Doug

    Doug Guest

    If it's just a matter of semantics why does the documentation for the Matrox
    Mystique state certain high resolutions are only available in 24-bit color
    while all lower resolutions are available at 32-bit color? If there's no
    difference whatsoever then why would they bother stating this? I've got
    another PCI card that has the same limitations in its technical
    documentation (i.e. that it can't get 32-bit color in high resolutions).
     
    Doug, Jul 27, 2005
    #4
  5. Doug

    Kevin Steele Guest

    Doug thought about it a bit, then said...
    There is a difference, but it's not normally used: the extra 8 bits in
    32-bit color are used for transparency effects (or gamma, I can't recall
    exactly).

    In other words, 32-bit color is basically 24-bit color with the extra 8
    bits used for special effects. At least in that regard it is different
    from 24-bit color.
     
    Kevin Steele, Jul 27, 2005
    #5
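
One way to picture Kevin's point, assuming the common layout that puts the
spare byte in the top 8 bits of the word (illustrative only, not a specific
card's format):

    #include <stdint.h>

    /* Pack an RGB triple into a 32-bit word. The top byte is the "extra"
       8 bits: it can carry alpha (opacity) for effects, or simply stay 0. */
    static uint32_t pack_argb(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint32_t)a << 24 | (uint32_t)r << 16
             | (uint32_t)g << 8  | (uint32_t)b;
    }

    /* pack_argb(0, r, g, b) and pack_argb(255, r, g, b) put the same RGB
       value on screen -- the colour channels are identical either way. */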
  6. 24-bit color does use less RAM than 32-bit; there may not be enough
    video RAM for the card to do 1280x1024 in 32-bit color.
     
    Robert Hancock, Jul 28, 2005
    #6
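
Robert's point in rough numbers, for a 4 MB card like the Mystique (a quick
back-of-the-envelope check that ignores whatever memory the card reserves for
other purposes):

    #include <stdio.h>

    /* Frame buffer size at 1280x1024 for 32-bit vs. 24-bit colour. */
    int main(void)
    {
        const double mb = 1024.0 * 1024.0;
        printf("32-bit: %.2f MB\n", 1280 * 1024 * 4 / mb);  /* 5.00 MB: too big for 4 MB */
        printf("24-bit: %.2f MB\n", 1280 * 1024 * 3 / mb);  /* 3.75 MB: fits             */
        return 0;
    }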
  7. Doug

    Doug Guest

    That's what I assumed, but I've been reading on this very newsgroup that
    32-bit color is 24-bit color and there's no difference between them. But
    there MUST be one, otherwise why would these video card manufacturers bother
    stating that certain resolutions can't be had in 32-bit color?
     
    Doug, Jul 28, 2005
    #7
  8. Doug

    stevem Guest

     
    stevem, Jul 28, 2005
    #8
  9. The difference is that 32-bit trades off more video RAM usage for more
    performance. The display quality is identical.
     
    Robert Hancock, Jul 30, 2005
    #9
  10. Doug

    Doug Guest

    I remember color palettes for Photoshop allowed for a larger selection
    w/32-bit color. I'm beginning to NOT believe everyone who says 24-bit color
    is the same as 32-bit color. If that were the case then what would be the
    POINT of having a 24-bit color mode for various video cards? Unless someone
    here has the technical background/experience or proof to back up their
    statement that 24-bit color is the same as 32-bit color I'm just going to
    assume it's unsubstantiated bullshit.
     
    Doug, Jul 31, 2005
    #10
  11. Doug

    Arthur Hagen Guest

    The point of having 24-bit colour is when you have a CPU or GPU that
    doesn't pay big penalties for accessing memory at a byte boundary, or
    where swab and shift operations are fast and video memory is cache
    mapped. In that case, you use 25% less video memory. Remember that not all
    video cards were originally made for the latest Pentium-class PCs.

    Also, some video solutions didn't store pixels, but bitplanes, in which
    case it doesn't make sense to add more bitplanes than you need. This
    has the advantage of super-fast blits (moving objects), but the
    disadvantage of not being able to write in just one place to set a
    single pixel.

    32-bit is /almost/ always padded 24-bit, but the extra bits /can/ be
    used for other purposes, like an alpha channel (which is a "dimmer" --
    instead of dividing all pixel values by four to get part of an image at
    25% intensity, you can set the "alpha" byte to 63 (25% of 255) to
    achieve the same), or special modes like the "Gigacolor" of the Matrox
    Parhelia, which uses 10 bits per channel instead of 8.

    However, in most cases, 32-bit is really just padded 24-bit due to it
    being faster for a modern CPU/GPU to read a 32-bit longword than three
    bytes.
     
    Arthur Hagen, Jul 31, 2005
    #11
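
Arthur's "dimmer" example, sketched out (purely illustrative -- in practice the
card or compositing software does the blending, not code like this):

    #include <stdint.h>

    /* Way 1: bake 25% intensity into the colour itself by dividing each
       channel by four. The original colour information is lost. */
    static uint32_t dim_in_place(uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint32_t)(r / 4) << 16 | (uint32_t)(g / 4) << 8 | (uint32_t)(b / 4);
    }

    /* Way 2: leave the colour untouched and put 63 (25% of 255) in the
       alpha byte; whatever composites the image applies the scaling later. */
    static uint32_t dim_via_alpha(uint8_t r, uint8_t g, uint8_t b)
    {
        return 63u << 24 | (uint32_t)r << 16 | (uint32_t)g << 8 | (uint32_t)b;
    }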
  12. Doug

    at Guest

    32 bit colour is made up of 8 bits Red, 8 bits Green, 8 bits Blue, and 8
    bits Alpha (transparency). For 2D, generally speaking you don't need the
    alpha bits so you can get away with using only 24 bits.

    Image editing programmes may allow for more than 8 bits per colour
    channel (8 bits gives 256 shades per channel), but at the end of the day
    they will only be displayed on the monitor at 8 bits per channel.

    Scanners also scan at higher precision per colour channel (e.g.
    my scanner is capable of something like 48-bit, or 16 bits per colour
    channel) but again, only up to 8 bits per channel are displayed on the
    monitor.
     
    at, Jul 31, 2005
    #12
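
The scanner point above, as a sketch: a 48-bit scan carries 16 bits per
channel, but before a pixel reaches an 8-bit-per-channel display each channel
gets cut down to its top 8 bits (simplified; real software may round or
dither rather than truncate):

    #include <stdint.h>

    /* Reduce a 16-bit channel sample (0..65535) to the 8 bits (0..255)
       the display can actually show by keeping the most significant byte. */
    static uint8_t to_display_depth(uint16_t channel)
    {
        return (uint8_t)(channel >> 8);
    }

    /* Three 16-bit channels (a 48-bit scanner pixel) therefore collapse to
       the familiar 24 bits of displayable colour. */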
  13. Doug

    J. Clarke Guest

    What you see on the screen is usually not different. Note _usually_. Not
    the same as _always_. The amount of memory used to store the frame,
    however, _is_ different. The limitation is usually lack of RAM.
     
    J. Clarke, Aug 16, 2005
    #13
