
16bit vs 32bit.... 
Logic may tell you that 65,536 (16bit) colors are more than enough to keep your eyes satisfied, but in some cases, it's not. The problem is that there is a set palette of colors that can be used. When you use 16bit color depth and fill a long space with a gradient between 2 low-contrast colors, you'll see the image is segmented, like so:

[Image: 16bitA.gif - the 16bit gradient, showing banding]
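If you want to see the effect in numbers, here's a small Python sketch of my own (the colors and pixel counts are made up, not taken from the image above). It builds a smooth 24bit gradient, chops each channel down to the 5/6/5 bits that a 16bit mode stores (simple truncation, for clarity), and counts how many distinct shades survive:

# Minimal sketch (not from the original article): quantize a smooth
# 24-bit gradient down to 16-bit RGB565 and count the distinct shades
# that survive. The collapse is what produces the visible bands.

def to_rgb565(r, g, b):
    """Truncate 8-bit channels to 5/6/5 bits, as a 16bit mode does."""
    return (r >> 3, g >> 2, b >> 3)

def gradient(start, end, steps):
    """Linear 24bit gradient between two RGB triples."""
    return [
        tuple(round(s + (e - s) * i / (steps - 1)) for s, e in zip(start, end))
        for i in range(steps)
    ]

# Two low-contrast colors spread over a long run of pixels.
pixels_24bit = gradient((40, 40, 120), (70, 70, 160), 256)
pixels_16bit = [to_rgb565(*p) for p in pixels_24bit]

print("distinct 24bit shades:", len(set(pixels_24bit)))   # many small steps
print("distinct 16bit shades:", len(set(pixels_16bit)))   # only a handful -> bands

Every in-between shade that the 16bit palette can't store gets snapped to the same nearby value, and those snapped runs are exactly the segments you see above.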

This negative effect is also called "Banding". It occurs because the shades that "connect" those 2 colors are not included in the 64K palette. That transition sometimes even creates seemingly unrelated colors, or at least that's what your eyes will tell you. If you look at the image above, you'll see that the segments don't appear to step evenly. To your eye, the colors above aren't in a smooth gradient, and on top of that, certain segments may not look like they're where they should be. This, however, can be solved by "Color Error Diffusion", or "Dithering". Take a look at the image below:

[Image: 16bitB.gif - the same 16bit gradient with dithering applied]

That image has exactly the same number of colors as the previous image - 21. However, the diffused image looks much more like the 24bit image, which has 145 colors in it. This is because the colors were purposely "bled" into each other: pixels from each segment were scattered into the neighboring segments, sort of like mixing coffee grains with sugar grains - even though there are only 2 colors, you'll see different shades of brown depending on how much coffee and sugar you used. To stretch that metaphor, though, if you look closely at the grains you'll see it isn't really a single color - and the same goes for resolution. If the resolution is high, the color bleeding won't show as much. If the resolution is low, it becomes more apparent.

The problem with this is that when you diffuse a texture you plan to use in a game, it will look very odd when the game's engine renders it from too near or too far. Thus, the diffusion has to be done in the rendering engine itself, not in the textures. The good news is that most rendering engines do use color dithering, which means that if the resolution is high enough, there shouldn't be a problem with using 16bit in games. The bad news is that in real life, unless the rendering engine (in this case the graphics card) initially renders the image in 24bit, the color dithering won't be very effective on large areas. If a 3D scene has too much of the same shade, or if the scene's background uses a large gradient 24bit texture, then 24bit will make quite a lot of difference.
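The article doesn't say which diffusion method was used for the image above, so here is a simplified 1-D sketch of error diffusion of my own (Floyd-Steinberg is the usual 2-D variant), applied to a single color channel. The idea is that the rounding error made at each pixel is carried over to the next one, so neighboring pixels end up alternating between the two nearest 16bit shades instead of all snapping to the same one:

# Rough 1-D error-diffusion sketch on one channel (values are made up).

def quantize_5bit(value):
    """Reduce an 8-bit channel to 5 bits and expand back: 0, 8, 16, ... 248."""
    return (value >> 3) << 3

def diffuse_1d(channel):
    out = []
    error = 0.0
    for value in channel:
        target = value + error          # add the error carried over so far
        quantized = quantize_5bit(int(round(max(0, min(255, target)))))
        error = target - quantized      # pass the leftover on to the next pixel
        out.append(quantized)
    return out

ramp = [round(40 + 30 * i / 255) for i in range(256)]   # one channel of a gradient
plain = [quantize_5bit(v) for v in ramp]                # hard banding
dithered = diffuse_1d(ramp)                             # bands broken up by mixing

print("plain :", plain[100:120])     # long runs of one value
print("dither:", dithered[100:120])  # the two nearest values interleaved

Averaged over an area, the interleaved pixels look like the in-between shade, which is exactly the coffee-and-sugar trick described above.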

Here is the 24bit version:

[Image: 32bti.jpg - the 24bit gradient]

It contains 145 colors, and looks as it should - a perfect gradient.

So what's "32bit"? 32bit gives you exactly the same colors as 24bit. What you get on top is an "Alpha" channel. That channel gives each pixel a certain level of translucency, so instead of a pixel completely covering the pixel behind it, the two are mixed into a shade somewhere in between. Sort of like looking through stained glass.
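As a rough illustration (my own sketch, with made-up colors), the mix for each channel is just a weighted average between the front pixel and the one behind it, with the alpha value as the weight:

# Small sketch of alpha blending, not taken from the article.

def blend(src, dst, alpha):
    """Standard 'over' blend of two RGB pixels; alpha is 0.0 (clear) to 1.0 (solid)."""
    return tuple(round(alpha * s + (1 - alpha) * d) for s, d in zip(src, dst))

red_glass = (200, 30, 30)     # the "stained glass" pixel in front
wall      = (90, 90, 90)      # the pixel behind it

for a in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"alpha={a:4.2f} ->", blend(red_glass, wall, a))

At alpha 0.0 you see only the wall, at 1.0 only the glass, and everything in between is a tinted mix - which is what the extra 8 bits in "32bit" are for.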

Bottom line, having to use 16bit isn't so bad if the game was designed for it. If there is enough color diversity and the engine is built correctly, the loss won't be noticeable. However, 32bit will always look better, no matter how small the difference - the question is, is it worth the performance drop?...
 
