In color mode, an 8-bit card can't display all the colors in a full-color picture, so a LUT is used to find the closest available match to a hue that can't be represented directly. Although this method isn't ideal, it was for several years the state of the art on desktop PCs. Then, early in the age of the 486 processor, came the 16-bit SVGA card, which allowed 65,536 colors. More bits require more memory, more processing power, a bigger LUT, and more money. These cards were tuned for bigger monitors, 15 to 17 inches, at 800 x 600 or 1024 x 768 resolution. The new systems were too expensive for average users, but graphics professionals and power users generated a large enough market to fuel development.
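The nearest-match search the LUT implies is straightforward. Here is a minimal sketch in C, assuming a 256-entry RGB palette and plain squared-distance matching; the function name and the grayscale LUT are hypothetical, and real drivers used vendor-specific code:

```c
#include <stdio.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b; } RGB;

/* Return the index of the LUT entry with the smallest squared RGB
 * distance to the requested color; squared distance is enough
 * because only the ordering matters, so no sqrt() is needed. */
static int nearest_lut_index(const RGB lut[256], RGB want)
{
    long best = 3L * 255 * 255 + 1;   /* larger than any possible distance */
    int best_i = 0;
    for (int i = 0; i < 256; i++) {
        long dr = (long)lut[i].r - want.r;
        long dg = (long)lut[i].g - want.g;
        long db = (long)lut[i].b - want.b;
        long d = dr * dr + dg * dg + db * db;
        if (d < best) { best = d; best_i = i; }
    }
    return best_i;
}

int main(void)
{
    RGB lut[256] = {0};
    for (int i = 0; i < 256; i++)          /* hypothetical grayscale LUT */
        lut[i].r = lut[i].g = lut[i].b = (uint8_t)i;

    RGB want = { 200, 180, 190 };          /* a hue with no exact entry */
    printf("closest LUT index: %d\n", nearest_lut_index(lut, want));
    return 0;
}
```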
Short-Lived Standards
During the early days of VGA and SVGA, three other graphics-card standards were introduced by IBM for the PS/2. Although they never gained significant market share or full support among adapter developers, they did increase the demand for higher resolution and faster performance. The following list presents the highlights of the evolution of PC graphics standards:
- 8514/A, a 256-color competitor to VGA with some hardware acceleration capability, offered 640 x 480 resolution in noninterlaced mode and 1024 x 768 resolution at 43.3 Hz in interlaced mode.
- XGA, the eXtended Graphics Array, offered a resolution of 1024 x 768 in 8-bit (256-color) mode and 640 x 480 in 16-bit (high-color) mode. It came with 1 MB of memory and limited bus mastering (the memory arithmetic behind these mode limits is sketched after this list).
- XGA/2 boosted the high-color (16-bit) mode to 1024 x 768 resolution and increased the available refresh rates.
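XGA's mode limits follow directly from frame-buffer arithmetic: a mode needs roughly width x height x bytes-per-pixel of video memory. A rough check, assuming a plain linear frame buffer with no overhead:

```c
/* Rough frame-buffer sizing (assumes a plain linear buffer).
 * Shows why XGA's 1 MB covers 1024 x 768 at 8 bits per pixel,
 * and 640 x 480 at 16 bits, but not 1024 x 768 at 16 bits. */
#include <stdio.h>

static long fb_kilobytes(long w, long h, long bits_per_pixel)
{
    return w * h * (bits_per_pixel / 8) / 1024;
}

int main(void)
{
    printf("1024x768 @  8 bpp: %ld KB\n", fb_kilobytes(1024, 768, 8));  /*  768 KB: fits in 1 MB  */
    printf(" 640x480 @ 16 bpp: %ld KB\n", fb_kilobytes(640, 480, 16));  /*  600 KB: fits in 1 MB  */
    printf("1024x768 @ 16 bpp: %ld KB\n", fb_kilobytes(1024, 768, 16)); /* 1536 KB: needs > 1 MB  */
    return 0;
}
```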
True Color Arrives
The SVGA adapters were a stepping stone; the growing popularity of Microsoft Windows and of scanners pushed the demand for cards that could deliver color of photographic quality. In the early 1990s, several manufacturers introduced add-on cards that could be attached to SVGA cards to deliver 16.7 million colors. Soon after, stand-alone products that offered both SVGA resolution and true-color operation arrived. These adapters, known as true-color or 24-bit displays, come with coprocessors and lots of memory, and in true-color mode they provide 256 shades (8 bits) for each of the three color channels: red, green, and blue. By mixing them, the system can display 2^24 (16.7 million) colors. Some monitors use traditional 15-pin cables, and some use BNC bayonet cables, with a separate cable for each RGB color and one each for vertical and horizontal synchronization. The latter are found on many high-performance systems.
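To make the color arithmetic concrete, here is a minimal sketch, with hypothetical names, of how three 8-bit channels pack into a single 24-bit pixel:

```c
#include <stdio.h>
#include <stdint.h>

/* Pack three 8-bit channels into one 24-bit pixel: 256 levels per
 * channel gives 256 * 256 * 256 = 2^24 = 16,777,216 distinct colors. */
static uint32_t pack_rgb24(uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
}

int main(void)
{
    uint32_t pixel = pack_rgb24(255, 128, 0);  /* an orange */
    printf("pixel  = 0x%06X\n", pixel);        /* 0xFF8000  */
    printf("colors = %lu\n", 1UL << 24);       /* 16777216  */
    return 0;
}
```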
True-color cards originally sold for $3,000 or so, but within two years they were under $800; now they are available for $150 or less. To add value, the better cards now include TV output ports that send a National Television Standards Committee (NTSC) signal, which can be used to record images from the monitor onto a VCR or display them on a TV set. Multimedia cards are equipped with a TV tuner, letting the owner view TV programs on the monitor or watch DVD (digital video disc) movies on a PC. One reason for the dramatic drop in prices and the added features is the mass production of the coprocessors, which reduced their cost to the manufacturer; another factor is the decreasing cost of video memory.