After this lesson, you will be able to:
Identify the different types of display adapters.
Understand display memory and how it affects quality and performance.
Select the right card for a monitor.
Estimated lesson time: 25 minutes
Evolution of the Display Adapter
The display adapter has gone through several major evolutions as the nature of PC computing has changed from simple word processing and number crunching to the graphics-intensive world of Windows and multimedia.
The First PC Display Cards
The two "official" video cards for the early 8088-based IBM personal computers (the PC and XT) were matched to the limited capabilities of the early monitors. The Monochrome Display Adapter (MDA) offered a simple text-based monochrome display. This adapter produced text in 80-character-wide rows at a resolution of 720 x 350 pixels. Shortly after that, the Color/Graphics Adapter (CGA) card appeared. It provided up to four "colors" (actually, just different intensities of the monitor's active color: amber, green, or white). In four-color mode, CGA provided a resolution of 320 x 200 pixels. Using just two colors allowed a resolution of 640 x 200 pixels.
With the release of the Enhanced Graphics Adapter (EGA) card, the IBM PC AT became the first PC really able to use color. This adapter was an improved version of CGA, offering a top resolution of 640 x 350 with 16 colors. The EGA also ushered in the era of video conflicts: it was not fully backward-compatible with CGA and MDA, and some programs would display improperly or even lock up the system. The MDA, CGA, and EGA cards all shared the same connection, a male 9-pin D-shell connector.
The human eye can distinguish 256 shades of gray and about 17 million variations in color in a scene, the minimum required to produce true photographic realism on a screen. EGA did not even come close. Its aim was to offer the ability to incorporate color in pie charts and other forms of business graphics. Although the first graphics programs did arrive to make use of the EGA's graphics capability, serious computer graphics had to wait for better hardware.
Memory and the Arrival of the Display Coprocessor
A brief digression to explain pixel depth and video memory demands will help you understand what follows. The MDA adapter carried just 4 KB of video memory; the CGA carried 16 KB. The amount of memory on a display card determines the amount of color and resolution that it can image and send to the monitor. As the desire for better graphics and color displays increased, so did the complexity of graphics cards, and with them, memory requirements and cost.
Remember that the image on the monitor is a collection of dots called pixels. Each image placed on the screen requires code in the adapter's memory describing how to draw it using those dots and their positions in the grid. The MDA card worked with characters rather than individual pixels: for each position on the text grid, memory held a code number for the symbol at that position, and a lookup table (LUT) in the card's chip set told it how to construct each symbol in pixels. Each card's memory was just enough to map the screen at its maximum resolution. That's why the CGA card had two different modes: the more colors used, the more memory was required. When it displayed four colors instead of two, the resolution had to drop.
The MDA card was a 1-bit device; that is, each pixel was described by a single bit, valued either 0 or 1, representing whether that position on the screen was on or off. To represent colors or shades of gray, a card must use memory to describe color and intensity. This attribute of the display, measured in bits, is known as color depth. Color depth multiplied by resolution (the number of horizontal pixels multiplied by the number of vertical rows on the screen) determines the amount of memory needed on a given display adapter.
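To make that arithmetic concrete, here is a minimal sketch in Python (the function name is ours, chosen purely for illustration):

```python
def frame_buffer_bytes(width, height, bits_per_pixel):
    """Display memory needed to map one full frame, in bytes."""
    total_bits = width * height * bits_per_pixel  # resolution x color depth
    return total_bits // 8

# CGA's 16 KB of video memory covers either of its graphics modes:
print(frame_buffer_bytes(640, 200, 1))  # 16000 bytes -- two colors
print(frame_buffer_bytes(320, 200, 2))  # 16000 bytes -- four colors
```

Both of CGA's graphics modes work out to exactly the same 16,000 bytes, which is why displaying four colors instead of two forced the horizontal resolution to drop by half.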
The adapters that followed the EGA cards to market all offered more colors and, very quickly thereafter, higher resolution. That, in turn, required more processing. The MDA, CGA, and EGA cards all relied on the host computer's CPU to do that work. That was sufficient in the days before widespread use of color and graphics, but with the advent of the graphical user interface (GUI), all that changed.
The new generation of display cards started the practice of including their own display coprocessors on-board. Coprocessors, which have their own memory, are tuned to handle tasks that would usually slow down the PC, and many display cards use bus mastering to reduce the amount of traffic on the system bus and to speed display performance. Video coprocessing, also called "hardware acceleration," uses one or more techniques to speed up the drawing of the monitor image. For example, a screen element can be described with a single high-level command instead of calculations that determine the placement of every pixel on the screen, as the sketch below illustrates.
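Here is a minimal sketch of the difference between the two approaches, written in Python for readability. All the names here (framebuffer, coprocessor.submit, FILL_RECT) are hypothetical, not a real driver interface:

```python
# Unaccelerated: the host CPU computes and writes every pixel itself.
def fill_rect_cpu(framebuffer, x, y, width, height, color):
    for row in range(y, y + height):
        for col in range(x, x + width):
            framebuffer[row][col] = color  # one write per pixel

# Accelerated: the CPU issues one short command and moves on;
# the display coprocessor places the pixels using its own memory.
def fill_rect_accelerated(coprocessor, x, y, width, height, color):
    coprocessor.submit(("FILL_RECT", x, y, width, height, color))
```

The accelerated version frees the CPU after a single command, while the per-pixel loop ties it up for every one of the width x height writes.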
These new graphics chips were designed to do one thing: push pixels to the screen as efficiently as possible. At first, the cards that used them were expensive and often prone to memory conflicts with the host CPU. Their growing popularity led to rapid advances in design. In the mid-1990s, a new graphics card was introduced on the market almost every day, and a new processor almost every ten days.
Today, high-performance graphics adapters are the norm. While there is no longer a mad rush to market, the graphics coprocessor is a key element of fast Windows performance. Next, we return to our review of standards and see how the industry progressed to today's world of high-speed, full-color computing.
The Advent of Advanced Display Systems
Graphics artists, engineering designers, and users who work with photorealistic images need more than a coarse, 16-color display. To tap into this market, which was then served by costly dedicated workstations, PC vendors needed more powerful display systems. IBM offered a short-lived and very complicated engineering display adapter, the Professional Graphics Adapter (PGA). It required three ISA (Industry Standard Architecture) slots, and provided limited three-dimensional manipulation and 60 frames-per-second animation of a series of images. It was also very expensive and a dismal failure in the marketplace.
The reason was the advent of the Video Graphics Array (VGA) standard. All the preceding cards were digital devices; the VGA produced an analog signal. That required new cards, new monitors, and a 15-pin female connector. It allowed developers to produce cards that offered the user a palette of up to 262,144 colors and resolutions up to 640 x 480.
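Where does that oddly precise figure of 262,144 come from? VGA's digital-to-analog converter allots 6 bits to each of the red, green, and blue signals, for 18 bits in all:

```python
bits_per_channel = 6                # 6 bits each for red, green, and blue
print(2 ** (3 * bits_per_channel))  # 262144 -- the size of the VGA palette
```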
The VGA card quickly became commonplace for a PC display system, and the race was on to produce cards with more colors, more resolution, and additional features. VESA (Video Electronics Standards Association) agreed on a standard list of display modes that extended VGA into the high-resolution, full-color world we know today. The standard is known as SVGA (Super VGA), and it sets specifications for resolution, refresh rates, and color depth for compatible adapters. On Pentium and later PCs, an SVGA adapter is the standard display adapter. The minimum resolution needed for SVGA compatibility is 640 x 480 with 256 colors, and most modern adapters go far beyond that.
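Plugging SVGA's minimum mode into the memory formula from earlier in this lesson shows how far requirements had grown (256 colors implies 8 bits per pixel):

```python
width, height = 640, 480
bits_per_pixel = 8                           # 256 colors = 2**8
print(width * height * bits_per_pixel // 8)  # 307200 bytes, about 300 KB
```

A single frame now needs roughly 300 KB, almost twenty times CGA's 16 KB, and the higher resolutions and color depths the standard allows demand far more.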