EGA to VGA: The initiation of bit-mapped graphics and the chip clone wars
When IBM introduced the Intel 8088-based Personal Computer (PC) in 1981, it was equipped with an add-in board (AIB) called the Color Graphics Adapter (CGA) (Fig. 1). The CGA AIB had 16 Kbytes of video memory and could drive either an NTSC-TV monitor or a dedicated 4-bit RGBI CRT monitor, such as the IBM 5153 color display. It didn't have a dedicated graphics controller; instead, it was assembled from a half-dozen LSI chips. The large chip in the center is a CRT timing controller (CRTC), typically a Motorola MC6845.
Those AIBs were over 33 cm (13 in.) long and 10.7 cm (4.2 in.) tall. In 1984, IBM introduced the second-generation Enhanced Graphics Adapter (EGA) (Fig. 2), which superseded and exceeded the capabilities of the CGA. The EGA in turn was superseded by the VGA standard in 1987.
But the EGA established a new industry. It wasn't an integrated chip; however, its I/O was well documented, and it became one of the most copied ("cloned") AIBs in history. A year after IBM introduced the EGA AIB, Chips and Technologies came out with a chipset that duplicated what the IBM AIB could do. Within a year, the low-cost clones had captured over 40% of the market. Other chip companies such as ATI, NSI, Paradise, and Tseng Labs also produced EGA clone chips and fueled the explosion of clone-based boards (Fig. 3). By 1986, there were over two dozen such suppliers, and the list was growing. Even the clones got cloned: Everex took a license from C&T so it could manufacture an EGA chip for its PCs.
The EGA controller wasn't anything special, really. It offered 640 by 350 pixel resolution with 16 colors (from a 6-bit palette of 64 colors) and a pixel aspect ratio of 1:1.37. It could adjust the frame buffer's output aspect ratio by changing the resolution, giving it three additional hard-wired display modes: 640 by 350 with two colors and a 1:1.37 aspect ratio, 640 by 200 with 16 colors and a 1:2.4 aspect ratio, and 320 by 200 with 16 colors and a 1:1.2 aspect ratio. Some EGA clones extended the EGA features to include 640 by 400, 640 by 480, and even 720 by 540 pixel resolutions, along with hardware detection of the attached monitor and a special 400-line interlace mode for use with older CGA monitors.
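Programs selected these hard-wired modes by number through the video BIOS. The following is a minimal sketch, assuming a 16-bit DOS compiler such as Turbo C and its dos.h/int86() interface, of switching into the 640-by-350, 16-color mode via interrupt 10h; mode numbers 0Dh, 0Eh, and 10h are the standard BIOS equivalents of the modes listed above.

    /* Minimal sketch: selecting an EGA graphics mode through the video BIOS.
       Assumes a 16-bit DOS compiler (e.g., Turbo C) providing dos.h/int86(). */
    #include <dos.h>

    void set_video_mode(unsigned char mode)
    {
        union REGS regs;
        regs.h.ah = 0x00;          /* BIOS function 00h: set video mode */
        regs.h.al = mode;          /* requested mode number             */
        int86(0x10, &regs, &regs); /* call the video BIOS interrupt     */
    }

    int main(void)
    {
        set_video_mode(0x10);      /* 640 x 350, 16 colors              */
        /* ... draw ... */
        set_video_mode(0x03);      /* back to 80 x 25 color text        */
        return 0;
    }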
The big breakthrough for the EGA, and the reason it attracted so many copiers, was that its graphics modes were bit-mapped and planar, as opposed to the interleaved memory layouts of the previous-generation CGA and Hercules AIBs. The video memory was divided into four planes (except in the 640 by 350 two-color mode, which used two), one for each component of the RGBI color space.
Each bit represented one pixel. If a bit in the red plane was set, and none of the corresponding bits in the other planes were, a red pixel appeared at that location on-screen. If all the other bits for that pixel were also set, it became white, and so forth.
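A small software model makes the planar arrangement concrete. The sketch below is illustrative only; the names and the byte layout are assumptions based on the description above (one bit per pixel per plane, with planes ordered blue, green, red, intensity per the usual EGA convention), not IBM's register-level programming interface.

    /* Software model of EGA planar memory in the 640 x 350, 16-color mode. */
    #include <stdint.h>
    #include <stdio.h>

    #define WIDTH  640
    #define HEIGHT 350
    #define PLANE_BYTES (WIDTH / 8 * HEIGHT)   /* one bit per pixel */

    /* Four bit planes: 0 = blue, 1 = green, 2 = red, 3 = intensity. */
    static uint8_t planes[4][PLANE_BYTES];

    /* Gather one bit from each plane to form the 4-bit color index of
       pixel (x, y). Bit 7 of each byte is the leftmost pixel. */
    static unsigned get_pixel(unsigned x, unsigned y)
    {
        unsigned offset = y * (WIDTH / 8) + x / 8;
        unsigned mask   = 0x80u >> (x % 8);
        unsigned color  = 0;
        for (unsigned p = 0; p < 4; p++)
            if (planes[p][offset] & mask)
                color |= 1u << p;   /* plane p contributes bit p */
        return color;               /* 0..15, an index into the palette */
    }

    /* Scatter a 4-bit color index across the four planes at (x, y). */
    static void set_pixel(unsigned x, unsigned y, unsigned color)
    {
        unsigned offset = y * (WIDTH / 8) + x / 8;
        unsigned mask   = 0x80u >> (x % 8);
        for (unsigned p = 0; p < 4; p++) {
            if (color & (1u << p)) planes[p][offset] |=  mask;
            else                   planes[p][offset] &= ~mask;
        }
    }

    int main(void)
    {
        set_pixel(10, 20, 0x4);             /* only the red plane bit set */
        printf("%u\n", get_pixel(10, 20));  /* prints 4: a red pixel      */
        set_pixel(10, 20, 0xF);             /* all four planes set        */
        printf("%u\n", get_pixel(10, 20));  /* prints 15: white           */
        return 0;
    }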
The EGA moved us out of character-based graphics and into true bit-mapped graphics, based on a standard. Similar things had been accomplished with mods to microcomputers such as the Commodore PET and Radio Shack TRS-80, and directly from manufacturers such as IMSI and Color Graphics, but those designs did not use an integrated VLSI chip. The EGA was the last of IBM's AIBs to have a digital output; with the VGA came analog signaling and a larger color palette.
EGA begets VGA to XGA
With the introduction of the IBM PC, personal/micro and even workstation-class graphics got a new segment or category: consumer/commercial. Users in the commercial segment were not much concerned with high resolution, and certainly not with graphics performance. Some spreadsheet users liked higher resolution, and a special class of desktop-publishing users demanded very high resolution. But the volume market was commercial and consumer, and even that segment was subdivided. One class of consumers, gamers, did want high resolution and performance, but wouldn't pay the prices professional graphics (i.e., workstation) users were being charged.
PGA
The NEC µPD7220 and the Hitachi HD63484 ACRTC, discussed in previous Famous Graphics Chips articles, went to the professional market. IBM, the industry leader and standard setter, recognized this, and in the same year it introduced the commercial/consumer-class EGA, it also introduced a professional graphics AIB, the Professional Graphics Adapter (PGA). The PGA offered a high resolution of 640 by 480 pixels with 256 colors out of a palette of 4,096 colors, at a 60-Hz refresh rate. Like the EGA, the PGA was not an integrated chip.
8514
IBM discontinued the PGA in 1987, replacing it with the much higher-resolution 8514 and breaking with the acronym-based naming of its AIBs. The 8514 could generate 1,024 by 768 pixels with 256 colors at 43.5 Hz interlaced. The 8514 was a significant development and IBM's first integrated high-resolution VLSI graphics chip. It will be discussed in a future article and is mentioned here for chronological reference.
VGA
IBM's Video Graphics Array (VGA) was the most significant graphics chip ever produced in terms of volume and longevity. The VGA was introduced with the IBM PS/2 line of computers in 1987, along with the 8514. The two AIBs shared an output connector, the VGA connector, which became the industry standard for decades. The VGA connector was, among other things, the catalyst that led to the formation of the Video Electronics Standards Association (VESA) in 1989. The VGA, too, is a significant device and will be covered separately; it's listed here to show the complexity of the market at the time and how rapidly things were changing.
Summary
The EGA was really the foundational controller, and later chip, of the commercial and consumer PC graphics market.
By 1984, the computer market had consolidated into two main platforms: PCs and workstations. Microcomputers had died off in the early 1980s due to the introduction of the PC. Gaming (also called video) consoles remained living-room TV-based devices, and big machines called servers were replacing what had been mainframes. Supercomputers were still being produced at a rate of three or four a year. All of those machines used some type of graphics, and a few graphics terminals were still produced to serve the small but consistent high-end markets. However, by 1988 they all used standard graphics chips, sometimes several of them (Fig. 4).
The EGA specification was the catalyst for the establishment of some companies and the increased success of others. One such company, AMD, is still with us, having acquired pioneering graphics company ATI (itself an early EGA clone maker).