This article is part of the Electronics History series: The Graphics Chip Chronicles.
In 1983, NCR formed a six-person team to design a graphics controller that could be used inside and outside the company. The engineers designed a two-chip approach: a graphics controller and a memory interface controller tied together by a proprietary pixel bus. They incorporated a frame buffer and text by using a dual-text mode technique, allowing for the generation of ASCII text via a cache-memory drawing technique using variable memory-word width with a two-stage color lookup table.
The team chose the Graphical Kernel System (GKS) and embedded 25 GKS graphics, timing, and control commands in the 7300 graphics controller. NCR’s midrange chip punched above its weight class, as it could drive a 1024×1024 display with an eight-bit frame buffer, run at an impressive 3.75 MIPS, and drive displays at 30 MHz.
However, NCR was the only major customer for the graphics chip, and that wasn’t enough to sustain development. The 7300 became the 77C32BLT and was used on Puretek graphics add-in boards (AIBs) through 1987.
First, the Backstory of the NCR 7300
In the 1960s, NCR was a multibillion-dollar computer systems and semiconductor manufacturer. It established its microelectronics laboratory in 1963 in Colorado Springs, Colo., to stay abreast of the emerging semiconductor industry. By 1968, NCR’s first MOS circuits had been produced, and by 1970, a complete family of circuits had been designed and incorporated into NCR products.
By the mid-1980s, the PC market was split into several distinct segments: entry-level systems were used by clerks and for simple data entry, consumers bought midrange systems, while gamers and power users shelled out for high-end systems. Above that sat a niche segment: the workstation. There was plenty of opportunity and room for differentiation, but also plenty of challenges. APIs were in flux, the bus wars raged, and displays expanded in resolution. Only eight graphics controller suppliers existed at that time, but the number swelled rapidly and would hit 47 at its height.
It was a perfect time to enter the market, grow with it, and help shape it. And with a recognizable and respected brand, NCR’s entry seemed like a slam dunk.
In 1983, NCR pulled together a small team and charged them with designing and producing a graphics controller that the company could use internally for its own products and that other companies could buy. The team’s biggest challenge was picking the segment and specifications. PCs were just getting bit-mapped graphics, a wider color range was becoming available, and text was still essential. Somehow all of these factors had to be handled simultaneously and efficiently.
Development costs were steep and chip manufacturing was expensive. Ultimately, the team decided to shoot for the midrange to get the best return on investment (ROI).
A Dual-Chip Graphics Processor Shapes Up
Very large-scale integration (VLSI) was still in its early stages, and there was only so much that was practical to put in a single chip. Therefore, the team decided on a dual-chip strategy, tying a graphics controller (the 7300) together with a memory interface controller (the 7301) using a proprietary pixel bus with a clock (Fig. 1).
The team was able to combine frame-buffer graphics and text by leveraging a clever dual-text mode technique. It made it possible to generate ASCII text via a cache-memory drawing technique using variable memory-word width with a two-stage color lookup table.
Since there were not many industry standards at the time, the team chose the Graphical Kernel System (GKS), an American National Standards Institute (ANSI) standard introduced in 1977. The team was able to embed 25 GKS graphics, timing, and control commands in the 7300, more than Hitachi’s proprietary HD63484 or NEC’s 7220 offered. That gave the 7300 built-in compatibility with any PC software application that used GKS. NCR also developed a driver that could run under Unix or DOS.
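To make the idea concrete, the sketch below shows roughly what issuing one of those embedded GKS-style primitives could look like from the host: a polyline goes out as a short command packet rather than as a long stream of individual pixel writes. The opcode values, packet layout, and the gc_write_word() and gc_polyline() helpers are illustrative assumptions, not the 7300’s documented command set.

```c
#include <stdint.h>

/* Hypothetical opcodes for a few GKS-style output primitives;
   the real 7300 command encoding is not reproduced here. */
enum gc_opcode {
    GC_POLYLINE   = 0x01,   /* connected line segments */
    GC_POLYMARKER = 0x02,   /* markers at each point   */
    GC_FILL_AREA  = 0x03,   /* filled closed polygon   */
    GC_TEXT       = 0x04    /* character string        */
};

/* Assumed helper that pushes one 16-bit word into the controller's
   input FIFO (defined in the FIFO sketch later in this article). */
void gc_write_word(uint16_t word);

/* Send a polyline as one command packet: opcode, point count, then
   x/y coordinate pairs. One short packet replaces what would
   otherwise be many individual host pixel writes. */
void gc_polyline(const uint16_t *xy, uint16_t n_points)
{
    gc_write_word(GC_POLYLINE);
    gc_write_word(n_points);
    for (uint32_t i = 0; i < 2u * n_points; i++)
        gc_write_word(xy[i]);
}
```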
At the time, text operations still dominated, and many graphics chips incorporated a character generator to speed up text generation. NCR put the character generator in an unused section of the frame buffer, a technique used in CAD applications for display lists. Placing the character library in the frame buffer as bitmaps gave the user a scaling capability for the character size. Those bitmapped characters could then be accessed (using 8-bit codes), eliminating the bottleneck of passing images between the host and the 7300.
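A minimal software model of that technique, under assumed font and frame-buffer dimensions, looks something like the following: an off-screen region of the frame buffer holds the bitmap character cells, and drawing a character is just a copy from that library into the visible area. The sizes, offsets, and function names are placeholders, not NCR’s actual layout.

```c
#include <stdint.h>
#include <string.h>

/* Assumed layout: an 8-bit frame buffer, 1024 bytes per line, with
   1024 visible lines and extra off-screen lines holding the font. */
#define FB_PITCH    1024                     /* bytes per scan line */
#define VIS_LINES   1024                     /* visible scan lines  */
#define FB_LINES    1280                     /* total lines of VRAM */
#define FONT_BASE   (VIS_LINES * FB_PITCH)   /* library starts here */
#define CHAR_W      8                        /* cell width, pixels  */
#define CHAR_H      16                       /* cell height, lines  */

static uint8_t frame_buffer[FB_LINES * FB_PITCH];

/* Draw one ASCII character by copying its bitmap cell from the
   off-screen library into the visible frame buffer. Because the
   cells are ordinary bitmaps, they can also be scaled when copied. */
void draw_char(uint8_t code, int x, int y)
{
    const uint8_t *glyph =
        frame_buffer + FONT_BASE + (size_t)code * CHAR_W * CHAR_H;
    for (int row = 0; row < CHAR_H; row++)
        memcpy(frame_buffer + (size_t)(y + row) * FB_PITCH + x,
               glyph + (size_t)row * CHAR_W,
               CHAR_W);
}
```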
The 7300 had independent first-in, first-out (FIFO) input and output registers. Both were 16 bits wide and 16 words deep. Its internal architecture was 8-bit, but the interface logic could work with 8- or 16-bit processors.
Its FIFO registers were used to transfer pixel data and commands. The controller used the command register to specify the bus’s data width (8 or 16 bits) and handle the initialization handshake signals for DMA transfers. The status register held the FIFO register status and indicated when the output FIFO register was empty. Figure 2 shows the generalized block diagram of the 7300 controller.
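In practice, a host driver built around such an interface would poll the status register before pushing words into the 16-word input FIFO and before pulling results from the output FIFO, roughly as sketched below. The register addresses, bit masks, and names are placeholders for illustration; the real programming model would come from NCR’s 7300 data sheet.

```c
#include <stdint.h>

/* Placeholder memory-mapped register addresses and status bits;
   the actual 7300 register map is not reproduced here. */
#define GC_STATUS    (*(volatile uint16_t *)0xD0000000u)
#define GC_FIFO_IN   (*(volatile uint16_t *)0xD0000002u)
#define GC_FIFO_OUT  (*(volatile uint16_t *)0xD0000004u)

#define ST_IN_FULL   0x0001u   /* input FIFO full (16 words deep) */
#define ST_OUT_EMPTY 0x0002u   /* output FIFO has no data         */

/* Push one 16-bit command or pixel word, waiting while the
   16-word-deep input FIFO is full. */
void gc_write_word(uint16_t word)
{
    while (GC_STATUS & ST_IN_FULL)
        ;   /* spin until the controller drains a word */
    GC_FIFO_IN = word;
}

/* Pop one word of result data once the output FIFO has something. */
uint16_t gc_read_word(void)
{
    while (GC_STATUS & ST_OUT_EMPTY)
        ;   /* spin until the controller produces data */
    return GC_FIFO_OUT;
}
```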
The 7300 featured an on-chip programmable color lookup table with overlay mode, a feature usually found in separate LUTDAC (lookup table, digital-to-analog converter) chips. NCR’s 7300 preceded the popular Acumos and Cirrus Logic GD5400 series, which both introduced the same feature in 1991. On top of that, the industry-first 8-bit frame buffer could drive a 1024×1024 display, a feature that you would generally find in workstation-class chipsets.
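Conceptually, each 8-bit pixel read from the frame buffer indexes the programmable table, and a nonzero overlay value (for a cursor or menu, say) can override the result before it reaches the DACs. The short sketch below models that pipeline in software; the table sizes, overlay encoding, and the resolve_pixel() name are assumptions for illustration.

```c
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;

/* Programmable lookup table for the 256 frame-buffer indices, plus a
   small overlay palette (sizes are assumptions for illustration). */
static rgb_t palette[256];
static rgb_t overlay_palette[16];

/* Resolve one displayed pixel: a nonzero overlay value takes priority
   over the frame-buffer pixel, which is otherwise translated through
   the programmable lookup table on its way to the DACs. */
rgb_t resolve_pixel(uint8_t fb_pixel, uint8_t overlay)
{
    if (overlay != 0)
        return overlay_palette[overlay & 0x0F];
    return palette[fb_pixel];
}
```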
The 7300/7301 chipset was built in a 2-µm (2,000-nm) dual-polysilicon NMOS process, state-of-the-art for the era. Despite the relatively conservative design, the controller could run at an impressive 3.75 MIPS and drive displays at 30 MHz.
At the time, the company said it planned to integrate the 7300 and 7301 into a single chip—which would have been the logical thing to do—and in 1987, filed a patent. But NCR was the only major customer for the chip, and that wasn’t sufficient volume to sustain the development.
The number of competitors entering the field was increasing rapidly, and prices were dropping just as fast, as too many companies chased too few customers. The 7300 became the 77C32BLT and was used on Puretek graphics AIBs through 1987. It was also used on MacroSystem’s Altais AIB for the DraCo computer (an Amiga 060 clone).
Single-chip designs from the competition outpaced and outpriced NCR, and it quietly withdrew from the market.
NCR: Then and Now
NCR was a significant computer supplier providing everything from PCs to workstations, minicomputers, and mainframes. By 1986, the number of U.S. mainframe suppliers had shrunk from eight (IBM and the “Seven Dwarfs”) to six (IBM and the “BUNCH”), and then to four: Control Data Corporation, IBM, NCR, and Unisys.
In September 1991, AT&T acquired NCR for $7.4 billion, and by 1993, the subsidiary had a $1.3 billion loss on $7.3 billion in revenue. The losses continued, despite the fact that AT&T was NCR’s single largest customer, accounting for over $1.5 billion in revenue.
AT&T sold the NCR microelectronics division in February 1995 to Hyundai. At that time, it was the largest purchase of a U.S. company by a Korean company. Hyundai renamed it Symbios Logic.
In 1996, AT&T spun off NCR, which re-emerged as a standalone company. Today, NCR is a large enterprise technology provider for restaurants, retailers, and banks. It’s the leader in global POS software for retail and hospitality, and it provides multivendor ATM software.
Read more articles in the Electronics History series: The Graphics Chip Chronicles.