From CAD To CAE To EDA, Design Tools Have Wrestled With Complexity
Since its beginnings in the early 1970s, the design automation industry has climbed the complexity scale of the semiconductor world. The design process has seen a steady progression in its automation along the way, with each step overcoming some design issue while spawning even greater complexity and new challenges. Business models have come and gone, expansions and contractions have taken place, and the EDA industry now faces yet another great sea change.
Among the key drivers that have pushed the evolution of design tools, complexity is the most critical. In 1965, Gordon Moore predicted that the number of transistors per integrated circuit would double roughly every year over the following 10 years, a rate later revised to the 18-month to two-year doubling we now call Moore's Law. That prediction has held up for 27 years longer than Moore himself forecasted (Fig. 1).
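To put that doubling in concrete terms, here is a back-of-the-envelope sketch; the 1965 starting count of 64 components and the fixed 18-month doubling period are illustrative assumptions, not figures taken from Moore's paper or from Fig. 1.

```python
# Back-of-the-envelope Moore's Law arithmetic (illustrative assumptions only).
def transistors(year, base_year=1965, base_count=64, doubling_months=18):
    """Projected transistor count per chip under a fixed doubling period."""
    months = (year - base_year) * 12
    return base_count * 2 ** (months / doubling_months)

for year in (1965, 1975, 1985, 1995, 2002):
    print(year, f"{transistors(year):,.0f}")
```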
A parade of microlithographic innovations has made today's vastly complex ICs possible in the physical realm. But without the design automation methodologies that have been shaped since Fairchild, Motorola, and Texas Instruments (TI) employed legions of draftsmen to produce layouts for TTL building blocks, today's SoC designers would be out of luck.
Multiple factors have shaped the path of design automation and the ways that it and the design process have influenced each other. Process technology was the most important, followed closely by the availability of increasingly powerful computing resources. Closely coupled to the latter was the dispersal of those resources from mainframes to the engineer's desktop.
Consider how design was approached before commercial design automation existed. "I can still remember using drafting boards and cut rubylith," says Ted Vucurevich, senior vice president and chief technology officer at Cadence Design Systems Inc. "That was the state of the art in the early 1970s." Many IC designs were quite small, essentially comprising TTL and MOS building blocks. The entire design process was done literally by hand and, at least at the Big Three of Fairchild, Motorola, and TI, in a completely vertically integrated and proprietary fashion. Any design automation was homegrown.
Wally Rhines, CEO of Mentor Graphics Corp. today, remembers those days from his vantage point as head of the group designing consumer products at TI. Rhines recalls, "TI used its proprietary design resources to set up what was, in effect, a 'productization machine.' They realized that the one who would win in TTL was the one who got the most part types out and qualified and sampled the fastest."
To achieve this goal, TI developed a system to quickly design functions, generate masks, characterize the devices, and market them. According to Rhines, that was called the TI Layout and Edit (TILES) system. Schematics were created by hand and simulated in Spice, a relatively new idea itself.
In 1972, the Spice 1 simulator was released into the public domain by a University of California at Berkeley group led by Prof. Donald Pederson. It was quickly seized upon by the semiconductor companies of the day, each of which tweaked and customized it to its liking.
"Verification started with Spice and other applications like it," says Raul Camposano, chief technology officer at Synopsys Inc. "They realized that you could simulate electrical circuits to a great degree of accuracy, which would help a lot in constructing them and making sure they would work the first time."
In those early days, TI and others ran their simulations on the IBM mainframes that had become common on corporate campuses. "At its peak, we kept an entire 3090-600 mainframe running just simulations for semiconductors at one point. It was that big a deal," Rhines says.
Another piece of the automation puzzle from the early 1970s was the emergence of dedicated CAD systems from vendors such as Applicon and Calma for mask production. Automated pattern generation came into vogue as designs grew larger. "People realized you couldn't do these designs by hand anymore, just by drawing them and cutting the rubylith," says Camposano. "It would be much more flexible to have a database in which you could store the patterns. You'd have a digitizing system and to make a change, you could just go into the database and change it."
At TI, the capture process largely took place on proprietary systems, but some moved over to the Calma system based on Data General 32-bit minicomputers. These systems had dedicated operators called layout people. Rhines recalls a clear division of labor where engineers performed circuit design and simulation, but the draftsmen did layout.
Then, technicians would digitize the design from a version drawn on drafting paper, using a handheld interface that entered the coordinates of each geometric pattern into the Calma database. Of course, this created significant opportunity for error.
What we now know as physical design verification consisted of taking flatbed plots of the layouts, pinning them on the wall or laying them on a light table, and having people try to find errors by eye. Hence, physical verification became one of the first tasks to be automated commercially in the emerging custom design space (see "The More Things Change, The More They Stay The Same").
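Those plot-on-the-wall reviews were eventually replaced by automated design-rule checking. As a minimal, hypothetical sketch of the core idea, the snippet below flags pairs of layout rectangles that sit closer than a minimum spacing; real physical-verification tools work on hierarchical polygon databases against far richer rule decks.

```python
# Minimal design-rule check: flag rectangles on one layer that violate
# a minimum-spacing rule. Rectangles are (x1, y1, x2, y2); the layout
# and the 3-unit spacing rule are made-up examples.
MIN_SPACING = 3

def gap(a, b):
    """Edge-to-edge distance between two rectangles (0 if they touch or overlap)."""
    dx = max(b[0] - a[2], a[0] - b[2], 0)
    dy = max(b[1] - a[3], a[1] - b[3], 0)
    return (dx ** 2 + dy ** 2) ** 0.5

shapes = [(0, 0, 10, 10), (15, 0, 25, 10), (0, 11, 10, 18)]

for i in range(len(shapes)):
    for j in range(i + 1, len(shapes)):
        d = gap(shapes[i], shapes[j])
        if 0 < d < MIN_SPACING:
            print(f"Spacing violation: shapes {i} and {j} are {d:.2f} apart")
```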
By the late '70s and early '80s, the practice of using mainframe computers for engineering began to break down. The Applicon and Calma workstations had arrived on the mask-generation side. Simulation and verification were next to see a computing change, with the introduction of Apollo workstations and VAX and Data General systems. "As the engineers tried to use the IBM mainframe, they found that during the last two weeks of the quarter, they didn't get any work done because the corporate financial people were trying to do the close," Rhines says.
The rise of a distributed computing model for engineers brought a new breed of commercial design automation. Functional verification at the transistor level with Spice was growing unwieldy and too slow for digital circuits. Hence, logic simulation was born. "The fact that most of the delay was in the devices themselves allowed them to come up with a model for performance and function," Vucurevich says.
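To illustrate the abstraction Vucurevich describes, here is a minimal sketch of event-driven gate-level simulation; the two-gate netlist and its delays are invented for the example, and commercial simulators of the era handled far richer timing models and vastly larger circuits.

```python
# Minimal event-driven gate-level simulation: each gate has a fixed delay,
# and a change on a net schedules re-evaluation of the gates it feeds.
# The two-gate netlist and delays are illustrative, not from a real library.
import heapq

# gate name -> (function, input nets, output net, delay)
GATES = {
    "u1": (lambda a, b: a & b, ("a", "b"), "n1", 2),
    "u2": (lambda a, b: a | b, ("n1", "c"), "out", 3),
}

def simulate(stimulus):
    values = {net: 0 for net in ("a", "b", "c", "n1", "out")}
    events = list(stimulus)            # (time, net, value) tuples
    heapq.heapify(events)
    while events:
        t, net, v = heapq.heappop(events)
        if values[net] == v:
            continue                   # no change, nothing to propagate
        values[net] = v
        print(f"t={t}: {net} -> {v}")
        # Any gate reading this net re-evaluates and schedules its output.
        for fn, ins, out, delay in GATES.values():
            if net in ins:
                new = fn(*(values[i] for i in ins))
                heapq.heappush(events, (t + delay, out, new))
    return values

simulate([(0, "a", 1), (0, "b", 1), (5, "c", 1)])
```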
Logic simulation came into prominence with new vendors like Daisy Systems, which led the way by specializing in design capture and front-end verification.
Daisy, along with fellow newcomers Mentor Graphics and Valid, dominated design automation in the early 1980s as full-custom design methodologies took hold. Initially intended for pc-board design, these turnkey systems found applications in IC design as well. Daisy and Valid plied the path of proprietary hardware while Mentor went with Apollo workstations, which ultimately turned out to be the correct choice (Fig. 2). The workstation-based systems represented a unification of design capture, simulation, layout, and verification on one platform in one package.
The 1980s turned into the decade of back-end automation. Just as layout had become onerous with growing circuit complexity, so had the placement of circuit elements and the routing of wires between them. There simply were too many elements to place by hand, making tools essential.
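A hedged sketch of what an automated placer does at its core follows: assign cells to slots so that connected cells land near one another. The greedy swap loop and the four-cell netlist are invented for illustration; production placers relied on far stronger techniques such as min-cut partitioning and, later, simulated annealing.

```python
# Toy placement: put 4 cells into a 2x2 grid of slots and greedily swap
# pairs of cells to reduce total Manhattan wirelength. Netlist is made up.
from itertools import combinations

NETS = [("c0", "c1"), ("c1", "c2"), ("c2", "c3"), ("c0", "c3")]
SLOTS = [(0, 0), (0, 1), (1, 0), (1, 1)]          # (row, col) positions

placement = {"c0": 0, "c1": 3, "c2": 1, "c3": 2}   # cell -> slot index

def wirelength(pl):
    """Total Manhattan distance over all two-pin nets."""
    total = 0
    for a, b in NETS:
        (r1, c1), (r2, c2) = SLOTS[pl[a]], SLOTS[pl[b]]
        total += abs(r1 - r2) + abs(c1 - c2)
    return total

improved = True
while improved:
    improved = False
    for a, b in combinations(placement, 2):
        trial = dict(placement)
        trial[a], trial[b] = trial[b], trial[a]
        if wirelength(trial) < wirelength(placement):
            placement, improved = trial, True

print("final placement:", placement, "wirelength:", wirelength(placement))
```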
Companies like Solomon Design Automation, the first tool vendor to bring software-only design tools for nonproprietary hardware platforms to market, started touching on ideas coming out of UC Berkeley regarding common data models. Other early place-and-route vendors included Silvar-Lisco, ECAD, and Daisy Systems.
Simultaneously, design methodologies began to change. The movement was driven by a triangle of forces: the design automation industry, the semiconductor vendors, and designers themselves.
The first aspect of this revolution was the move to cell-based design based on the work of Carver Mead and Lynn Conway in the late 1970s. Earlier methodologies centering on programmable logic arrays had failed to scale adequately and would give way to standard-cell design, launched commercially by VLSI Technology in 1982.
For one thing, the higher abstraction of standard cells provided designers with a way to see more of their design at a given time. It let design be thought of at the gate level, and it spawned the development of standard-cell libraries.
Gate arrays came shortly after the standard-cell methodology appeared, offering faster fabrication. LSI Logic introduced its first gate-array router in 1982.
Also, logic simulation became a popular means of functional verification. With the Apollo workstations of the day gaining more power, tool vendors like Mentor Graphics began shipping gate-level simulators that were fast enough to handle circuits of a reasonably large size.
The turning point for the growth of commercial design automation, or computer-aided engineering (CAE) as it was then known, was the advent of the ASIC. Early on, ASIC vendors such as LSI Logic and VLSI Technology made most of their money selling proprietary tool sets. "The independent, third-party ASIC design flow depended on the major semiconductor companies getting ASIC businesses going, and they were slow to do that," Rhines says. Once it began, though, third-party tools became necessary, making the Daisy and Mentor integrated systems big businesses.
ASIC design using standard-cell-based methodologies spurred the next major sea change in design automation. In the mid-1980s, digital designs ramped up in size until their complexity again became too much to handle.
The first part of the paradigm shift was the development of a level of abstraction above the gate level in the form of hardware description languages (HDLs). A number of these vied for acceptance, including what was originally known as the very-high-speed IC (VHSIC) HDL, or VHDL. But Verilog, developed at Gateway Design Automation in 1984 along with an event-driven simulator, drew the most attention (see "A Look Back At Verilog").
Synopsys' introduction of commercial logic synthesis was the second part of the mid-1980s shift. Work on synthesis had taken place at IBM in the early '80s in the form of rule-based systems such as the Logic Synthesis System (LSS).
Early synthesis tools, such as the first versions of Design Compiler in 1986, focused on optimization: helping designers meet timing constraints while minimizing area (see "Technology X: The 'What-If' Proposition").
The biggest driver behind logic synthesis was the move up in abstraction from gate-level design to register-transfer level (RTL). Moving up to RTL design, and then synthesizing RTL back down to gate level, didn't change the fabric of design, which even now remains predominantly standard-cell-based. "Moving up to RTL was mostly a tool thing to master complexity," says Camposano. "That's when we really started using HDLs to design chips more or less the way you write programs. You'd write it and compile all the way down to automatically placed and routed circuits."
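As a rough illustration of the RTL-to-gates step Camposano describes, the sketch below "synthesizes" a one-line Boolean assignment into a flat netlist of simple gates. It is a toy recursive translation with invented cell names, not how a production synthesis tool works; real logic synthesis also optimizes the result for timing and area against a technology library.

```python
# Toy "synthesis": turn a nested Boolean expression into a flat netlist
# of AND/OR/NOT gates. Expression format and net names are invented for
# illustration only.
counter = 0
netlist = []

def emit(op, *inputs):
    """Create a new wire driven by a gate of type `op`."""
    global counter
    counter += 1
    wire = f"n{counter}"
    netlist.append((op, inputs, wire))
    return wire

def synth(expr):
    """expr is either a signal name or a tuple like ('and', lhs, rhs)."""
    if isinstance(expr, str):
        return expr
    op, *args = expr
    return emit(op, *(synth(a) for a in args))

# out = (a AND b) OR (NOT c)
out = synth(("or", ("and", "a", "b"), ("not", "c")))
for op, ins, wire in netlist:
    print(f"{op.upper():4} {', '.join(ins)} -> {wire}")
print("output net:", out)
```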
Logic synthesis represented a far better way than previous methods to translate and transform an abstract description of a circuit into a gate-level schematic, and that profoundly impacted the design process, notes Cadence's Vucurevich. By the late 1980s, the notion of the "tall, skinny designer," one with the insight and skills to take a project from conception all the way through realization, had been around for some time. The design methodology brought to fruition in the ASIC age by logic synthesis re-established the division of labor between the front and back ends of the design process.
But synthesis advocates saw in the methodology the potential to join those realms again through behavioral synthesis. This approach would start from a level above RTL, which is to say a purely behavioral description of the system, and go all the way down to gates. The idea harks back to Carver Mead's concept of silicon compilers, which were pursued for a time in the mid-1980s but ultimately abandoned as a less efficient, and less broadly applicable, alternative to synthesis.
Broad acceptance of synthesis brought with it an expanded design automation industry in the form of a raft of "point tools," or tools for specific verification and analysis tasks throughout the design flow. Throughout the '90s, the synthesis-based front-end flows were refined and improved to keep up with Moore's Law.
Today's engineers are on the verge of yet another sea change in the EDA landscape. A hallmark of the design automation evolution has been that it began at the lowest possible levels of abstraction, or the physical domain. The masking process saw design automation first. "That's because the most complex parts are the closest to the mask, which is where you have the most objects," says Synopsys' Camposano. It was far easier to create a database of polygons and print them on a plotter than to draw them by hand.
Place and route fell next because it was the next level up in terms of the number of objects to be handled. Then, once standard-cell methodologies were developed, logical design was the next target.
If we follow the thread, behavioral methodologies are the next obvious stage in automation. A new level of abstraction, higher than RTL, will allow system-level designers to specify the function they want and then choose among a wide range of implementation options. One effort to watch in this area is the Metropolis project of the Gigascale Silicon Research Center, which strives to create a metamodel for functionality based on abstract algebra that assumes no particular form of implementation.
Meanwhile, today's cell-based, ASIC-style design methodologies push forward into the system-on-a-chip (SoC) age. The clean handoff between the domains of logical and physical design that was the dominant paradigm of the ASIC age is no longer so clean in the era of SoCs and ultra-deep-submicron (UDSM) silicon fabrication. In fact, it's collapsing quickly as interconnect comes to dominate delay calculations. Front-end designers can no longer "throw their designs over the wall" to specialists in physical implementation while remaining ignorant of how their work will be expressed in silicon.
Thus, a new era has begun in design tools that merges the logical and physical (see this issue's cover story, p. 45). Going forward, there will be much more emphasis on signal integrity and analysis of physical effects at UDSM geometries. All of these are ways in which EDA tools will help designers conquer the complexity to come as silicon fabrication continues to keep pace with Gordon Moore's Law, 37 years old but still holding true.