EDA Can't Afford To Ignore Test Chips Any Longer

July 15, 2005
Looking at today's digital ASIC and system-on-a-chip (SoC) designs, one could be forgiven for thinking that front-end design engineers and back-end layout designers would make great examples in the best-selling book, “Men Are From Mars; Women Are From Venus.” Front-end and back-end engineers “talk” different languages, perceive their worlds in different ways, and really seem to have very little in common.

Recently, however, these two camps have come much closer together. For example, design engineers now use layout-aware tools such as physically aware synthesis, and front-end tools increasingly take into account layout, design-for-test (DFT), and design-for-manufacture (DFM) considerations. However, a new divide is forming. Both design engineers and layout designers are huddled together on one side of a chasm, while the other side represents the unknown territory of proving a new IC process or technology node and extracting and documenting the design rules for that process or node.

Enter The Test Chip

Today, the term “test chip” means nothing to the vast majority of software engineers who develop EDA tools and hardware engineers who design and lay out digital ICs. But this has got to change. Let’s take a step back to consider how a new technology node is qualified. We commence with the concept of a single transistor. A parametric test chip involves creating copy after copy after copy of this transistor, each with slight variations in attributes like contact spacing and the widths and lengths of various features.

Furthermore, the way in which a transistor behaves may vary depending on the proximity of other structures. This means that a complete test matrix will require multiple copies of each transistor variant, with each of these copies surrounded by portions of other transistors, and the proximity of those structures varied from copy to copy.
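
To get a feel for the scale involved, consider a minimal sketch (in Python, with made-up parameter names and values) of how such a test matrix grows as the Cartesian product of every geometric and proximity sweep:

    from itertools import product

    # Hypothetical geometric sweeps for one transistor variant (values in nm).
    contact_spacing = [60, 70, 80]
    gate_width = [100, 150, 200]
    gate_length = [45, 50, 55]
    # Proximity of neighboring structures, also swept per copy.
    neighbor_gap = [80, 120, 200]

    # The complete test matrix is the Cartesian product of every sweep;
    # each tuple describes one physical copy to be placed on the test chip.
    matrix = list(product(contact_spacing, gate_width, gate_length, neighbor_gap))
    print(len(matrix), "transistor copies required")  # 3 * 3 * 3 * 3 = 81

Add a few more swept parameters, or a few more values per sweep, and the copy count multiplies accordingly.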

Variation does not stop here. Literally thousands of different events comprise a modern wafer-fabrication flow, and variations in each of these events can affect the transistors’ characteristics. In the case of an ion-implantation step, for example, we might vary the implantation energy, angle, and dopant level. Similarly, in the case of a furnace cycle, in addition to the quantities and mix of dopants used, varying the ramp-up time, the time sustained at the maximum temperature (along with the value of that temperature), and the ramp-down time can all affect the final device’s performance.

A tremendous number of variables can affect the electrical properties of the transistors, too. When test chips are eventually built, different parameters, such as switching times, threshold voltages, leakage currents, etc., are evaluated for each physical and process permutation to determine the optimum process sequence.
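
As a rough illustration (the field names here are invented, not drawn from any particular tool), each evaluated device boils down to a record pairing its geometry and process recipe with its measured electrical results:

    from dataclasses import dataclass

    @dataclass
    class DeviceResult:
        """One measured device: geometry and process recipe plus results."""
        device_id: str
        geometry: dict       # e.g. {"gate_length_nm": 50, "neighbor_gap_nm": 120}
        process_split: dict  # e.g. {"implant_energy_keV": 35, "anneal_peak_C": 1050}
        threshold_v: float   # measured threshold voltage (V)
        leakage_na: float    # measured off-state leakage (nA)
        switch_ps: float     # measured switching time (ps)

Finding the optimum process sequence means comparing thousands upon thousands of such records against one another.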

Surprisingly—even today—the masks used to build these test chips are predominantly created by hand. That is, they’re done by some poor soul drawing polygons one at a time, hour after hour. As you might expect, it’s not uncommon for someone to have a “bad day,” whether it’s misdrawing the sizes and positions of polygons or omitting contact holes.

Even worse, keeping track of the different process-step variations and the relationships between them, along with the resulting transistor parameters, is usually done with standard spreadsheets. Resorting to disjointed, non-specialized tools is typical wherever automation is lacking in technology development. Managing the phenomenal amounts of data associated with one of today’s ultra-deep-submicron processes is a daunting task under the best of circumstances, so it’s not surprising that errors creep in.

Given the disjointed tools used to manage the data involved with a new technology, recognizing that there's an error in the first place—and then tracking such an error down—can take a tremendous amount of time and resources. Worse still, if the error goes unrecognized, the mistakes in the test chip can lead to erroneous conclusions about the direction to look for optimum device design or process conditions.

Automating The Test-Chip Process

The answer, of course, is to automate the test-chip creation and evaluation process. The first level of automation provides a library of core elements, such as bipolar junction transistors, field-effect transistors, passive devices (such as capacitors), and so forth. This library includes parameterized structures capable of generating transistor variations based on dozens of individual parameters, like the poly-to-contact spacing, minimum feature size, and widths and lengths of key device dimensions. This library also must let users quickly and easily add new fundamental components as required.
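
A sketch of what one such parameterized generator might look like follows; the layer names, dimensions, and layout choices here are purely illustrative:

    def nmos_cell(gate_length, gate_width, poly_to_contact, n_contacts=2):
        """Return polygons for one NMOS variant as (layer, vertex-list) pairs.

        Dimensions are in nm; layers and geometry are illustrative only.
        """
        polys = []
        # Active area wide enough for the gate plus a contact landing per side.
        active_w = gate_length + 2 * (poly_to_contact + 40)
        polys.append(("active", [(0, 0), (active_w, 0),
                                 (active_w, gate_width), (0, gate_width)]))
        # Poly gate centered over the active area, with end-cap overhang.
        gx = (active_w - gate_length) / 2
        polys.append(("poly", [(gx, -20), (gx + gate_length, -20),
                               (gx + gate_length, gate_width + 20),
                               (gx, gate_width + 20)]))
        # Source-side contacts, stacked vertically.
        for i in range(n_contacts):
            cy = i * gate_width / n_contacts + 10
            polys.append(("contact", [(10, cy), (30, cy),
                                      (30, cy + 20), (10, cy + 20)]))
        return polys

Sweeping the arguments to such a generator is what turns dozens of parameters into thousands of concrete layout variants.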

The next step in the automation process is to provide an interface that lets users specify the minimum, maximum, and step values for each device parameter. The system then lays out the test chip with all of the possible transistor variations (including variations in proximity to other structures). In addition to automatically generating the GDSII files required to construct the chip’s photomasks, the system outputs details on the locations of the transistors and their probe points. The downstream automatic test equipment uses this information to determine each device’s electrical characteristics.
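
A minimal sketch of that flow, assuming hypothetical parameter names and leaving the actual GDSII emission to a layout library, might place one device per parameter combination and write out a probe map for the tester:

    import csv
    from itertools import product

    def sweep(lo, hi, step):
        """Inclusive min/max/step range, as the user would specify it."""
        vals, v = [], lo
        while v <= hi + 1e-9:
            vals.append(round(v, 6))
            v += step
        return vals

    # Hypothetical user entries: (min, max, step) per device parameter, in nm.
    params = {
        "gate_length": sweep(45, 55, 5),
        "gate_width": sweep(100, 200, 50),
        "poly_to_contact": sweep(60, 80, 10),
    }

    # Place one device per combination on a grid and record its probe location
    # so the automatic test equipment knows where to land its needles.
    pitch = 5000  # nm between placement sites
    with open("probe_map.csv", "w", newline="") as f:
        out = csv.writer(f)
        out.writerow(["device_id", "x_nm", "y_nm"] + list(params))
        for i, combo in enumerate(product(*params.values())):
            x, y = (i % 10) * pitch, (i // 10) * pitch
            out.writerow([f"DUT{i:04d}", x, y] + list(combo))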

But that’s only the start. The system also needs a way to easily capture and describe the thousands of different events comprising the wafer-fabrication flow. This includes the permissible variations in these events, such as ion implantation energies, dopant levels, heat times and levels, and so forth. The system then uses this information—along with the details of every different transistor variant—to construct and populate a humongous database.
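
One plausible shape for such a database (sketched here with SQLite and invented column names) separates process steps, device variants, and measurements:

    import sqlite3

    con = sqlite3.connect("techdev.db")
    con.executescript("""
    CREATE TABLE IF NOT EXISTS process_step (
        step_id   INTEGER PRIMARY KEY,
        wafer_id  TEXT,     -- which wafer saw this step
        name      TEXT,     -- e.g. 'ion implant', 'furnace anneal'
        parameter TEXT,     -- e.g. 'energy_keV', 'ramp_up_s'
        value     REAL
    );
    CREATE TABLE IF NOT EXISTS device_variant (
        device_id       TEXT PRIMARY KEY,
        gate_length_nm  REAL,
        gate_width_nm   REAL,
        neighbor_gap_nm REAL
    );
    CREATE TABLE IF NOT EXISTS measurement (
        device_id   TEXT REFERENCES device_variant(device_id),
        wafer_id    TEXT,
        threshold_v REAL,
        leakage_na  REAL,
        switch_ps   REAL
    );
    """)
    con.commit()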

As each wafer of test chips is built and evaluated, all of the results are imported into the database. The system must then provide an intuitive, easy-to-use interface that lets wafer engineers sift through the reams of data and establish the complex relationships between process steps and transistor-structure variations, ultimately determining the “sweet spot” for the process and technology node.
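
Against the schema sketched above, the “sweet spot” hunt reduces to queries like the following (again illustrative: it ranks devices that meet a threshold-voltage target by their leakage):

    import sqlite3

    con = sqlite3.connect("techdev.db")  # the database sketched above
    rows = con.execute("""
        SELECT d.device_id, d.gate_length_nm, m.threshold_v, m.leakage_na
        FROM measurement m
        JOIN device_variant d USING (device_id)
        WHERE m.threshold_v BETWEEN 0.35 AND 0.45
        ORDER BY m.leakage_na ASC
        LIMIT 10
    """).fetchall()
    for row in rows:
        print(row)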

Why Should The Folks In EDA Care About All Of This?

In addition to establishing the optimum “mix” of process steps and transistor parameters required to build the wafer, the ensuing values are used to define the design and electrical rules (such as design exchange format, or DEF, files) employed by design engineers and layout designers. At the moment, capturing these rules in appropriate formats for the design team is largely a manual process. However, the complexities of these processes and time-to-market considerations make this modus operandi increasingly untenable. In the not-so-distant future, these rule libraries will be generated automatically by the same software that generates and evaluates the test chips themselves.
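
A toy example of what that generation step might look like (the rule names and the plain-text output format are placeholders; production rule decks are tool-specific):

    # Derive minimum-geometry rules from the smallest dimensions that still
    # yielded passing devices; in practice these would be queried from the
    # measurement database rather than listed inline.
    passing = [
        {"gate_length_nm": 50, "poly_to_contact_nm": 70},
        {"gate_length_nm": 45, "poly_to_contact_nm": 80},
    ]

    min_gate = min(d["gate_length_nm"] for d in passing)
    min_ptc = min(d["poly_to_contact_nm"] for d in passing)

    with open("rules.txt", "w") as f:
        f.write(f"RULE poly.width.min     {min_gate} nm\n")
        f.write(f"RULE poly.contact.space {min_ptc} nm\n")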

When test chips are created by hand, it can take two to three months to build the first wafer; thereafter, one can generate one or two respins of the test chip every one or two months. By comparison, automating the process reduces the time to generate the first wafer to one to two days (if using an existing library element) or two to three weeks (if defining a new library element from the ground up). Thereafter, it’s possible to generate hundreds of variant test-chip wafers in a matter of hours. This means that any foundry or IDM using an automated process holds a huge advantage over its non-automated competitors.

Similarly, automatically generating the design and electrical rules used by design engineers and layout designers will convey a significant market advantage to the groups that adopt these processes. And as automated data management for semiconductor-process-technology development becomes a competitive necessity, everyone from the software engineers developing EDA tools to the hardware engineers designing and laying out digital chips will know exactly what a test chip is and what it does.

Contact Tim Crandle at [email protected].

