Hardware Emulation: A Weapon of Mass Verification

Oct. 29, 2014
Greater time-to-market pressures, along with escalating hardware/software integration and quality concerns, make the verification process a strategically important step in chip design. Coming to the rescue is a new generation of cost-effective hardware emulators.
Dr. Lauro Rizzatti, Verification Consultant

Despite the third adjustment to Moore’s Law, which now sets the doubling of transistors in an integrated circuit at roughly every two years, the march toward larger and larger designs and devices continues.

The average design size today hits or exceeds 50 million application-specific integrated-circuit (ASIC) gates, with individual blocks topping out at over 10 million gates. Top-end designs in leading semiconductor companies routinely surpass 100 million gates. The largest designs in the processor/graphics business have already reached one billion gates, or will surpass that number in the near future.

Driving this steep rise in design complexity is demand for new features and capabilities either added to existing products or implemented in completely new designs. As if the soaring silicon complexity isn’t enough, the swell of embedded software adds another dimension as it becomes a product differentiator. The pressure to get products to market faster is unbearable, forcing engineering teams to get creative.

Limitations of Verification Methods

While engineers have an abundance of verification choices, most come with inherent disadvantages. In the early days of digital circuit design, when chip complexity ranged between a few hundred to a few thousand gates, designs were verified via breadboard prototyping. With a breadboard populated by TTL logic devices, such as SSI/MSI chips, a design could be verified and debugged in its target system environment before fabrication of the silicon. Because the design was tested under real operating conditions, functional correctness was assured.

Breadboarding became impractical when chip complexities reached 10,000 gates, and it was eventually replaced by the logic simulator based on an event-driven algorithm. This change spawned the electronic design automation (EDA) industry, known in its early days as CAE (computer-aided engineering). Event-driven simulators, still widely used at the register-transfer level (RTL) and occasionally at the gate level, allow for accurate functional and timing verification.

RTL simulators are easy to use, cost-effective, and offer sophisticated debugging capabilities. However, their execution speed degrades rapidly as design sizes approach 100 million gates, due to cache misses and memory swapping. And while distributing simulation across PC server farms may ease the runtime plunge, parallel simulation cannot be used to test embedded software, whose execution is fundamentally serial.

Executing one second of real time in a 100-million-gate circuit designed to run at 200 MHz requires the execution of 200 million cycles. Even on the fastest CPU with a generous cache and a vast amount of RAM, an RTL simulator running at 100 cycles per second (an optimistic assumption) would need more than three weeks to run through the design.
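
A quick back-of-the-envelope check confirms the estimate:

\[
\frac{200\ \text{MHz} \times 1\ \text{s}}{100\ \text{cycles/s}} = \frac{2 \times 10^{8}\ \text{cycles}}{10^{2}\ \text{cycles/s}} = 2 \times 10^{6}\ \text{s} \approx 23\ \text{days}
\]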

This poor runtime performance prevents an event-driven simulator from testing a design in its target system. In practice, event-driven simulators can exercise only tiny fractions of real device behavior, which means functional failures go undetected and engineering teams incur costly design respins.

For all of the advantages that formal and static verification methods offer over test-vector generation, they cannot exercise a design’s functionality dynamically. Dynamic testing using testbenches is the only choice available, especially when embedded software such as drivers, real-time operating systems (RTOSs), or custom applications must be tested.

Hardware verification languages (HVLs), such as e and Vera, as well as C/C++ class libraries of test functions, increase productivity through the massive generation of tests that could never be created manually. Functional-coverage tools increase an engineer’s confidence in testbenches produced with an HVL, but they do not reduce the time required to apply those tests and cannot be used when developing embedded software.
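
The flavor of the approach can be sketched in plain C++. All names here are hypothetical; real HVLs such as e and Vera add full constraint solvers, coverage feedback, and reusable verification components. The sketch simply shows how constrained-random generation churns out thousands of legal stimuli that no engineer would write by hand:

// Minimal sketch of constrained-random stimulus generation in the
// spirit of a C/C++ test-function class library (names hypothetical).
#include <cstdint>
#include <iostream>
#include <random>
#include <vector>

struct BusTransaction {
    uint32_t addr;   // word-aligned address within a device window
    uint32_t data;   // random payload
    bool     write;  // read or write
};

int main() {
    std::mt19937 rng(42);  // fixed seed so a failing test can be replayed
    std::uniform_int_distribution<uint32_t> addrDist(0, 0xFFFF);
    std::uniform_int_distribution<uint32_t> dataDist;  // full 32-bit range
    std::bernoulli_distribution writeDist(0.5);

    std::vector<BusTransaction> stimulus;
    for (int i = 0; i < 10000; ++i) {
        BusTransaction t;
        // Constraints: word alignment, confined to a hypothetical window.
        t.addr  = (addrDist(rng) & ~0x3u) | 0x80000000u;
        t.data  = dataDist(rng);
        t.write = writeDist(rng);
        stimulus.push_back(t);
    }
    std::cout << "generated " << stimulus.size() << " transactions\n";
    return 0;
}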

Hardware-assisted verification tools narrow the gap between the engineer’s goals and results from traditional logic verification. Let’s look at a few.

FPGA Prototyping

FPGA-based prototyping is used to address embedded software validation. Prototypes are basically breadboards with FPGAs replacing the SSI/MSI parts.

In-house-developed FPGA prototypes have been around since the advent of these programmable devices. Unfortunately, the effort to develop an FPGA prototype grows dramatically with design size, becoming impractical once the number of required FPGAs exceeds 10 or so devices. Debugging an FPGA-based prototype is awkward, forcing the engineer to wrestle with complex prototyping issues instead of debugging the design. It is often said that a home-built FPGA prototype is ready only after the design has already been taped out.

To address these shortcomings, a cottage industry emerged several years ago to supply ready-made, scalable FPGA prototypes. Judging by the sheer number of small companies offering FPGA prototyping boards, the trend is gaining traction. These boards eliminate the cumbersome in-house development process, which is the main reason for their success. Even in their largest configurations, however, they address designs of only about 100 million gates.

The main selling point of a prototyping board is execution speed second only to that of real silicon. Its two main drawbacks are the very long setup time needed to map a design onto the board and rather limited design visibility.

Hardware Emulation

Hardware emulation is becoming a popular solution to the runtime problems that afflict event-driven software simulators. It achieves execution speeds five to six orders of magnitude faster than those delivered by RTL simulators.

Hardware emulators have not always been viewed favorably. Previously, the prohibitive cost of ownership limited adoption to engineering teams testing the largest designs, such as microprocessors and graphics chips. That has changed as new generations of hardware emulators capable of handling two billion or more ASIC gates offer a lower overall cost of ownership, making them a good choice for a broad range of designs, regardless of complexity or topology.

Hardware Emulation as a Verification Solution

Hardware emulation performs well as the ultimate bug killer. It can be invaluable for debugging hardware and testing hardware/software integration within an SoC ahead of first silicon.

When hardware designers and software developers both use emulation, they can share the same system and design representations. Thanks to combined software and hardware views of the design, they can work together to debug hardware and software interactions. They are able to trace a design problem across the boundary between the embedded software and the underlying hardware to determine whether the problem lies in the software or in the hardware.

A debugging methodology based on multiple abstraction levels starts with embedded software at the highest level and moves down in abstraction to trace the behavior of individual hardware elements. Starting from a database of several billion clock cycles, a software debugger can localize a problem to within a few million clock cycles. At this level, the software team identifies the source of each problem in the software code. Alternatively, they can call in the hardware team to use a software-aware hardware debugging approach and zoom into a lower level of abstraction.

Such multi-level debugging would not be possible with RTL simulators, because they are too slow to effectively execute embedded software. Likewise, the same methodology would not be possible with FPGA-based prototypes since they lack visibility and access into the design to trace hardware bugs.

Modern Hardware Emulation Platforms

Hardware emulation systems in today’s design environment run faster, are easier to use, and cost a fraction of earlier incarnations to own (see the figure). These highly scalable, cost-effective tools can handle up to a couple of billion ASIC gates, boast rapid setup and fast compile times, and offer a powerful debug environment with support for multiple concurrent users. The best of the breed are packaged in an environmentally friendly footprint.

This comparison of RTL simulation, FPGA prototyping, and hardware emulation is based on performance, setup/compile time, design capacity, and design debug.

What’s more, hardware emulators can be deployed in in-circuit-emulation (ICE) mode, driven by a physical target system, or as acceleration systems driven by a virtual, software-based testbench written in any combination of Verilog, VHDL, SystemVerilog, C++, or SystemC, at either the signal level or the transaction level. Alternatives include synthesizable testbenches and test vectors.
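
A rough C++ sketch conveys the transaction-level idea. The EmulatorChannel class below is hypothetical, a host-side stub so the sketch runs standalone; production flows use standard co-emulation interfaces such as Accellera’s SCE-MI or vendor transactor libraries:

// Sketch of a transaction-level testbench driving an emulated design.
// EmulatorChannel is a hypothetical proxy, not a real vendor API.
#include <cstdint>
#include <iostream>

class EmulatorChannel {
public:
    // In a real flow, send() would marshal the transaction across the
    // host-emulator link to a synthesizable transactor, which expands it
    // into many cycles of signal-level activity inside the emulator.
    void send(uint32_t addr, uint32_t data) { (void)addr; shadow_ = data; }

    // receive() would block until the transactor returns a response;
    // here it simply echoes the last write so the example is runnable.
    uint32_t receive() { return shadow_; }

private:
    uint32_t shadow_ = 0;  // host-side stand-in for the DUT response
};

int main() {
    EmulatorChannel chan;
    chan.send(0x80000000u, 0xDEADBEEFu);  // one message, many DUT clocks
    const bool pass = (chan.receive() == 0xDEADBEEFu);
    std::cout << (pass ? "PASS" : "FAIL") << '\n';
    return pass ? 0 : 1;
}

The design rationale behind the transaction level is that each host-side message expands into many clocked cycles inside the emulator, so the relatively slow host link does not throttle emulation speed.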

Typical performance is approximately 2 MHz on a 10-million-gate design, falling to about 1 MHz on 100-million-gate designs. As design size increases, an emulator’s performance degrades far less than an event-driven simulator’s.
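
Revisiting the earlier example makes the contrast concrete. At 1 MHz, the same one second of real time on the 200-MHz, 100-million-gate design takes

\[
\frac{2 \times 10^{8}\ \text{cycles}}{10^{6}\ \text{cycles/s}} = 200\ \text{s},
\]

versus more than three weeks for an RTL simulator, a factor of roughly 10,000 even against that simulator’s optimistic 100-cycles-per-second assumption, and the gap widens toward the five to six orders of magnitude cited above as simulator speeds fall well below that figure on designs of this size.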

Greater time-to-market pressures, along with escalating hardware/software integration and quality concerns imposed on engineering teams, make the verification process a strategically important step in chip design. A new generation of cost-effective hardware emulators provides a great choice for state-of-the-art designs. That’s why hardware emulation has become the weapon of mass verification.

Dr. Lauro Rizzatti is a verification consultant. He was formerly general manager of EVE-USA and its vice president of marketing before Synopsys’ acquisition of EVE. Previously, he held positions in management, product marketing, technical marketing, and engineering. He can be reached at [email protected].
