Verify Your System-Level Designs With A Virtual Prototype

Oct. 18, 1999
In Today's Competitive Marketplace, Here's What You Really Need To Know About Verification To Keep You On Top.

Unlike density-driven IC design, system design is primarily propelled by diverse functionality demands. Many of today's system designs include analog, digital, electromechanical, and software components. Historically, system-level designers verified their designs by building a physical prototype. But many of them can no longer afford this luxury. In general, physical prototypes are a time-consuming, expensive, and difficult way to complete a project. Thus, virtual prototyping is becoming a necessity (Fig. 1).

Depending upon whom you ask, the term "virtual prototyping" has a number of different meanings. In the context of system verification, it means building an executable model of a design for simulation (Fig. 2). That model is used to verify and test various elements of the entire system design without actually committing any part of the system to hardware.

A pager is a good example of a system that benefits from virtual prototyping. It has digital and analog circuitry, an RF circuit, and a processor running software, as well as mechanical components such as a vibrator and keypad. Building an accurate physical prototype for a pager is a complex problem. Constructing that prototype to run at speed is even more difficult—if not impossible.

Physical prototypes have a number of drawbacks. For starters, debugging them is more cumbersome and difficult than it might be for a virtual prototype. Designers must devise some sort of lab setup, including various instruments to both stimulate the design and capture and analyze results. If problems are discovered, they must be isolated using different probing techniques. Problems are then fixed by modifying the physical prototype. At best, this process is very time consuming and frustrating.

In a virtual environment, everything is much less difficult. It's relatively simple to look at a particular register value. Designers can set up the simulation environment to monitor a value with a command such as "monitor register a." A window will then display the contents of that particular register.
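To illustrate the idea, here's a minimal Python sketch of how a simulation environment might implement such a monitor command. The class and callback names are hypothetical, not taken from any particular tool.

```python
# Minimal sketch of a simulator-side register monitor (hypothetical
# API, not any particular tool's). The kernel notifies observers
# whenever a monitored register changes value.

class Register:
    def __init__(self, name, width=8):
        self.name = name
        self.width = width
        self._value = 0
        self._observers = []     # callbacks fired on every write

    def monitor(self, callback):
        """Attach a watcher, e.g. the display window described above."""
        self._observers.append(callback)

    def write(self, value, sim_time):
        self._value = value & ((1 << self.width) - 1)
        for notify in self._observers:
            notify(self.name, self._value, sim_time)


def print_watcher(name, value, sim_time):
    print(f"t={sim_time:6d} ns  {name} = 0x{value:02x}")


# "monitor register a," then simulate a few writes
reg_a = Register("a")
reg_a.monitor(print_watcher)
reg_a.write(0x3f, sim_time=100)
reg_a.write(0x40, sim_time=250)
```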

Setting up equipment to capture register values in the physical domain can be tough. Signals may be completely inaccessible if they're internal to a component on the prototype. Because designers can't access these signals directly, it essentially becomes a guessing game. The designer may suspect that a certain part is faulty, replace that part, and determine if the system works properly—and repeat the process several times, if necessary.

When problems are found, it's hard to make modifications to the physical prototype. Designers may have to physically remove one or more components, put new ones in, and then properly rewire those parts. Furthermore, if the prototype is a printed-circuit board, it may mean having to re-route the board.

It's a minor task to replace logic or circuitry in the virtual environment: just swap a model or modify a schematic or hardware-description-language (HDL) file. These physical-prototype drawbacks, combined with today's ever-increasing time-to-market demands, make building an accurate physical prototype tremendously challenging.

In contrast, the verification process can be completed faster and with greater detail using virtual prototypes. Designers can move from concept to verified implementation in less time. Plus, making modifications to a virtual prototype is much simpler because the designer only needs to modify a model—not an actual piece of hardware. Even the iteration loop of finding and fixing problems is faster.

This process also tends to be more thorough—a byproduct of the faster verification. Because designers move through the design process more quickly, they have time to do more verification. They can even use special-purpose tools that indicate how thoroughly they've performed verification. For example, code-coverage tools can determine how much of the design is actually exercised. In the physical domain, there's no comparable way to measure completeness.
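To make the concept concrete, here's a toy statement-coverage counter sketched in Python. Production HDL coverage tools instrument the simulated design itself, but the principle is the same: record which statements a test actually exercises.

```python
import sys

# Toy statement-coverage counter, for illustration only.
executed = set()

def tracer(frame, event, arg):
    if event == "line":
        executed.add((frame.f_code.co_filename, frame.f_lineno))
    return tracer

def design_under_test(x):
    if x > 0:
        return x * 2     # exercised only when x > 0
    return -x            # exercised only when x <= 0

sys.settrace(tracer)
design_under_test(5)     # one branch exercised, the other is not
sys.settrace(None)

print(f"{len(executed)} distinct lines executed")
```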

The virtual-prototype debugging environment also is more robust. Designers can debug graphically, probe internal signals more easily, see registers, and watch events happen dynamically as they run an application. And they can easily do tradeoff analysis. They can explore the effects of replacing one portion of a design with some other circuitry or see the results of using an alternative processor in a relatively short period of time. Compare that to the complexities of changing a processor in an actual physical system.

Working virtually, designers can verify systems before all of the physical system components are available, as long as there's a model of those components (Fig. 3). The design team doesn't have to be held up waiting for an ASIC to be fabricated. Software developers can start developing software for a system right away.

Normally, the process is a serial one: Design the hardware, debug it, and then hand it off to a software team for application development. In theory, virtual prototyping can cut product development time significantly by giving software programmers a prototype up front in the design cycle.

It also can help refine ASICs or other custom parts before they're built. As previously stated, an ASIC is often constructed and debugged before any software is written for the system. If problems are found once the system is built, it may be too expensive to go back and modify the physical hardware even if that approach is the best solution.

The design team may have to compromise and implement a software-based fix or some other type of work-around. Designers working in a virtual environment don't need to make those types of compromises. They can implement the hardware change without drastically impacting the bottom line.

Finally, consider the cost benefits that a virtual prototype has over its physical counterpart. There's an initial cost associated with virtual prototyping, which comes from investing in the tools and the training needed to implement the methodology. That cost can be amortized over several design projects. In contrast, the cost of building a physical prototype recurs with each new project, making that approach more costly over time.

The benefits of going virtual are somewhat dependent on the type of design being verified. There will be times when designers, for various reasons, may need to use a physical prototype. In general, though, designs that are hardest to verify with a physical prototype will reap the most benefit when verified virtually. These systems contain diverse functionality or are in a specialized form factor.

Of course, building a virtual prototype for these same designs is also trickier than doing so for simpler designs. The point is that there may not be as much benefit to building a virtual prototype for a simple PC card as there is for, say, a cellular telephone design.

If virtual prototyping holds such great promise, why aren't most design teams adopting it? The simple answer is that neither the tools nor the mindset is fully in place yet. Various EDA tools can be used to verify portions of a system design. But few, if any, can verify the complete system. Verifying combined analog and digital technology is a tough problem, but one that can be addressed.

The ability to include and verify electromechanical content, however, is almost nonexistent. Typically, systems must be partitioned and verified piecemeal, and designers are forced to build physical prototypes for system-level verification.

Ideally, system designers need an enterprise-wide and integrated EDA environment that will allow them to create virtual prototypes. It's important that the verification environment be integrated into the overall design environment. Designers need to be able to simulate executable models for system-level verification. But this verification environment must be integrated with place-and-route, design-capture, signal-integrity, and all other tools in the overall design flow.

In addition to the needed change in the design tools, designers must alter their mindset to adopt this new methodology. Think back to how designers reacted to the advent of gate arrays several years ago. Many who embraced the new technology were accustomed to verification with hardware prototypes. But there really was no way to build a prototype of a gate array. So they were forced into simulating their design prior to committing to hardware.

Reaching A Breaking Point

System-level design has reached a similar inflection point, in the sense that building a prototype for a sophisticated system can be really hard. As stated, a virtual prototype that accurately represents the target system can save a lot of time and money, just as simulation did for designers performing early ASIC design. In fact, ASIC designers now consider simulation a required step in their design process prior to fabrication.

So what will it take to develop a system that would allow designers to create and verify virtual prototypes? For one thing, the EDA environment must let designers create executable models of system-level designs with diverse functions, including digital, analog, RF, and mechanical content. Today, good tools and algorithms exist that address the verification of these individual design functions. But what's missing is an environment that will integrate the necessary algorithms and allow them to work collaboratively to solve the total system-level verification problem.

For a single system design, this could involve integrating a Spice engine for the analog portion, an HDL engine for digital logic, some sort of software debug environment to address software content, and a 3-dimensional solver or finite-element analysis for the mechanical portion. Portions of such an environment are in use today. And designers do utilize mixed analog-digital modeling and hardware-software verification to some extent. But no environment ties all of those things together effectively.
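One way to picture such an environment is as a set of domain-specific solvers sitting behind a common engine interface. The sketch below is hypothetical plumbing, not any vendor's actual API.

```python
# Hypothetical common interface for domain-specific engines in a
# virtual-prototyping backbone (illustrative only).

class SimulationEngine:
    """Interface each domain solver would implement."""
    domain = "abstract"

    def load(self, model_file): ...
    def advance_to(self, t_ns): ...      # run the engine up to t_ns
    def boundary_values(self): ...       # values shared across domains

class SpiceEngine(SimulationEngine):
    domain = "analog"

class HdlEngine(SimulationEngine):
    domain = "digital"

class DebuggerEngine(SimulationEngine):
    domain = "software"

class FeaEngine(SimulationEngine):
    domain = "mechanical"

class Backbone:
    def __init__(self):
        self.engines = {}

    def register(self, engine):
        self.engines[engine.domain] = engine

backbone = Backbone()
for eng in (SpiceEngine(), HdlEngine(), DebuggerEngine(), FeaEngine()):
    backbone.register(eng)
print(sorted(backbone.engines))  # ['analog', 'digital', 'mechanical', 'software']
```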

Interfacing an event-based HDL simulation algorithm with a Spice algorithm isn't easy. The digital algorithm analyzes discrete events and values, while the analog one works with continuous waveforms. RF circuitry would require yet a third algorithm to analyze circuits in the frequency, rather than time, domain. The biggest challenge is making the algorithms in various domains interact efficiently and accurately.

To work toward such an environment, EDA developers can leverage the technology that's available in specific domains. But they must develop algorithms and technology that would allow the engines to work together. Application-specific solvers need to be expanded to address a broader range of problems spanning a system's diversities.

Technology also is needed to synchronize the operation and data of various simulation engines. Certain tradeoffs are associated with these solutions. These interfaces, for instance, often cost simulation speed and/or accuracy.
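As a rough illustration of one common scheme, here's a Python sketch of lock-step synchronization between a digital and an analog engine. Both engines are invented stand-ins, and the exchange interval makes the speed-versus-accuracy tradeoff explicit: a shorter slice tracks the interaction more closely but forces more handshakes.

```python
import math

# Hypothetical lock-step co-simulation loop (a sketch, not a real EDA
# API). Two engine stand-ins advance in fixed time slices, exchanging
# boundary values at each slice.

class DigitalEngine:
    """Stand-in for an event-driven HDL simulator."""
    def advance(self, t_ns, analog_volts):
        # Comparator behavior: assert a control bit above 1.5 V.
        return 1 if analog_volts > 1.5 else 0

class AnalogEngine:
    """Stand-in for a Spice-like continuous solver."""
    def advance(self, t_ns, enable_bit):
        amplitude = 3.3 if enable_bit else 0.3
        return amplitude * abs(math.sin(2 * math.pi * t_ns / 1000.0))

STEP_NS = 10    # exchange interval: smaller is more accurate but slower
dig, ana = DigitalEngine(), AnalogEngine()
volts = 2.0     # assumed initial condition on the shared node
for t_ns in range(0, 2000, STEP_NS):
    bit = dig.advance(t_ns, volts)   # digital sees the last analog value
    volts = ana.advance(t_ns, bit)   # analog responds to the new bit
    # A production backbone would subdivide or roll back a slice when a
    # threshold crossing falls inside it, trading speed for accuracy.
```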

Bridging System Domains

The simulator interface provides a bridge between dissimilar domains. Interfacing techniques vary, depending on which domains are being linked. For mixed digital-analog simulation, for example, there are ways of mapping the discrete values of an HDL simulator (ones and zeroes) into the continuous representation of the analog domain.

The process is analogous to that of a digital-to-analog converter. On the other side of the interface, an analog-to-digital converter of sorts maps analog values back into discrete values by comparing them against threshold voltages. Such mapping raises many issues in terms of accuracy and implementation details, like how to represent the digital concept of an unknown signal value in the analog domain.
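A short Python sketch of these boundary elements, using assumed 3.3-V CMOS-style thresholds and one possible convention (mid-rail) for the digital unknown value 'X':

```python
# Sketch of mixed-signal boundary elements (illustrative conventions,
# not any particular simulator's). VDD and the thresholds are assumed
# 3.3 V CMOS-style values.

VDD = 3.3
VIL, VIH = 0.8, 2.0      # assumed input-low / input-high thresholds

def digital_to_analog(state):
    """Map a discrete HDL value ('0', '1', 'X', 'Z') to a voltage."""
    if state == "0":
        return 0.0
    if state == "1":
        return VDD
    if state == "X":
        return VDD / 2   # one convention: drive mid-rail for unknown
    return None          # 'Z': high impedance, no driven voltage

def analog_to_digital(volts):
    """Threshold an analog node voltage back into a discrete value."""
    if volts <= VIL:
        return "0"
    if volts >= VIH:
        return "1"
    return "X"           # between thresholds: genuinely ambiguous

for v in (0.2, 1.4, 3.1):
    print(f"{v:.1f} V -> '{analog_to_digital(v)}'")
```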

Difficult mapping issues exist between other domains as well, such as analog-to-RF or digital-to-mechanical. Some of these can be significantly more complex than the digital-to-analog example just discussed.

Consider that typically, a mechanical simulation is a static solution. It's usually a solver-based approach that includes a representation of a mechanical component and a system of equations that solve for a particular parameter. For a hard-disk drive with a voltage-controlled armature, the equations may determine how fast the platter is rotating when driven at a certain voltage level.

Or, if a digital input directs the drive to a certain location of the disk, the equations may solve for how much time the drive needs to get to that point. Mechanical analysis generally doesn't include a concept of time-based simulation. Yet for virtual prototyping, the mechanical simulation must interface with time-based analog or digital algorithms.
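Here's one sketch, in Python, of what that interface might look like for the disk-drive example: a static mechanical solve computes a seek time, and a wrapper turns the result into a "seek complete" event that a discrete-event simulator can schedule. The motion model and constants are invented for illustration.

```python
import heapq
import math

# Hypothetical coupling of a static mechanical solve to a time-based
# discrete-event simulator (constants and motion model are invented).

TRACK_PITCH_MM = 0.002
MAX_ACCEL = 50.0         # head acceleration, m/s^2 (assumed)

def solve_seek_time(current_track, target_track):
    """Static solve, bang-bang motion: accelerate half the distance,
    then decelerate. t = 2 * sqrt(d / a)."""
    dist_m = abs(target_track - current_track) * TRACK_PITCH_MM / 1000.0
    return 2.0 * math.sqrt(dist_m / MAX_ACCEL)

event_queue = []         # (time_s, description) discrete events

def issue_seek(now_s, cur, tgt):
    """Digital side commands a seek; the solver supplies the duration."""
    t_done = now_s + solve_seek_time(cur, tgt)
    heapq.heappush(event_queue, (t_done, f"seek {cur}->{tgt} complete"))

issue_seek(0.0, cur=100, tgt=4100)
time_s, what = heapq.heappop(event_queue)
print(f"t = {time_s * 1000:.2f} ms: {what}")
```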

EDA vendors must develop the necessary mapping techniques. Today's limited solutions that interface two domains address merely a subset of the overall problem. A full system-level verification will probably be more involved than mixing two different types of simulation. Interfaces are needed between all of the possible algorithms, because a system design may contain any combination of domains.

Possible Solutions

In spite of all the difficulties and missing pieces associated with a virtual-prototyping environment, designers can take measures right now to begin implementing such a methodology. Using existing tools, they can certainly verify subsections of the system independently. They can then manually look at the results, assemble all of the subsections together, and verify that they will interoperate correctly by employing the results from one section as a stimulus for another. This would, of course, require the time to convert data from one domain's format to another—for example, manually converting analog waveforms to discrete digital values. These manual approaches, however, preclude the possibility of running application software on the virtual system.

Designers also have the opportunity to modify their existing environments to better accommodate virtual prototyping. Utilizing files and scripts to bridge the analysis of various domains, they can develop their own interfaces. Some design teams write their own interfaces that allow them to capture data from one simulator, re-format it, and provide it as an input automatically to another simulator. This approach creates a somewhat loosely coupled environment. It also requires a lot of work, and is only effective in some cases.
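As an example of this kind of homegrown glue, the following Python script reads analog samples from one simulator's output (in an invented format), thresholds them with simple hysteresis, and writes transition-only digital stimulus for another simulator:

```python
# Homegrown glue script (file formats invented for illustration): read
# "time_ns voltage" samples from one simulator's output, threshold
# them, and write "time_ns bit" transitions as stimulus for another.

VIH, VIL = 2.0, 0.8      # assumed thresholds for 3.3 V logic

def to_bit(volts, prev_bit):
    if volts >= VIH:
        return 1
    if volts <= VIL:
        return 0
    return prev_bit      # mid-band: hold the last value (hysteresis)

def convert(analog_path, stim_path):
    bit = 0
    with open(analog_path) as src, open(stim_path, "w") as dst:
        for line in src:
            t_ns, volts = line.split()
            new_bit = to_bit(float(volts), bit)
            if new_bit != bit:           # record transitions only
                dst.write(f"{t_ns} {new_bit}\n")
                bit = new_bit

# Tiny demo input standing in for a real simulator's waveform dump.
with open("analog_out.txt", "w") as f:
    f.write("0 0.1\n50 1.2\n100 2.8\n150 3.1\n200 0.4\n")
convert("analog_out.txt", "digital_stim.txt")
print(open("digital_stim.txt").read())   # -> "100 1" then "200 0"
```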

It's difficult, for example, for a team to write the interface if they're verifying a system with a tight analog-digital feedback loop. That mapping gets cumbersome when done manually because of the close interaction between the two sections. These homemade scripts can be generic, but tend to be customized toward the particular system being verified.

It's really the EDA vendors that need to develop the virtual-prototyping environment for designers. They don't necessarily need to scrap existing interface software, but rather build upon some of the existing solutions.

Difficulties arise because the expertise and algorithms involved in solving domain-specific problems tend to come from different sources. For example, an EDA vendor may have expertise in RF simulation, but not know much about mechanical analysis. Creating an integrated virtual-prototyping environment potentially requires bringing together multiple organizations.

The ideal environment doesn't consist merely of interfaces between simulators. It also needs some overall control mechanism. The verification system must be more than just boxes interacting. Some high-level control of the environment has to make sure all simulation and data is properly synchronized. In addition, the virtual-prototyping environment must support the team-based efforts inherent in system design.

Future system designs promise to become even more complex and diverse. Design teams, faced with Herculean verification tasks, will no longer have the time or resources to build physical prototypes. EDA vendors must develop the tools to create a viable virtual-prototyping environment. To do this, they can build upon existing and proven technology. The environment must offer high performance and be easy to use and well integrated with other tools in the design flow. Finally, for adoption of this methodology to be a success, designers must be educated about using the virtual-prototyping design process.
