When the speed of onboard buses hovered below 5 Gb/s, measuring signal integrity might have sufficed as the basis for releasing a new circuit board design to high-volume manufacturing. Not anymore.
As the throughput of buses like PCI Express Gen 3, SATA 6Gb/s, Intel’s QuickPath Interconnect, and memory buses such as DDR3/4 continues to climb, designers and test engineers are searching for more comprehensive and statistically predictive techniques for validating new product designs. One such method is system marginality validation (SMV).
The problem stems from the physical and cost/benefit limitations of oscilloscopes. The purchase price of an oscilloscope is directly related to the speed of the buses whose waveforms it must capture: the faster the bus, the more costly the scope. High-speed scopes can easily carry a price tag in the $300,000 to $500,000 range. In that rarefied atmosphere, it is hard to justify the limited value added by such an expensive instrument.
In contrast, SMV is a statistical methodology that takes advantage of instruments already embedded in silicon all over a circuit board to produce a more comprehensive and accurate projection of system performance over time. Unlike the signal integrity snapshot provided by a scope, SMV takes into account the inevitable variations in component performance over a long manufacturing run. Components that are marginal, or even completely out of spec, may not cause problems in system performance until much later, when the product is in the hands of users. Because SMV is based on embedded instruments, it sees what the silicon sees and, as a result, alerts engineers to potential problems down the road. Of course, it doesn’t hurt that software-based SMV methods are far more cost-effective than today’s expensive high-speed scopes.
The limitations of signal integrity measurements
Basically, an oscilloscope measures signal integrity by examining the signaling on a bus and plotting the signal’s voltage over time. The resulting waveform display reveals the frequency and shape of the signal. The intent is to determine the robustness of the transmitter and the link itself by capturing and examining the waveforms traversing the bus.
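As a rough illustration of what such a measurement captures, the short Python sketch below folds a simulated voltage-versus-time capture into an eye diagram and reports the eye height at the sampling instant. The signal, noise level, and samples-per-bit figure are made-up values chosen purely for illustration, not data from any real bus.

```python
import numpy as np

# Hypothetical capture: voltage samples of a serial signal, 16 samples per unit interval (UI).
SAMPLES_PER_UI = 16
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2000)                    # random bit stream
ideal = np.repeat(bits * 2.0 - 1.0, SAMPLES_PER_UI)     # +/-1 V ideal waveform
captured = ideal + rng.normal(0.0, 0.08, ideal.size)    # add measurement noise

# Fold the capture into one UI: each column is a time offset within the bit period.
eye = captured.reshape(-1, SAMPLES_PER_UI)

# Eye height at the center of the UI: gap between the lowest "1" trace and the
# highest "0" trace. A larger gap means more voltage margin at the sampling point.
center = eye[:, SAMPLES_PER_UI // 2]
eye_height = center[center > 0].min() - center[center < 0].max()
print(f"eye height at center of UI: {eye_height:.3f} V")
```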
Unfortunately, every instrument, no matter how sophisticated, has physical and economic limits. Scopes support a finite number of channels, so engineers can effectively measure signal integrity on only a few traces, especially when the pressure is intense to move the product into manufacturing as soon as possible. The longest and shortest traces on a prototype might be selected, for example, but that leaves the engineer worrying about all the routes that were not measured. An inconspicuous route might be skipped, yet because it runs close to a voltage regulator it could pick up significant noise. Once the system ships, this seemingly inconsequential trace might lead to intermittent system crashes.
In addition, scopes require external probe heads and amplifiers, which drive up costs further, and it can take weeks to set up access to the right traces on the board. Moreover, the external access an oscilloscope requires will likely be possible only on early prototypes of the hardware, since subsequent spins of the design will eliminate the test access pads to ensure better signal integrity on the manufactured boards.
So what is SMV?
Over the last decade or more, chip designers have been embedding test and measurement instruments into their devices because this was the most effective way to characterize, validate, and test the chips. Intel, for example, introduced its Interconnect Built-In Self Test (IBIST) embedded instrumentation technology more than 10 years ago. Since then, Intel IBIST has morphed into SiliconView Technology to facilitate user test processes, and it remains an important enabling technology for the company. Other chip, ASIC, and FPGA suppliers have followed suit, and now board-level tools are applying these same instruments to validate chip-to-chip interconnects and other aspects of circuit boards, such as the structural and electrical integrity of the system.
SMV uses software-based tools to tap into these embedded instruments and calculate statistically valid operating margins on prototypes of a new board design. Designers can then associate a degree of confidence with the design as it moves into high-volume manufacturing.
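The mechanics differ from one vendor’s embedded instruments to another’s, but the basic flow is a margining sweep: step the receiver’s sampling point across the unit interval and across voltage offsets, and record where the link still runs error-free. The Python sketch below outlines that flow; `set_rx_offset` and `run_bert` are hypothetical stand-ins for whatever register-level instrument access a particular device actually provides.

```python
def margin_sweep(set_rx_offset, run_bert, phase_steps, volt_steps_mv):
    """Derive time and voltage margins from a (hypothetical) embedded receiver instrument.

    set_rx_offset(phase, volt_mv) -- moves the receiver's sampling point through the
                                     device's margining registers (vendor-specific)
    run_bert()                    -- returns True if the on-die bit-error-rate test passes
    """
    # Horizontal sweep: move the sampling phase across the unit interval at nominal voltage.
    passing_phases = []
    for phase in phase_steps:
        set_rx_offset(phase=phase, volt_mv=0)
        if run_bert():
            passing_phases.append(phase)

    # Vertical sweep: move the sampling threshold up and down at the nominal phase.
    passing_volts = []
    for volt in volt_steps_mv:
        set_rx_offset(phase=0, volt_mv=volt)
        if run_bert():
            passing_volts.append(volt)

    # The error-free spans are the link's time and voltage margins on this board.
    time_margin = max(passing_phases) - min(passing_phases) if passing_phases else 0
    volt_margin = max(passing_volts) - min(passing_volts) if passing_volts else 0
    return time_margin, volt_margin
```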
Because of variations in component performance and manufacturing processes, the operating margins on any one circuit board can vary widely from those on another board manufactured at a different time or in a different place. The point of SMV is to determine the probability that a manufactured board will operate as expected over its useful life, given those variations. If the operating margins on a design are too tight, a slight swing in component performance might lead to intermittent crashes in the hands of users. With ample margins, a poorly performing component is far less likely to threaten the operation of the system.
Of course, an engineer’s degree of confidence in SMV projections will depend on the validity of the statistics generated from the SMV data that has been gathered. Among other factors, statistical validity depends on the size of the sample, which in this case means the number of prototype boards from which data was gathered. Many organizations that have incorporated SMV into their standard design validation processes have adopted the “5 x 5” rule. That is, data is gathered five times from five different boards with five different sets of silicon in each lot of prototypes produced. Based on this sample size, SMV calculations will have a high degree of confidence.
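As a minimal sketch of how such a sample might feed a release decision, the snippet below takes a set of worst-case time margins gathered across a prototype lot and compares a pessimistic "mean minus three sigma" projection against a required margin. Both the numbers and the three-sigma criterion are illustrative assumptions, not a prescription from any particular SMV tool.

```python
import statistics

# Hypothetical worst-case time margins (fraction of a unit interval) measured with
# embedded instruments across the boards in a prototype lot. Illustrative values only.
margins_ui = [0.42, 0.45, 0.39, 0.44, 0.41,
              0.43, 0.40, 0.46, 0.38, 0.44,
              0.45, 0.42, 0.41, 0.43, 0.40]

mean = statistics.mean(margins_ui)
sigma = statistics.stdev(margins_ui)

# Project a pessimistic margin for the whole population: if even mean - 3*sigma clears
# the margin the design requires, the lot supports a high-confidence release decision.
required_ui = 0.30
projected_worst = mean - 3 * sigma
print(f"mean = {mean:.3f} UI, sigma = {sigma:.3f} UI, mean - 3*sigma = {projected_worst:.3f} UI")
print("release with confidence?", projected_worst >= required_ui)
```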
An example of board-level SMV
Besides cost-effectiveness, ease of use, and labor savings, SMV also provides certain functionality that oscilloscopes simply cannot offer. For example, SMV tools can be very helpful in determining an effective programmable equalization setting for a particular peripheral device, or for a range of devices that might be connected to a particular bus.
Preliminary experimentation on a SATA 6Gb/s bus has shown that in some cases the discrete time linear equalization (DTLE) setting in the BIOS must be adjusted to accommodate the different margins of vendors’ hard drives, solid-state drives (SSDs), different cable lengths, connection types, and other factors. For example, on a notebook computer design, the DTLE setting in the BIOS might be set at a level that is optimal for a hard drive, since that is the type of device most often configured in notebook computers. This setting would generate an optimal “eye diagram” graph (Figure 1) of the bus’s operating margins.
Figure 1. Margins on a SATA 6Gb/s bus with the DTLE setting for a hard drive and a hard drive connected to the bus. Green indicates the bus is operating within its margins.
Although that DTLE setting works quite well with a hard drive, it may not be the optimal setting for other types of devices that may be connected to the bus, such as an SSD. When designers expect a variety of devices to be connected to a given bus, it is typically more effective to determine a middle-ground setting that accommodates the range of possible devices. Figure 2 shows what can happen to the margin graph when the DTLE setting for a device connected to the SATA 6Gb/s bus is less than optimal. The margins are not so favorable in this case.
Figure 2. Operating margins on a SATA 6Gb/s bus when the DTLE setting in BIOS is less than optimal. Yellow and red indicate operations that have exceeded the margin.
This type of margin graph indicates that the connected device would likely run considerably slower than it should, and the link to the drive might be subject to intermittent failures over the product’s life. For the best results with both types of devices shown in Figures 1 and 2, the DTLE setting should be adjusted to a point in the middle so that various types of storage devices are accommodated.
Tuning the DTLE with a scope is an entirely manual process that could take weeks of engineering time. Software-based SMV tools, by contrast, can automatically sweep through a range of operating conditions while capturing voltage and time margins, and ultimately find an optimal DTLE setting for the SATA 6Gb/s bus that accommodates a range of storage devices from various vendors operating at different speeds.
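A rough sketch of that selection logic is shown below. It assumes a hypothetical `measure_margin(device, dtle)` callable that returns the eye margin reported by the embedded SATA receiver instrument for one device at one equalization setting; the margin values in the example are made up for illustration.

```python
def choose_dtle(devices, dtle_settings, measure_margin):
    """Pick the DTLE setting whose *worst* margin across all devices is largest.

    devices        -- storage devices to qualify (hard drives, SSDs, cable variants, ...)
    dtle_settings  -- candidate equalization codes exposed by the BIOS
    measure_margin -- callable(device, dtle) -> margin, supplied by the SMV tool
    """
    best_setting, best_worst = None, float("-inf")
    for dtle in dtle_settings:
        worst = min(measure_margin(dev, dtle) for dev in devices)
        if worst > best_worst:
            best_setting, best_worst = dtle, worst
    return best_setting, best_worst

# Example with made-up margin numbers (arbitrary units):
table = {("hdd", 0): 9, ("hdd", 1): 7, ("hdd", 2): 4,
         ("ssd", 0): 3, ("ssd", 1): 6, ("ssd", 2): 8}
setting, margin = choose_dtle(["hdd", "ssd"], [0, 1, 2], lambda d, s: table[(d, s)])
print(setting, margin)  # setting 1 is the middle ground that serves both device types
```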
Conclusions
With the inevitable progress of technology, SMV has emerged from the shadow of signal integrity measurements to become a much-needed validation tool for modern circuit board designs. SMV does not detract from the information provided by oscilloscopes; it supplements and enhances that data. Combining signal integrity measurements and SMV calculations in the knowledge base for a design gives engineers and product managers a much higher degree of confidence in the performance potential and reliability of the design over its life cycle as a product.
About the author
Tim Caffee is vice president of design validation and test and a founder of ASSET InterTech. Prior to ASSET, he spent several years at Texas Instruments. Caffee obtained B.S. degrees in mathematics and computer science at Old Dominion University.