Many engineers are getting on board with third-generation serial-data technology in their next system designs, and for good reason. These intra-system communication schemes move a lot of data in a hurry, and that’s what consumer devices require.

For example, the Serial Advanced Technology Attachment (SATA) 3.0 standard jumps to 6 Gbits/s, a transfer rate made necessary by the proliferation of flash-memory drives and the lightning-fast read speeds these drives deliver.

But the adoption of any new standards-based protocol for in-system busing is going to mean rethinking your compliance-test strategy. In this article, we’ll look at what some of the test industry’s experts have to say about what you’ll need to keep up with functional and compliance test requirements. We’ll also run down some best practices for serial-link testing (see “Third-Generation Serial Test Dos and Don’ts”).

What’s Driving Test Requirements?
Serial standards organizations started talking about third-generation standards about a year ago. In design labs around the electronics industry, third-generation standards are growing in importance and prevalence.

“What I have seen emerge as a big difference from 2G is how critical it is to be able to see the receivers’ eye patterns,” says Chris Loberg, senior technical marketing manager at Tektronix. “This brings up a number of possibilities for a system designer or system architect. What can we do in terms of equalization for that signal so that the receiver is tricked into seeing ones and zeros in a clear pattern? Evaluating equalization on the receive end is the new art for design engineers in this space.”

A number of factors centering on the nature of the signals themselves make equalization at the receive end of a serial link necessary. For one thing, the combined stress of signaling complexity and clock speed shrinks the eye pattern (Fig. 1) to the point that it’s almost completely closed.
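The eye closure described here can be illustrated numerically. The Python sketch below (all parameters are illustrative, not drawn from any standard’s measurement method) folds an ISI-smeared, noisy NRZ waveform into unit intervals, the way a scope overlays acquisitions to draw an eye pattern, and reports the vertical opening:

```python
import numpy as np

def eye_segments(waveform, samples_per_ui):
    """Fold a sampled waveform into one-UI segments, as a scope does
    when it overlays acquisitions to draw an eye pattern."""
    n_ui = len(waveform) // samples_per_ui
    return waveform[:n_ui * samples_per_ui].reshape(n_ui, samples_per_ui)

def vertical_eye_opening(segments):
    """Worst-case vertical opening at the center of the UI:
    lowest 'one' level minus highest 'zero' level."""
    center = segments[:, segments.shape[1] // 2]
    ones, zeros = center[center > 0], center[center <= 0]
    return ones.min() - zeros.max()

rng = np.random.default_rng(0)
spu = 32                                    # samples per unit interval
bits = rng.integers(0, 2, 2000)
ideal = np.repeat(2.0 * bits - 1.0, spu)    # +/-1 V NRZ levels

# Crude channel model: a moving-average lowpass smears each bit into
# its neighbors (inter-symbol interference), plus random noise.
isi = np.convolve(ideal, np.ones(40) / 40, mode="same")
noisy = isi + rng.normal(0.0, 0.05, isi.size)

print(f"clean eye opening:    {vertical_eye_opening(eye_segments(ideal, spu)):.2f} V")
print(f"stressed eye opening: {vertical_eye_opening(eye_segments(noisy, spu)):.2f} V")
```

The stressed opening comes out well below the clean one; push the lowpass window or noise further and the eye closes completely, which is exactly the condition that forces equalization at the receiver.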

“That signal complexity, due to long strings of zeros and ones, combines with high speeds and tighter margins to cause problems,” says Loberg. “Place this signal across an FR-4 printed-circuit board, which isn’t really suitable but is very cost-efficient. You end up with a very stressed communication system.”

Thus, in a circuit board or backplane with FR-4 that was built for a 400-MHz clock, the result is stress on the receiver that drives up inter-symbol interference (ISI), periodic jitter, and random noise.

New Emphasis On Receive End
For Michael Fleischer-Reumann, Agilent Technologies’ strategist for high-speed test, the receive end of serial links is a critical area of interest.


“With the third generation, most serial-data standards went up in transfer rate by an order of magnitude, as in the move from USB 2.0 to USB 3.0. Most standards don’t mandate a lot of measurement on the receiver side,” says Fleischer-Reumann.

“Instrumentation is costly and represents a huge investment for the designers. But with the third generation, the standards bodies have done a lot of work in specifying new requirements for receiver test,” he adds.

For designers, this means coming to terms with how to measure the performance of receivers. “There is no signal coming out, only an input signal to the receiver. It’s a new paradigm of measurements. For many designers, it’s a learning curve,” Fleischer-Reumann says.

Automation To The Rescue
For some time now, the large test vendors in the market have tried to give test engineers a hand through more versatile equipment and more automation of the test process.

“We try to deliver automated test solutions with separate software that will prompt the user for certain setup information and then allow them to use the scope to make the measurements as per the specification,” says Chris Busso, product marketing manager for LeCroy’s high-speed serial data solutions.

“What I have seen lately is the use of different types of testing. For USB 3.0, for example, eye diagrams and jitter analysis are done at the far (receiver) end of the channel. In reality, they’re still done at the transmitter but you move your reference point through emulation. Then the instrument extrapolates the effect of the channel. You can quantify the losses in a serial data channel and get the S parameters for those losses. We were among the first test vendors with the ability to load in the S parameter files and show you what it would look like at the far end,” Busso says.

One way in which LeCroy has helped automate receiver testing is to build the reference equalizer required by the USB 3.0 specification into its software. “In our Eye Doctor toolset and ED 2 software, you can enter an S-parameter file, equalize the signal after that, and take measurements at the receiver. The result is more realistic testing,” says Busso.
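The workflow Busso describes, embedding channel losses and then equalizing before measuring, can be approximated in a few lines. The sketch below is not LeCroy’s Eye Doctor: a first-order lowpass stands in for a measured S-parameter file, and a crude linear inversion stands in for the reference equalizer (real reference equalizers are CTLE/DFE models defined by the specifications):

```python
import numpy as np

def channel_lowpass(signal, fs, f3db):
    """Emulate channel loss as a first-order lowpass applied in the
    frequency domain (a stand-in for embedding measured S-parameters)."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    h = 1.0 / (1.0 + 1j * freqs / f3db)
    return np.fft.irfft(np.fft.rfft(signal) * h, n=signal.size)

def linear_equalizer(signal, fs, f3db, fmax):
    """Undo the modeled loss up to fmax -- a crude linear equalizer,
    not the CTLE/DFE reference models the specs actually define."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    h = 1.0 / (1.0 + 1j * freqs / f3db)
    inv = np.where(freqs <= fmax, 1.0 / h, 0.0)
    return np.fft.irfft(np.fft.rfft(signal) * inv, n=signal.size)

rng = np.random.default_rng(2)
spu, fs = 32, 32.0                       # 32 samples per UI, 1-UI bit period
ideal = np.repeat(2.0 * rng.integers(0, 2, 512) - 1.0, spu)

degraded = channel_lowpass(ideal, fs, f3db=0.3)          # heavy loss, closed eye
restored = linear_equalizer(degraded, fs, f3db=0.3, fmax=4.0)

rms = lambda x: np.sqrt(np.mean(x**2))
print(f"error before equalization: {rms(degraded - ideal):.3f}")
print(f"error after equalization:  {rms(restored - ideal):.3f}")
```

After equalization the waveform tracks the ideal signal far more closely, which is why measuring at an emulated far end only makes sense once the reference equalizer has been applied.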

In general, serial-link receivers now need equalization. This is especially true for PCI links. “In the past, to calibrate the stress signal for the receiver, you would measure as close as possible to the input to the device to see if the signal you sent has the right amplitude and vertical eye opening,” says Agilent’s Michael Fleischer-Reumann. “You had to take care to measure correctly but it was possible. Today the specification relates to a point inside the ASIC after equalization is done. When you measure at the input, you have to take the signal through simulation to see if the eye opening is right.”

Another difference in today’s receiver testing is that when the channel between transmitter and receiver is relatively long, the instrumentation’s accuracy is sometimes insufficient. “The intrinsic noise and jitter of scopes from all vendors is so dominant and significant that the eye opening you see on the scope is much smaller than it really is,” says Fleischer-Reumann. Thus, even after post-processing the signal to yield an eye diagram, the true eye is probably much larger than the one on the display.


Averaging is the accepted method of reducing noise on the signal. “You measure a clean signal for calibration,” says Fleischer-Reumann. “Then you generate a signal with the frequency response of the one you want to measure, taking note of the droop and slow rise times. You average this signal and use it as the input to a simulation that optimizes the signal. After all of that, you can now see in simulation what the proper eye-diagram opening should look like.”
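The averaging step Fleischer-Reumann describes works because random noise shrinks with the square root of the number of acquisitions. A minimal sketch, with additive amplitude noise standing in for the scope’s intrinsic noise and jitter (the waveform and noise levels are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 500)
clean = np.sin(2 * np.pi * 3 * t)    # stand-in for the calibration signal

# N repeated acquisitions of the same waveform, each with independent
# additive noise (the scope's intrinsic noise, simplified to
# amplitude noise only).
n_acq, sigma = 64, 0.2
acqs = clean + rng.normal(0.0, sigma, (n_acq, t.size))
averaged = acqs.mean(axis=0)

rms = lambda x: np.sqrt(np.mean(x**2))
print(f"single-shot noise: {rms(acqs[0] - clean):.3f}")
print(f"averaged noise:    {rms(averaged - clean):.3f}")  # roughly sigma / sqrt(64)
```

Averaging 64 acquisitions cuts the random-noise floor by about a factor of eight, which is what makes the averaged waveform usable as the input to the downstream simulation step.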

All of this points to the fact that it takes quite a few steps to reach the point where you can see the optimized signal in simulation. It also underscores the need for properly calibrated instrumentation as well as appropriate simulation tools, such as those provided by the PCI Express Special Interest Group (SIG). The bottom line is that serial-link receiver test is a complicated process that can cause headaches and misunderstandings.

Pointing Toward The Plugfest
“For the most part, testing for these standards is all about interoperability. All these devices have two halves, one side talking to the other. You have to be sure your device works with as many others as possible,” says LeCroy’s Chris Busso. “You don’t want to be the USB vendor who plugs your device into a motherboard and it doesn’t work.”

Designers assess the interoperability of their devices during plugfests. Most often organized by the standards bodies themselves, plugfests are an opportunity for developers to find out whether their serial-link designs meet a given standard’s specifications for interoperability. At the plugfest, they’ll find out whether their system, motherboard, host bus adapter, disk drive, cable, or whatever the end product may be passes muster.

“At the plugfest suite, we’ll have gear set up and people will measure their devices,” says Busso. “People will go to other vendors’ rooms to check on interoperability with a wide array of systems and devices.”

Many tests are conducted at plugfests, says Busso. “Eye diagrams and jitter are at the heart of the tests. There will be several other tests done as well. For example, they’ll test the rate of their spread-spectrum clocking and the amplitude. There are voltage-amplitude tests. We’ll make sure it’s operating at the right bit rate and that there’s the correct amount of de-emphasis on the signal.”
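Of the plugfest measurements Busso lists, de-emphasis is the easiest to express numerically: it’s the steady-state (repeated-bit) amplitude relative to the boosted transition-bit amplitude, in dB. A small sketch with hypothetical eye-diagram voltage levels (not taken from any specification):

```python
import math

def de_emphasis_db(v_transition, v_steady):
    """De-emphasis in dB: steady-state (repeated-bit) amplitude
    relative to the boosted transition-bit amplitude."""
    return 20.0 * math.log10(v_steady / v_transition)

# Hypothetical levels read off an eye diagram:
print(f"{de_emphasis_db(1.0, 0.6):.1f} dB")   # about -4.4 dB
```

A compliance test of the kind Busso describes compares this ratio against the window the standard allows.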

Plugfest testing is typically scored on a pass/fail basis. “You don’t go to a plugfest to see if you pass or fail. You go to certify that you pass,” says Busso. “In the lab, testing can take much longer with significant debugging involved.”

The PCI 3.0 Picture
It can be instructive to take a look at how testing requirements evolve for a given standard. The evolution of PCI Express toward a stable third-generation standard is expected to culminate by the end of 2010 with a base specification. “Typically there’s a 0.7 spec, followed by a 0.9 spec,” says Fleischer-Reumann. “A 0.71 version was published to members in May and was accepted. We’re now working on a 0.9 spec. There will be lots of changes from 0.71 to 0.9. We’re shooting for November for release of the 0.9 specification.”


In the case of PCI Express, while transmitter measurements have been in place for some time, receiver testing is still very much a work in progress. “There was no receiver test for earlier versions of PCI Express,” says Fleischer-Reumann. “When PCI 2.0 came along, receiver test was specified but not mandatory. For PCI 3.0, we are not done with the standard yet, but there is a receiver test that must be done. This is a very new thing for many customers.”

With a stress signal for the receiver defined, the era of PCI Express receiver testing has begun. PCI Express transmitter testing, by contrast, requires only an oscilloscope and appropriate analysis software.

“Transmitter testing has been done on high-speed serial links for a long time,” says Jim Choate, USB product manager at Agilent Technologies. “It was assumed that if interoperability works and the transmitter works, the receiver is probably okay.”

Receiver testing is more rigorous, requiring a signal sent with very precise parameters. “For receiver test, you need a bit-error-rate tester with jitter (JBERT),” says Fleischer-Reumann (Fig. 2). Such instruments have a loopback mode and can count the errors that are looped back. Also, they can generate and analyze many patterns.
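The statistics behind loopback error counting explain why receiver tests run so long. A common rule of thumb (a general Poisson-statistics result, not quoted from any one standard) is that proving a BER target at a given confidence with zero observed errors requires roughly -ln(1 - confidence)/BER bits:

```python
import math

def bits_for_confidence(target_ber, confidence=0.95):
    """Bits that must be looped back error-free to claim the link
    meets target_ber at the given confidence (Poisson assumption)."""
    return math.ceil(-math.log(1.0 - confidence) / target_ber)

def measured_ber(errors, bits):
    """Raw bit-error ratio from a loopback error count."""
    return errors / bits

# To claim BER <= 1e-12 with 95% confidence, an error-free run needs:
n = bits_for_confidence(1e-12)
print(f"{n:.3e} bits")       # roughly 3e12 bits
print(f"~{n / 6e9:.0f} s of error-free traffic at 6 Gbits/s")
```

At SATA 3.0’s 6-Gbit/s line rate, that works out to minutes of continuous error-free traffic per test point, which is why a JBERT automates the pattern generation and counting.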

While working at Intel before joining Agilent, Choate learned the importance of loopback testing. “We did lots of receiver validation with loopback mode,” says Choate. “We found that this would be needed for future new USB or PCI Express standards. The length of the channel plus the data rate influences how much testing you have to do. For a long channel, you can have a clean looking transmitter but the channel can vary so much that it can break the link.”

So while transmitter testing is well understood for serial data links, receiver testing remains the larger challenge. “One thing that changes drastically for USB 3.0, and probably for PCI Express 3.0, is where you measure signals. For USB 2.0 we’d measure right at the output of the transmitter connectors with no channel in the path. For USB 3.0, we now emulate the loss of the channel to the receiver. The receiver has a closed eye and has to have equalization,” says Choate.

Need For Flexible Test Gear
These days, test vendors find themselves spending quite a bit of time consulting with engineers on how to make serial-link measurements. “It’s a little deeper consultation than in the past,” says Tektronix’s Chris Loberg. “Now, in some of the system suppliers and chip design labs, we spend more time telling them how to stress test, how to add filters and taps, and how to interpret the views associated with the output.”

Loberg stresses that it’s important to assess the measurement task at hand when considering equipment selection and the approach to measurement. “Engineers trying to debug their third-generation communications systems need equipment that can provide deeper insight (oscilloscopes) on the measurement result beyond ones and zeroes. They should be coupled with a signal generator that can provide nuances like noise as stressors (AWGs),” says Loberg.

For verification, a product like Tektronix’s BERTScope is a useful tool because it can provide a more straightforward digital signal with some stressors and a bit-error-rate (BER) view along with scope-like insight. For pure compliance and characterization, a BERTScope, or a BERT plus a scope, suits today’s third-generation standards, where engineers need to see behind the pass/fail result rather than settle for a simple BER number.