At 480 Mbits/s, Signal Integrity Becomes An Issue In USB 2.0 Designs

Sept. 18, 2000
Packaging, board layout, and chassis grounding will all be severely impacted by these high-speed components.

USB 2.0 features a more complex and demanding specification than USB 1.0. Therefore, it has implications in packaging, board layout, and chassis grounding that go beyond the considerations for USB 1.0 designs. Additionally, USB 2.0 host, device, and hub implementations must go through compliance testing before the official USB 2.0 logo can be applied. In spite of these obstacles, developers of this advanced serial bus expect a quick migration to USB 2.0 as PC manufacturers respond to users demanding higher-performance PC-centric applications.

This migration will be aided by a new generation of high-speed silicon components that can be used in USB 2.0 designs. Still, engineers will have to hone their design skills if they wish to take advantage of the performance boost dangled before them. Because they operate at data rates up to 480 Mbits/s, these high-speed chips require careful attention to the factors that affect signal fidelity.

While IEEE-1394 is excellent for peer-to-peer applications, PC-centric USB 1.0 applications, such as printers and scanners, can benefit from the faster performance of USB 2.0. At 480 Mbits/s, this serial bus offers data throughput that's 40 times greater than its predecessor, USB 1.0. Network and access implementations for Ethernet, xDSL, and cable modems, as well as mass storage implementations like removable hard-disk drives and backup devices, will benefit from the higher bandwidth of USB 2.0. Plus, isochronous devices, especially desktop cameras, can benefit from the increased speed.

Basically, the difference between USB 1.0 and 2.0 devices is data rate. While the user sees little change other than higher bandwidth, manufacturers of PCs and peripherals will be able to design higher-speed slave peripherals with a user-friendly interface. Furthermore, some industry experts feel that USB 2.0 might eliminate the need for some other high-bandwidth interfaces, such as SCSI adapters. That would lead to a reduced need for connectors, resulting in lower total-system cost.

Those involved in drafting the new specification were very aware of the existing user base, and compatibility was carefully considered. As a result, USB 1.1 connectors and full-speed cables can support USB 2.0 with no change. Likewise, USB 2.0 hubs and hosts will be able to detect and support low-speed and full-speed peripherals (see "USB 2.0 At A Glance").

For the designer of USB 2.0 products, however, new design and test challenges exist, including termination networks, cable and driver impedances, and driver strengths, as well as driver rise/fall times.

Board Design Considerations

USB 1.0 was a very forgiving specification. At its low data rates, the placement of USB 1.0 components relative to the clock and connectors wasn't critical. In addition, electromagnetic interference (EMI) at 12-Mbit/s edge rates isn't particularly troublesome. But current board-layout implementations that were convenient for USB 1.0 components may require significant redesign for USB 2.0 components. New board-routing guidelines require closer attention to attenuation, jitter budgets, package/board/chassis design, and EMI/EMC shielding.

The first thing to deal with is the device itself. It's imperative that designers know the characteristics of the USB 2.0 device being used. This helps to ensure conformance to specifications in the board layout.

The attenuation/jitter budget from silicon transmitter to silicon receiver is very tight at USB 2.0 data rates. Package pin locations, pin/wirebond impedance, and the proximity of VDD/GND pins must be taken into consideration. In the less-forgiving world of USB 2.0, the selection of a device from among competitive offerings may boil down to the best pinout.

Consider, for example, that the better-performing devices won't put high-speed pins in the corners of the package, because wirebonds are the longest in the corners. Instead, high-speed pins should be placed in the middle of one of the sides of the package where the wirebond length will be the shortest. The board designer must keep trace lengths to a minimum. Therefore, when selecting a multiport design, the best choice might be a chip where all high-speed ports are on the same side of the package. If high-speed ports are laid out one per side, then the board designer may be unpleasantly surprised as traces are laid down.

In the design of a board containing USB 2.0 functionality, the number of layers, power and ground placement, and connector placement have become significantly more critical. In USB 1.0 designs, it was possible to place the device in the middle of the board with connectors up to 8 inches away at the edge of the board (Fig. 1). But, that isn't the case with USB 2.0.

The preferred board layout for USB 2.0 is illustrated in Figure 2. As you can see, the source and connector are very near one another. Several other features of this layout are also important.

Match Trace Impedances

At the higher data rates of USB 2.0, board layout must follow tighter rules. It's possible to implement USB 2.0 on a four-layer board, but trace-impedance matching is the key. Long runs increase the risk of signal degradation due to cross-coupling and noise from nearby devices. Likewise, the spacing between traces must be controlled.
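
As a rough first-pass check before the stackup goes to a field solver or the board house, the classic IPC-2141 microstrip approximation gives an estimate of single-ended trace impedance; the USB 2.0 high-speed pair is nominally routed at 90 ohms differential (roughly 45 ohms per trace to ground). The sketch below uses hypothetical stackup numbers and isn't a substitute for a proper impedance calculation:

```python
import math

def microstrip_z0(w_mils, h_mils, t_mils, er):
    """Single-ended surface-microstrip impedance estimate (classic IPC-2141
    approximation, roughly valid for 0.1 < w/h < 2.0 and 1 < er < 15)."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h_mils / (0.8 * w_mils + t_mils))

# Hypothetical FR-4 stackup: 7-mil trace, 5-mil dielectric, 1.4-mil (1-oz) copper
z0 = microstrip_z0(w_mils=7.0, h_mils=5.0, t_mils=1.4, er=4.3)
print(f"Estimated single-ended Z0: {z0:.1f} ohms")
```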

Running signals through vias creates an impedance change that can be significant. Even 90° corners will create problems. As data rates go up, these sharp corners can become excellent antennas, radiating energy. They also aggravate mismatch between the traces of a differential pair because they force additional length onto one trace of the pair.

Watch Coupling Of Power Planes

Crossing power-plane divisions results in common-mode shifts. For example, when signals run over a ground plane and then over a 5-V supply plane, the coupling suddenly changes from ground to 5 V, which induces unwelcome level shifts. At USB 1.0 rates, a design could probably survive this, but that's far less likely at USB 2.0 rates.

Higher data rates also mean that jitter and attenuation become much greater obstacles. Figure 3 illustrates what can happen to a high-speed signal as it travels from the transmitter (Fig. 3a) to the receiver (Fig. 3b). As a result of reflections and clock-source inaccuracies, jitter closes the data eye from left to right. Reflections may be caused by impedance mismatches along the signal path. Attenuation, which closes the data eye from top to bottom, can be caused when the trace width drops below the recommended level (in this case, 0.007 in.) or when a cable or connector has high series resistance.
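
To get a feel for how these two mechanisms consume the signaling budget, the arithmetic below sets jitter against the 480-Mbit/s unit interval and attenuation against the 800-mV high-speed differential swing. The jitter and loss numbers are hypothetical placeholders, not values taken from the USB 2.0 specification:

```python
# Illustrative eye-budget arithmetic; the jitter and loss figures are assumptions.
bit_period_ps = 1e12 / 480e6            # unit interval at 480 Mbits/s (~2083 ps)
tx_jitter_ps = 250                      # jitter contributed at the transmitter (assumed)
channel_jitter_ps = 400                 # jitter added by reflections and clock error (assumed)
eye_width_ps = bit_period_ps - (tx_jitter_ps + channel_jitter_ps)

swing_mv = 800                          # nominal high-speed differential swing
attenuation_db = 2.5                    # assumed loss from traces, connectors, and cable
eye_height_mv = swing_mv * 10 ** (-attenuation_db / 20)

print(f"Unit interval: {bit_period_ps:.0f} ps, remaining eye width: {eye_width_ps:.0f} ps")
print(f"Remaining eye height: {eye_height_mv:.0f} mV of {swing_mv} mV")
```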

The IEEE-1394 specification requires two differential pairs, which simplifies the task of recovering clock and data signals. These signals can be extracted by simply implementing a logic function on both pairs. The USB 1.0 and 2.0 specifications require only one differential pair, which makes it more difficult to recover the clock and data signals. With USB 1.0 at 12 Mbits/s, one could perform a brute-force oversampling of the data using a 48-MHz clock. But at the USB 2.0 rate of 480 Mbits/s, the same approach would require a clock of roughly 1.9 GHz. Because sampling at these speeds isn't practical, it's necessary to use a more sophisticated clock/data recovery method.
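
The scaling behind that clock requirement is simple to check. A quick sketch, assuming the same 4x oversampling ratio used at full speed:

```python
# Scaling the USB 1.x brute-force oversampling approach to high speed.
full_speed_bps = 12e6                 # USB 1.x full-speed data rate
full_speed_clock_hz = 48e6            # sampling clock used at full speed
oversampling = full_speed_clock_hz / full_speed_bps   # 4x

high_speed_bps = 480e6
required_clock_hz = high_speed_bps * oversampling     # ~1.92 GHz
print(f"Same oversampling at 480 Mbits/s needs a {required_clock_hz / 1e9:.2f}-GHz clock")
```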

With the single differential pair that's used for USB 2.0, the ability to sample at the middle of the data eye is critical. One method for performing clock and data recovery is to use a delay-locked loop (DLL) to generate a number of sampling clocks. With this approach, the recovery logic incrementally steps through the taps until it finds one that samples at the middle of the data eye.

A 480-MHz clock cycle is about 2.08 ns (2083 ps) long. By sampling data every 80 ps for a few clock periods, it's possible to compare the data recovered from each tap and "vote" on the results. Over several clock cycles, it's possible to determine the best sample point, that is, the center of the data eye.
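
A minimal sketch of that sample-and-vote idea is shown below. Each bit is sampled at every tap across the unit interval, each tap's decision is scored against the majority, and the tap that agrees most often is taken as the eye center. The tap count, bit count, and jitter model are all assumptions made for illustration:

```python
import random

TAPS = 26    # ~80-ps tap spacing across a ~2.08-ns unit interval (assumed)
BITS = 200   # number of bit periods observed before picking a tap (assumed)

def sample(bit_value, tap, taps=TAPS):
    """Return the level sampled at a given tap; taps near the bit edges
    flip more often because of jitter (simple illustrative noise model)."""
    distance_from_edge = min(tap, taps - 1 - tap)
    error_prob = 0.45 * (1.0 - distance_from_edge / (taps / 2))
    return 1 - bit_value if random.random() < error_prob else bit_value

random.seed(0)
agreement = [0] * TAPS
for _ in range(BITS):
    bit = random.randint(0, 1)
    samples = [sample(bit, tap) for tap in range(TAPS)]
    majority = 1 if sum(samples) * 2 > TAPS else 0
    # "Vote": score each tap by how often it matches the majority decision.
    for tap, level in enumerate(samples):
        if level == majority:
            agreement[tap] += 1

best_tap = max(range(TAPS), key=lambda t: agreement[t])
print(f"Best sampling tap: {best_tap} of 0..{TAPS - 1} (expected near the eye center)")
```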

Terminations also are treated differently with USB 2.0. In the earlier USB 1.0 specification, terminating resistors were placed off-chip. In USB 2.0, the resistors are on-chip, which is important to remember when upgrading a USB 1.0 board design to USB 2.0. Some silicon vendors are attempting to make USB 2.0 devices that are pin-compatible with USB 1.0 devices. If you go this route, then you must remember that the off-chip terminating resistors on board have to be shorted out. Furthermore, when using this approach, the designer must check that the reflections from the shorted terminators don't adversely affect signal quality.

Another design criterion that developers of USB 2.0 must pay attention to is the balancing of input and output levels, which is critical for acceptable signal clarity. The differential voltage swing in USB 2.0 is 800 mV, significantly less than the 2-V swing in USB 1.0. Therefore, increased care must be taken to ensure that both traces of a differential pair are precisely the same length. This rules out right-angle turns and some other placement strategies that could be tolerated at USB 1.0 performance levels.
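
A quick way to budget that matching requirement is to convert an intra-pair length mismatch into skew using the board's propagation delay. The roughly 170-ps/in. FR-4 figure and the 50-ps skew budget below are illustrative assumptions, not USB 2.0 spec limits:

```python
# Rough intra-pair skew check: length mismatch times propagation delay.
prop_delay_ps_per_in = 170.0   # assumed propagation delay for outer-layer FR-4 traces
skew_budget_ps = 50.0          # assumed intra-pair skew budget

max_mismatch_in = skew_budget_ps / prop_delay_ps_per_in
print(f"Allowed intra-pair length mismatch for {skew_budget_ps:.0f} ps of skew: "
      f"{max_mismatch_in:.2f} in. ({max_mismatch_in * 1000:.0f} mils)")
```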

Much like in the video cassette recorder (VCR) market, where there's a move to provide video connectors on the front of the VCR, peripheral connectors on desktop PCs are migrating from the back to the front of the box. Most PC manufacturers currently provide easy access to both USB and 1394 connectors on the front of the case. Manufacturers are using a small printed-circuit card that connects via a cable back to a header on the main logic board (Fig. 4). This arrangement adds another level of design challenge to USB 2.0 designs. The addition of two connectors, a 4- to 8-in. cable, and a second small printed-circuit card introduces four potential impedance discontinuities in the 480-Mbit/s signal path.

Given the increased concern for signal integrity that comes with the higher-performance specification, time-domain reflectometry (TDR) is one of the few ways to accurately determine impedance matching. In a perfect transmission line with perfectly matched terminations, all of the energy launched by the step generator would be absorbed by the termination at the far end, and no reflections would be present. In the real world, however, various elements, like connectors, vias, cables, and termination components, alter the impedance of the transmission line and cause reflections. By characterizing the magnitude and timing of the reflections, one can determine the magnitudes of the excess capacitances and/or inductances, as well as their physical positions along the transmission line.

The trace taken with TDR equipment in Figure 5 shows the impedance along a path that runs across the main board, through a header, down a cable to another header, and then along traces on a small printed-circuit card. The horizontal axis of the trace can be viewed as distance, while the vertical scale is impedance. The ideal trace would be flat, with no changes in impedance along the signal path. This example shows four distinct changes in impedance along one 480-Mbit/s signal path.
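
Reading impedance off a TDR trace amounts to converting the measured reflection coefficient at each point back into ohms via Z = Z0(1 + rho)/(1 - rho). The sketch below walks through that conversion using made-up reflection values meant to mimic the four discontinuities in Figure 5:

```python
# Minimal TDR post-processing sketch: convert reflection-coefficient samples
# into impedance along the signal path. The sample values are invented to
# mimic Figure 5's four discontinuities; they are not measured data.
Z0 = 50.0  # reference impedance of the TDR step source

samples = [                      # (path segment, reflection coefficient rho)
    ("main-board trace",     0.00),
    ("header No. 1",         0.12),
    ("front-panel cable",   -0.05),
    ("header No. 2",         0.12),
    ("daughter-card trace",  0.04),
]

for segment, rho in samples:
    z = Z0 * (1 + rho) / (1 - rho)
    print(f"{segment:>20s}: rho = {rho:+.2f} -> Z = {z:.1f} ohms")
```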

USB 2.0 capability can be added to an existing system by using a PCI add-in card (Fig. 6). This solution sidesteps many of the challenges of a USB 2.0 implementation, and it's the traditional way to introduce new capabilities that might later appear on the motherboard.

Many, if not all, of the early IEEE-1394 and USB 1.0 implementations were PCI add-in cards. The PCI add-in card can physically be very small, which lends itself well to precise control of trace length and impedance. It can eliminate the routing issues imposed by multiple power planes. It also allows optimum placement of the transceiver IC relative to the USB connector. Finally, it's neutral with respect to front-panel connectors.

Power management isn't strictly a USB 2.0 issue. Yet the maturation of power-management functionality and growing customer demand have made it an important design issue that must be addressed. As CPUs become more power-hungry, extending battery life on new laptops is an increasingly difficult challenge. The guiding principle in power management is simply to "turn off as much as possible." In many of today's systems, the only component that's "alive" during deep sleep mode is the keyboard.

For instance, in its PC 2001 specification, Microsoft goes into great detail regarding power management, defining a number of device power states (D0, D1, D2, and D3) for various activity modes, from fully awake, to sleep modes, to full shutdown. Specifically, D0 is the active state at full power, while D1, D2, and D3 operate at reduced power but maintain the PC's ability to "wake up" without rebooting. D1 keeps Link Power Status (LPS) on and a port active, but the PCI clock is off or running very slowly; this state consumes approximately 50% of the active state's power. D2 and D3 (hot) maintain minimal power at less than 1 mA by turning LPS off and suspending port activity. The D3 (cold) state removes power from the chip and eliminates the possibility of wake-up.
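
Those state descriptions boil down to a small lookup table. The sketch below simply restates this article's summary of the D-states in code form; it paraphrases the paragraph above rather than the PC 2001 or ACPI documents themselves:

```python
# Device power states as summarized above (paraphrased, not authoritative spec text).
POWER_STATES = {
    "D0":        {"power": "full",       "lps_on": True,  "port_active": True,  "can_wake": True},
    "D1":        {"power": "~50% of D0", "lps_on": True,  "port_active": True,  "can_wake": True},
    "D2":        {"power": "< 1 mA",     "lps_on": False, "port_active": False, "can_wake": True},
    "D3 (hot)":  {"power": "< 1 mA",     "lps_on": False, "port_active": False, "can_wake": True},
    "D3 (cold)": {"power": "off",        "lps_on": False, "port_active": False, "can_wake": False},
}

for state, attrs in POWER_STATES.items():
    print(f"{state:>10s}: {attrs}")
```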

One of the new functionalities that USB 2.0 host controllers need to support in PC 2001 systems is Vaux. This auxiliary power supply keeps part of a system powered up even when the rest of the system is "asleep."

EMI becomes an issue with USB 2.0 implementations because of the faster edge rates of USB 2.0 devices. Ideally, these devices will now be placed close to the front or back panel of the chassis, keeping the high-speed traces to the connectors short.

While proper grounding is crucial, other implementation details must be observed too. One obvious concept is to generate the fewest possible emissions. This requires careful design. If all of the signals are perfectly matched in a differential signal environment, then the rising edge will always cancel the falling edge, and no EMI will be generated. Unfortunately, this is rarely, if ever, possible.

Both the USB 1.0 and 2.0 specifications call for a differential bus, so the task of controlling emissions is much less onerous than it would be with a parallel bus, which typically isn't a differential design. Nonetheless, certain events cannot be differential, such as bus reset, where both pins go high simultaneously, or disconnect, where both lines go slightly above "legal" logic 1 voltage. The high-speed signaling mode (chirp) also isn't completely differential.

None of these challenges is new to the industry, and numerous IEEE-1394 designs have conquered similar problems and passed EMI testing. So, designers who have had previous experience with IEEE-1394 designs should have few problems with the new USB 2.0 specification.

The designer's choice of components can have a significant effect on the need for emissions control. Chokes, particularly the new 4-wire common-mode choke, are a standard way of suppressing common-mode noise, and they have proven quite effective in IEEE-1394 designs. The problem is that they add to the overall cost of the product. Chip manufacturers like Lucent make it possible to eliminate this cost by providing transceivers that generate significantly less EMI and therefore don't require chokes.

If the transceiver used in the design has good differential characteristics, and the board layout is designed so that parasitics are matched carefully, EMI can be reduced significantly by simply observing the basics. This includes making sure that the I/O shield is connected securely to the chassis and the receptacle.

EMI testing is time-consuming, exacting, and expensive. Because testing needs to be performed at a special facility with isolated chambers to ensure accurate results, most companies prefer that EMI testing be conducted only once per product. The best strategy, therefore, is to do as much preliminary testing as possible during the design phase. The design tips presented above provide a guide.

EMI testing requires scanning the product through frequencies of up to 1 GHz to see if any signal amplitudes outside the specification exist. According to the USB 2.0 specification, extrapolating EMI by scanning only a few frequencies isn't allowed. It's possible to perform "quick and dirty" testing at the design site using several antennas covering different frequency ranges and taking into account both horizontal and vertical orientations.

Because of the increasing electronic pollution in our environment, however, only a specially isolated chamber can assure an accurate test. For example, Lucent recently conducted a preliminary test with an EMI sniffer and discovered a spike at 900 MHz, despite the fact that nothing in the system ran at 900 MHz. The culprit turned out to be a newly installed cellular telephone tower.

USB 2.0 provides a high-performance upgrade path for PC users. It supports the higher-speed peripherals that are increasingly in demand and offers backwards compatibility with USB 1.0 peripherals, hubs, and hosts.

The USB 2.0 specification calls for more rigorous design and test criteria than its predecessor, but designers who have had experience with the IEEE-1394 specification can take heart. With the 1394 specification leading the way, much of the groundwork for meeting the USB 2.0 specifications has already taken place. The added performance and flexibility of a USB 2.0-enabled PC make mastering the new design challenges well worth the effort for the designer and for the end user alike.
