
The recent history of today’s high-end oscilloscope technology

June 6, 2019

Editor’s note: Evaluation Engineering’s April print issue features a special report on the trends, challenges, and new products in the high-end oscilloscope market, including commentary from Mike Martin, senior marketing product manager at Tektronix. The print report could include only a small portion of Martin’s commentary, which was in-depth enough to deserve its own article. So that’s what we’ve done here, turning Martin’s additional insights into an article covering the history of modern high-end oscilloscope technology and where it’s headed.

By Mike Martin, senior marketing product manager at Tektronix

First, let’s define a “high-end oscilloscope.” There really is no official definition. Scope vendors arbitrarily draw a line that allows them to focus on the applications requiring the highest bandwidth acquisition. The line has moved considerably over the past decade. Ten years ago, we used a definition of “high-end oscilloscope” that covered a range of 4 GHz to 20 GHz bandwidth. Today, that definition has changed to a range of 23 GHz to 70 GHz.

When it comes to talking about the biggest trends in high-end oscilloscopes, a good starting point is to look at the biggest trends in new technology, products, or system development. For many years, data communication has driven demand for high-bandwidth acquisition and analysis. This continues to be the case today, but the increase in performance is stunning. Ten years ago, we were working on Gen1 and Gen2 standards such as PCIe and SATA. Data rates were at or below 5 Gbps, and it was largely held that scope bandwidth needed to provide coverage through the 5th harmonic of the signal’s fundamental. A 12.5 GHz scope bandwidth could provide a good-quality acquisition and representation of a 5 Gbps (2.5 GHz fundamental frequency) NRZ signal.
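The old 5th-harmonic rule reduces to simple arithmetic. As a quick sketch (the helper names here are mine, not any industry API):

```python
def nrz_fundamental_ghz(data_rate_gbps):
    # An NRZ signal's fundamental is half the data rate: the fastest
    # pattern (1010...) completes one full cycle every two bits.
    return data_rate_gbps / 2.0

def fifth_harmonic_bw_ghz(data_rate_gbps):
    # Old rule of thumb: the scope should capture through the
    # 5th harmonic of that fundamental.
    return 5 * nrz_fundamental_ghz(data_rate_gbps)

print(nrz_fundamental_ghz(5.0))    # 2.5  -> GHz fundamental for 5 Gbps NRZ
print(fifth_harmonic_bw_ghz(5.0))  # 12.5 -> GHz scope bandwidth
```

This reproduces the 12.5 GHz figure in the text for a 5 Gbps NRZ signal.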

Since then, data rates have continued to climb, along with the need for more bandwidth. As we moved into Gen3 and Gen4 standards, data rates went up into the 8 to 16 Gbps range. This drove scope bandwidth requirements up to 25 GHz and 33 GHz.

The industry could no longer ignore external losses caused by cables and fixtures as the margins for passing a test became tighter. New tools were needed to de-embed the effects of losses external to the scope and the scope vendors responded with features that would allow the scope to compensate for these effects. It was also at this point where the old 5th harmonic goal for scope bandwidth became obsolete, as the devices being tested often did not have the bandwidth to produce spectral content at the 5th harmonic.

About the same time, most designers of high-speed serial links came to depend on receiver equalization to recover closed eyes. This resulted in the need to simulate this equalization on the scope, as the scope acts as a proxy receiver for a lot of compliance testing for various standards. Scope vendors have now implemented robust simulation of receivers, including flexible Feed Forward Equalization (FFE), Decision Feedback Equalization (DFE), and Continuous-Time Linear Equalizer (CTLE) models.
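To make two of the equalizer families named above concrete, here is a deliberately toy illustration of an FFE (an FIR filter over incoming samples) and a DFE (subtracting intersymbol interference estimated from past symbol decisions). Real scope implementations are calibrated, standards-defined models; the tap values here are invented:

```python
def ffe(samples, taps):
    # Feed-forward equalizer: an FIR filter applied to the raw samples.
    out, buf = [], [0.0] * len(taps)
    for s in samples:
        buf = [s] + buf[:-1]           # shift register, newest sample first
        out.append(sum(t * b for t, b in zip(taps, buf)))
    return out

def dfe(samples, fb_taps, threshold=0.0):
    # Decision-feedback equalizer: subtract ISI estimated from the
    # most recent symbol decisions before slicing the next symbol.
    decisions = []
    for s in samples:
        recent = decisions[-len(fb_taps):][::-1]   # newest decision first
        isi = sum(t * d for t, d in zip(fb_taps, recent))
        decisions.append(1.0 if s - isi > threshold else -1.0)
    return decisions

print(ffe([1.0, 0.0, 0.0], [1.0, -0.25]))  # [1.0, -0.25, 0.0]
print(dfe([1.0, 0.2, -0.2], [0.5]))        # [1.0, -1.0, 1.0]
```

In the DFE example, the 0.5 feedback tap removes the tail of each previous symbol, so the small 0.2 and -0.2 samples still slice to the correct symbols. A CTLE would instead be modeled as a continuous-time (analog) filter response, which doesn't reduce to a few lines as cleanly.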

Now, we come to the latest generation data communication solutions. Electrical signaling is now up to 32 Gbps on a single lane. Optical is being used widely—not just for long distance communication, but also for very short distances. On top of this, new wireless standards such as 802.11ay are being created that provide performance in the 20 Gbps to 100 Gbps range. These are part of the 5G landscape but are also being used to replace existing WiFi infrastructure.

The ever-growing demand for moving more data in shorter periods of time is pushing developers to use higher data rates and more sophisticated modulation schemes such as PAM, OFDM, QAM (e.g., 16-QAM, 64-QAM, 256-QAM), and combinations of these schemes to pack the data into the shortest possible transmission time. This drives demand onto the oscilloscope for demodulation capability for electrical, optical, and RF signals.
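The payoff of those higher-order schemes is bits per symbol: a larger constellation carries the same payload at a lower symbol rate. A back-of-the-envelope sketch (helper names are mine):

```python
import math

def bits_per_symbol(constellation_size):
    # log2 of the constellation size: PAM4 -> 2, 16-QAM -> 4, 256-QAM -> 8.
    return math.log2(constellation_size)

def symbol_rate_gbaud(data_rate_gbps, constellation_size):
    # More bits per symbol means a lower symbol rate (and a smaller
    # spectral footprint) for the same payload data rate.
    return data_rate_gbps / bits_per_symbol(constellation_size)

print(symbol_rate_gbaud(32, 4))    # 16.0 -> PAM4 at 32 Gbps needs 16 GBaud
print(symbol_rate_gbaud(32, 256))  # 4.0  -> 256-QAM would need only 4 GBaud
```

The trade-off, of course, is that tighter constellations demand better SNR from the whole link, including the measuring instrument.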

With these very high bandwidth communication channels, designers have moved to dynamic tuning of the transmitter and receiver equalization. This happens through a defined link training process that might last on the order of 500 ms during system power-up. In this process, hundreds of thousands, or perhaps millions, of small packets are transmitted back and forth between the transmitter and receiver, as the two attempt to arrive at an optimal channel equalization result.

Debugging problems in this process can be very challenging, as the process happens once at startup and then is over. In addition, it is difficult to inspect the million or so link training packets to identify a problem. This has driven scope vendors to provide automated tools that can decode the link training protocol, remove the redundant data from the analysis cycle, and flag things that might indicate problems.

Today, the bleeding edge of data communication is the Tbps research work that is being done primarily around coherent optical methods, which routinely uses scope bandwidths in the range of 70 GHz. The developers need to demonstrate methods that produce reliable communication at these data rates and verify that BER and EVM demonstrate the targeted quality of communication. These users need coherent optical receivers with very high bandwidth and low noise, as well as a very high-bandwidth scope. The scope becomes a proxy receiver for what will one day be the final channel receiver.

There are many other applications that are not specifically related to data communication but require high-bandwidth scopes. Some examples include automotive radar (60 GHz to 77 GHz), testing of other types of radar systems, SatCom testing, and more. Historically, these have often been tested using spectrum analyzers. However, some of these applications now require an instantaneous acquisition of a very wide band signal, and spectral analysis software running on the scope can perform the needed acquisition and analysis.

What are users looking for in high-end oscilloscopes?

In general, most users today want excellent signal fidelity. As test margins become smaller and smaller, the user wants to know the scope is not consuming a large amount of that margin, which could result in a false failure of the test on their system. They need enough bandwidth to faithfully represent the signal being tested, but not more bandwidth than the signal contains, as unused bandwidth just means more noise in the measurement. As data rates increase, especially on multi-lane communication links, the possibility of crosstalk between lanes becomes higher. As such, it is important to provide tools that help identify the presence of crosstalk, such as bounded uncorrelated jitter measurements.

The workhorse of high-end scopes has historically been the 8-bit digitizer. Higher resolution is generally a good thing, but it is not achieved just by changing the digitizer itself. The entire front end of the scope must also have the noise performance, linearity, and dynamic range to support the higher-resolution A/D. Effective Number of Bits (ENOB) is the IEEE-defined method for demonstrating the overall signal fidelity of an acquisition system (see IEEE Std 1057). Without a front end that can support higher resolution, the value of a higher-resolution A/D is lost. Currently, Tektronix is producing 12-bit-resolution scopes with bandwidth up to 8 GHz.
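ENOB is commonly computed from a measured SINAD using the ideal-quantizer relation (each effective bit is worth about 6.02 dB, plus a fixed 1.76 dB quantization offset). A quick worked example:

```python
def enob_from_sinad(sinad_db):
    # Commonly used relation, consistent with the ideal-quantizer model:
    # ENOB = (SINAD_dB - 1.76) / 6.02
    return (sinad_db - 1.76) / 6.02

# An ideal 8-bit digitizer has SINAD = 6.02 * 8 + 1.76 = 49.92 dB,
# so its ENOB comes back as (approximately) 8.0:
print(enob_from_sinad(49.92))

# A real 8-bit acquisition system with, say, 40 dB of SINAD delivers
# only about 6.35 effective bits -- the front end, not the A/D, sets
# the floor:
print(round(enob_from_sinad(40.0), 2))
```

This is why pairing a 12-bit digitizer with an 8-bit-class front end buys nothing: the SINAD, and hence the ENOB, doesn't move.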

Connectivity is always an important element of working with oscilloscopes. Conventional probes are available up to 33 GHz today. It is important that the probe provide a faithful conveyance of the signal without significant degradation. For example, Tektronix probes used at this performance level are characterized during production and S-parameter data is stored in the probe. When the probe is connected to the scope, the scope applies this S-parameter data as an automatic correction to the signal, thereby providing a calibrated measurement to the probe tip and the most accurate representation of the signal being acquired.
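Conceptually, that S-parameter correction is a frequency-domain division by the probe's measured response. A heavily simplified NumPy sketch (real instruments apply full multi-port S-parameter models and regularize the inversion; this just shows the shape of the idea):

```python
import numpy as np

def de_embed(signal, probe_h):
    # Transform the acquired waveform to the frequency domain, divide
    # out the probe's complex frequency response H(f), and transform
    # back to recover an estimate of the signal at the probe tip.
    spectrum = np.fft.rfft(signal)
    corrected = spectrum / probe_h
    return np.fft.irfft(corrected, n=len(signal))

# Sanity check: with a perfectly flat probe (H = 1 at every frequency
# bin), the waveform comes back unchanged.
x = np.array([0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0])
print(np.allclose(de_embed(x, np.ones(5)), x))  # True
```

For an 8-sample record, `rfft` yields 5 frequency bins, so `probe_h` must supply one complex value per bin; in practice those values come from the probe's stored S-parameter data.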

Mike Martin is the senior marketing product manager at Tektronix. He can be contacted at [email protected]
