Instruments Play New Roles On The Benchtop And The Desktop, Too
Like a fine bouillabaisse, instruments are being blended with software, and what's emerging from the cauldron is both evolutionary and revolutionary. The evolution is, as we would expect, driven by the continuing upward spiral of frequency and data rate. But the revolutionary capability stems from a novel development in which signals and their software emulation become virtually interchangeable. This leads to an unusually powerful feature for those engaged in electronic design automation (EDA).
What's so astonishing is how these real and virtual ingredients (signals and their software emulation) have been combined so seamlessly that product developers can complete their designs more rapidly than ever. At the relatively humble beginnings of the oscilloscope and the logic analyzer, there was barely a hint of the powerful capabilities today's instruments provide routinely.
Though the oscilloscope predates World War II, it didn't begin to move into an aggressive development phase until the 1950s. One of the pioneers in oscilloscopes is Walter LeCroy Jr., the founder of LeCroy Corp. (see "Oscilloscopes: From Simply Viewing To Complex Signal Analysis," p. 76).
Another person who has both witnessed and participated in the oscilloscope's evolution is Ed Caryl, an oscilloscope applications engineer at Tektronix. He has been active in scope design for over 30 years. "There has been a four-order-of-magnitude progression over the past half century, with bandwidths at 5 MHz in the '50s climbing to over 50 GHz at present," he says.
"Back in the '50s, a user was lucky to get an oscilloscope to edge trigger," Caryl continues. Now, scope users can choose from a cafeteria of trigger sources, from very narrow pulses to very complex descriptions of parallel- and serial-bit streams. But according to Caryl, even these steps forward were wild dreams just a few years ago.
As the years passed, the kinds of measurements that oscilloscope users could perform on signals grew much more complex. Just 10 years ago, oscilloscopes were mostly analog in nature, only displaying a waveform. There was very little computation in the general-purpose oscilloscope, other than basic measurements.
Now oscilloscopes are essentially PCs that perform many complex computations on their own. Today, with the dramatic connectivity improvements to PCs, both internal and external, one can feed waveforms into multiple Java- and Windows-based applications. This has vastly expanded the designer's ability to analyze waveforms.
As for digital scopes, 10 years ago record lengths were 1000 to 2000 samples. Now 20 to 30 Msamples are commonplace. Also, 10 years ago the typical bench scope provided a bandwidth in the 400- to 500-MHz region.
"Of course, there were scan converters up to 5 GHz, but there was no vertical amplifier. You simply directly connected the scan converter tube or a CRT, with no attenuators, no amplifiers, and no ability to modify the signal. You scanned the signal, digitized it, and thus captured it in some crude form," says Caryl. Now real-time scopes offer designers 6-GHz bandwidths with 100,000-point record lengths, plus the ability to respond to complex triggering and perform sophisticated analysis.
Will sampling and real-time scopes merge in the next 10 years? Caryl thinks that's not very likely. The gap is wide: 6 GHz for real-time scopes and 55 GHz for sampling versions. It will be very hard to reach the higher levels in real time due to the front-end circuitry in a real-time scope. The attenuators and amplifiers limit the bandwidth before the signal ever reaches the analog-to-digital converter. In a sampling scope, these circuits aren't an issue because the sampling bridge captures the signal. It's right out in front, immediately behind the 50-Ω connector.
The logic analyzer, on the other hand, began in the early 1970s. One of its pioneers was Chuck Small of Agilent Technologies, then Hewlett-Packard (see "How Microprocessor Systems Defined The Logic Analyzer," right). Another major innovator of the logic analyzer is Chuck House (see "Logic Analyzers, Then And Now," p. 80).
In the case of Tektronix, some early logic analyzers were spinoffs from scopes. "The first logic analyzers were actually scope plug-ins," remembers Dave Holaday, an electrical engineer with Tektronix's logic-analyzer hardware engineering group.
Originally, memory depths were very shallow. One early logic analyzer had an 18-channel card with a memory depth of no more than 1 kbit per channel. "Since the depth was so shallow, you could actually take an acquisition and scroll through the data without taking very long to go through the entire data stream," says Holaday.
At that time, processors were typically Zilog Z80s running at only 4 MHz. In the early days, designers dealt with 5-V CMOS and TTL logic, with some ECL support. Because speeds were far slower than they are today, you could pretty much ignore signal-integrity matters. You really didn't have transmission-line issues or ringing problems.
But higher speeds have driven devices to lower voltages, so the margins around the transition that determines whether a signal is high or low have shrunk. Now there are 400-mV logic families. Moreover, edge rates that were 5 to 10 ns have dropped to the 100- to 200-ps range, placing all of the analog issues upon us: noise, ringing, overshoot, crosstalk, and the like.
As Holaday recalls, "We moved to the 300-MHz versions in the late '80s." He notes that Tektronix announced an 8-GHz analyzer in April (Fig. 1). Looking ahead, Holaday sees the data package becoming larger and the memory getting deeper. He expects logic analyzers to look more and more like a software tool.
Poised to revolutionize RF circuit design is the recently introduced ability to link test equipment with a design automation tool. This link enables an interchange between the real and virtual worlds—that is, the ability to correlate simulated and measured performance data for a design and verify performance.
Tying together EDA software with measurement and analysis instrumentation generates new ways to accelerate the design process. The components are Agilent's Advanced Design System (ADS), an EDA software tool, and an instrument like Agilent's 89600 Vector Signal Analyzer (VSA).
ADS harnesses a variety of automation tools that can fulfill the design requirements of the coming generations of wireless systems. The VSA is a spectrum analyzer that employs vector properties to preserve the phase and magnitude of signals that it analyzes.
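To make the "vector" idea concrete, here is a minimal sketch, assuming generic complex (I/Q) samples rather than the 89600's actual processing, of how keeping the complex data preserves both the magnitude and the phase of a tone. A scalar power measurement would discard the phase. The sample rate and tone values are arbitrary example numbers.

```python
import numpy as np

# Minimal illustration of a vector (I/Q) measurement: the complex samples
# retain both magnitude and phase. All values are arbitrary example numbers,
# not parameters of the 89600.
fs = 1.024e6                              # sample rate, Hz
t = np.arange(1024) / fs
tone = 0.5 * np.exp(1j * (2 * np.pi * 100e3 * t + np.pi / 4))  # 100-kHz tone, 45-degree phase

spectrum = np.fft.fftshift(np.fft.fft(tone) / len(tone))
freqs = np.fft.fftshift(np.fft.fftfreq(len(tone), d=1 / fs))
peak = np.argmax(np.abs(spectrum))

print(f"frequency: {freqs[peak] / 1e3:.1f} kHz")                     # 100.0 kHz
print(f"magnitude: {np.abs(spectrum[peak]):.3f}")                    # 0.500
print(f"phase: {np.degrees(np.angle(spectrum[peak])):.1f} degrees")  # 45.0, recoverable only from I/Q data
```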
A typical flow starts in the design environment, where modeling occurs. It enables designers to develop transmitters, receivers, and virtually any set of RF components and subsystems. Once initial modeling has been completed, data from the design simulation can be streamed to a signal analyzer, like the 89600 (Fig. 2a).
The 89600 software takes that data and puts a number of built-in tools to work, such as DSP analysis, fast Fourier transforms, and error-vector-magnitude (EVM) measurements. After the 89600 analyzes the signal, the results are streamed back to the CAD tool to tweak, correct, and enhance the simulation (Fig. 2b).
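As a rough illustration of the kind of comparison involved, the following sketch computes RMS EVM on simulated QPSK symbols. The test data, noise level, and variable names are assumptions for the example, not anything produced by ADS or the 89600.

```python
import numpy as np

# Hedged sketch of an RMS error-vector-magnitude (EVM) calculation: compare
# "measured" symbols against ideal reference symbols. The QPSK data and the
# noise level are illustrative only.
rng = np.random.default_rng(0)

# Ideal QPSK reference symbols (unit average power).
bits = rng.integers(0, 2, size=(1000, 2))
reference = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# "Measured" symbols: reference corrupted by additive noise as a stand-in for a DUT.
measured = reference + 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))

error = measured - reference
evm_rms = np.sqrt(np.mean(np.abs(error) ** 2) / np.mean(np.abs(reference) ** 2))
print(f"EVM: {100 * evm_rms:.2f} %  ({20 * np.log10(evm_rms):.1f} dB)")
```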
The ideal signal is streamed into the I/Q generator, which can output the signal to the real world, upconverting it to whatever frequency is required. Then that signal is applied to actual hardware, the device under test (DUT) (Fig. 2b, again).
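The upconversion step itself follows the standard quadrature-modulator relationship s(t) = I(t)cos(2πf_c t) − Q(t)sin(2πf_c t). A brief sketch, with an arbitrary sample rate and carrier chosen only for illustration, shows a baseband tone landing at the expected RF frequency.

```python
import numpy as np

# Minimal sketch of quadrature upconversion, the operation an I/Q generator
# performs when it moves a baseband waveform up to a carrier frequency.
# Sample rate and carrier are arbitrary example values.
fs = 100e6          # sample rate, Hz
fc = 10e6           # carrier frequency, Hz
t = np.arange(4000) / fs

# Example complex (I/Q) baseband waveform, standing in for simulator output.
baseband = np.exp(1j * 2 * np.pi * 1e6 * t)      # 1-MHz complex tone
i, q = baseband.real, baseband.imag

# Classic quadrature modulator: s(t) = I(t)*cos(2*pi*fc*t) - Q(t)*sin(2*pi*fc*t)
rf = i * np.cos(2 * np.pi * fc * t) - q * np.sin(2 * np.pi * fc * t)

# The energy now sits at fc + 1 MHz = 11 MHz.
spectrum = np.abs(np.fft.rfft(rf))
peak_hz = np.fft.rfftfreq(len(rf), d=1 / fs)[np.argmax(spectrum)]
print(f"upconverted tone at {peak_hz / 1e6:.2f} MHz")
```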
Next, the 89600 hardware measures what's arriving from the DUT, and the results are fed to the 89600 software. Now the "measured response" going back to the design environment is from real hardware.
"DSP code" consists of equalization algorithms intended to reside in the receiver. These parameters are fed directly to the DUT, in effect programming the chips within the DUT. Designers can port this data to any destination, as denoted by the three arrows exiting from the 89600 software.