The Fundamentals Of Spectrum Analysis

March 22, 2012
We know that spectrum and vector signal analyzers give us deep insight into a signal's properties, but do we really know what's going on under the hood? What are we to make of all those specifications? Agilent's Erik Diez goes through the basics to help you get more out of your spectrum analyzer.

The analysis of electrical signals, otherwise known as signal analysis, is a fundamental challenge for virtually all electronic design engineers and scientists. While it provides valuable insight into a signal’s properties, signal analysis is only as good as the instrument with which it is performed. Spectrum analyzers and vector signal analyzers are two instruments commonly employed to analyze electrical signals. This tutorial covers the basics of how to make the best use of these instruments.

Table Of Contents

  1. Background
  2. Spectrum Analysis Fundamentals
  3. Inside The Spectrum Analyzer
  4. Understanding Analyzer Specifications
  5. Modern Signal Analyzers
  6. Summary
  7. References

Background

Among the most popular and versatile of all measurement tools, the spectrum analyzer measures the magnitude of an input signal versus frequency. Its primary use is to measure the power of the spectrum of known and unknown signals. By comparison, the vector signal analyzer measures the magnitude and phase of an input signal within the analyzer’s intermediate-frequency (IF) bandwidth. It is typically used to make in-channel measurements, such as error vector magnitude, code domain power, and spectral flatness, on known signals.

Historically, spectrum analyzers and vector signal analyzers have been separate instruments, but technology advances have made it possible to combine both capabilities into a single instrument. For simplicity, both spectrum analysis and vector signal analysis are commonly referred to simply as spectrum analysis.

The powerful measurement and analysis capabilities of these analyzers enable engineers to realize a rapid and complete understanding of their device or system under study. Of course, understanding how they work and how to make them more effective for any particular application is absolutely critical to using them to their fullest potential.

Spectrum Analysis Fundamentals

While knowledge of analysis instrumentation is essential, equally important is an understanding of the fundamentals of spectrum analysis itself. Traditionally, oscilloscopes have been used to see how electrical signals vary with time. However, they don’t provide the full picture.

To fully understand the performance of a device/system, a signal (or signals) must also be analyzed in the frequency domain. This is exactly what the spectrum analyzer does. It should be noted, however, that with great advances in digital technology, the distinction has become a little fuzzier. Some oscilloscopes can perform vector signal analysis, and signal analyzers now have significant amounts of time-domain measurement capability. Nevertheless, oscilloscopes are optimized for time-domain measurements, and signal analyzers are the tool of choice for frequency-domain measurements.

In the frequency domain, complex signals (i.e., signals comprising more than one frequency) are separated into their frequency components, and the level at each frequency is displayed. Frequency-domain measurements have several distinct advantages. For one, information not discernible on an oscilloscope becomes readily apparent on a spectrum analyzer.

Measuring signals with a spectrum analyzer also greatly reduces the amount of noise present in the measurement due to the analyzer’s ability to narrow the measurement bandwidth. Moreover, many of today’s systems are inherently frequency-domain oriented and must be analyzed in the frequency domain to ensure there’s no interference from neighboring frequencies. 

With a frequency-domain view of the spectrum, it is easy to measure a signal’s frequency, power, harmonic content, modulation, spurs, and noise. With these quantities measured, total harmonic distortion, occupied bandwidth, signal stability, output power, intermodulation distortion, power bandwidth, carrier-to-noise ratio, and a host of other measurements then can be determined using just a spectrum analyzer.

Frequency-domain measurements (spectrum analysis) are made with either a fast Fourier transform (FFT) analyzer or a swept-tuned receiver. The FFT analyzer digitizes the time-domain signal and then applies the mathematics of the FFT to convert it to the frequency domain. The result is displayed as a spectrum. With its real-time signal analysis capability, the FFT analyzer can capture periodic, random, and transient events and can measure both phase and magnitude.
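As a rough illustration of the FFT approach, the following minimal Python sketch digitizes a two-tone test signal, windows it, and converts it to a magnitude spectrum. The sample rate, tone frequencies, and normalization are assumptions chosen for demonstration, not values from the article:

```python
import numpy as np

# Hypothetical sampling parameters, for illustration only
fs = 1.0e6          # sample rate, 1 MSa/s
n = 4096            # number of samples in the capture

t = np.arange(n) / fs
# Two-tone test signal: 100 kHz at the 0-dB reference, 150 kHz 40 dB down
x = np.sin(2 * np.pi * 100e3 * t) + 0.01 * np.sin(2 * np.pi * 150e3 * t)

# Window the capture to control spectral leakage, then transform
window = np.hanning(n)
spectrum = np.fft.rfft(x * window)
freqs = np.fft.rfftfreq(n, d=1 / fs)

# Normalize so a full-scale sine reads near 0 dB, then convert to dB
magnitude = np.abs(spectrum) / (np.sum(window) / 2)
db = 20 * np.log10(magnitude + 1e-12)   # small offset avoids log(0)

peak_bin = np.argmax(db)
print(f"Strongest component: {freqs[peak_bin]/1e3:.1f} kHz at {db[peak_bin]:.1f} dB")
```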

The swept-tuned analyzer “sweeps” across the frequency range of interest, displaying all the frequency components present. This enables measurements to be made over a large dynamic range and wide frequency range. The swept-tuned analyzer is the most widely accepted, general-purpose tool for frequency-domain measurements.

Both the FFT and swept-tuned analyzers can be used for a wide range of measurements (such as frequency, power, modulation, distortion, and noise measurements) in applications as varied as spectrum monitoring, spurious emissions, scalar network analysis, and electromagnetic interference. Measurements are made over a –172-dBm to +30-dBm amplitude range and over frequencies ranging from 3 Hz to greater than 325 GHz.

Inside The Spectrum Analyzer

Understanding how a spectrum analyzer operates requires a look inside at its hardware (Fig. 1). As we’ll see later, modern signal analyzers have replaced quite a bit of the analog hardware, particularly in the IF and baseband sections, with digital circuitry. Nevertheless, it helps to understand the fundamental principles of operation by looking at the architecture this way.

1. Shown here is a block diagram of a traditional swept spectrum analyzer. Modern analyzers replace much of this analog hardware with digital circuitry (see Figure 6).

The analyzer’s mixer is a three-port device that converts the input signal from one frequency to another. The input signal is applied to one port, and the local oscillator’s (LO) output signal is applied to the other. The mixer is a nonlinear device, so frequencies will be present at the output that weren’t present at the inputs. These frequencies are the original input signals, plus the sum and difference frequencies of the two signals. The difference frequency is called the IF signal.
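A trivial numerical sketch of this sum-and-difference behavior follows; the RF and LO frequencies are arbitrary illustrative choices, not the frequency plan of any particular analyzer:

```python
# Ideal multiplying mixer: sin(a)*sin(b) = 0.5*[cos(a-b) - cos(a+b)],
# so the output contains the difference and sum of the two input frequencies.
f_rf = 100.0e6    # input (RF) frequency, assumed 100 MHz
f_lo = 121.4e6    # local-oscillator frequency, assumed 121.4 MHz

f_if = abs(f_lo - f_rf)   # difference product: this becomes the IF signal
f_sum = f_lo + f_rf       # sum product: rejected by the IF (bandpass) filter

print(f"IF (difference): {f_if/1e6:.1f} MHz")   # 21.4 MHz
print(f"Sum product:     {f_sum/1e6:.1f} MHz")  # 221.4 MHz
```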

The analyzer’s IF filter is a bandpass filter used as a “window” for detecting signals. Its bandwidth, the analyzer’s resolution bandwidth (RBW), can be changed by means of the instrument’s front panel. A broad range of variable RBW settings allows the analyzer to be optimized for different sweep and signal conditions and enables the user to trade off frequency selectivity, signal-to-noise ratio (SNR), and measurement speed. Narrowing RBW, for example, improves selectivity and SNR. However, sweep speed and trace update rate degrade. The optimum RBW setting depends heavily on the characteristics of the signals of interest.
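Two common rules of thumb quantify these tradeoffs: narrowing the RBW lowers the displayed noise floor by roughly 10·log10(RBW_new/RBW_old), while swept sweep time grows roughly as k·span/RBW². The sketch below uses assumed values, and the constant k (which varies with filter type) is only an assumption:

```python
import math

def noise_floor_change_db(rbw_old_hz, rbw_new_hz):
    """Approximate change in displayed noise floor when the RBW is changed."""
    return 10 * math.log10(rbw_new_hz / rbw_old_hz)

def sweep_time_s(span_hz, rbw_hz, k=2.5):
    """Rule-of-thumb sweep time for a swept analyzer: k * span / RBW^2."""
    return k * span_hz / rbw_hz ** 2

# Narrowing the RBW from 10 kHz to 1 kHz lowers the noise floor by about 10 dB...
print(noise_floor_change_db(10e3, 1e3))                   # -> -10.0
# ...but a 10-MHz span sweep slows from ~0.25 s to ~25 s (k assumed to be 2.5)
print(sweep_time_s(10e6, 10e3), sweep_time_s(10e6, 1e3))  # -> 0.25  25.0
```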

The detector allows the analyzer’s IF signal to be converted to a baseband or video signal so it can be digitized and viewed on the LCD. This is accomplished with an envelope detector whose video output is digitized with an analog-to-digital converter (ADC) and represented as the signal’s amplitude on the Y-axis of the analyzer display.

Several different detector modes dramatically affect how the signal is displayed. In positive detection mode, typically used when analyzing sinusoids, the peak value of the signal is displayed over the duration of one trace display point, sometimes called a “display bucket” or “bin.” In negative detection mode, the minimum value is displayed. In sample detection mode, the amplitude value at the midpoint of the time interval for each bin is displayed.
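A minimal sketch of these three detector modes, assuming the envelope-detected video samples that fall into one display bucket are already available as an array (function and variable names are illustrative):

```python
import numpy as np

def detect(bucket_samples, mode):
    """Collapse one display bucket of video samples into a single trace value.

    mode: 'positive' -> peak value, 'negative' -> minimum value,
          'sample'   -> value at the midpoint of the bucket's time interval.
    """
    if mode == "positive":
        return np.max(bucket_samples)
    if mode == "negative":
        return np.min(bucket_samples)
    if mode == "sample":
        return bucket_samples[len(bucket_samples) // 2]
    raise ValueError(f"unknown detector mode: {mode}")

# Example: noisy envelope samples falling into one bucket (arbitrary values)
bucket = np.array([0.12, 0.35, 0.28, 0.90, 0.22, 0.15])
print(detect(bucket, "positive"), detect(bucket, "negative"), detect(bucket, "sample"))
```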

When displaying both signals and noise, the best detection mode is the normal (Rosenfell) mode. This “smart” mode dynamically changes depending on the input signal. If the signal both rises and falls within a display bucket, it is assumed to be noise, and positive and negative detection are alternately employed. If the signal continues to rise, a signal is assumed and positive peak detection is used.

Several processes can be used to smooth variations in the envelope-detected amplitude—namely, average detection and video filtering. Average detection uses all data values collected within the time interval of a bin and is useful when measuring noise or noise-like signals found in digital modulation schemes. When average detection is used to average power (such as when measuring the power of complex signals), it is generally called a root-mean-square (RMS) detector.
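Average (RMS) detection, by contrast, uses every sample in the bucket. A sketch, under the assumption that the samples are envelope voltages across a 50-Ω load:

```python
import numpy as np

def rms_power_dbm(bucket_volts, r_ohms=50.0):
    """Average (RMS) detection: mean power of all samples in the bucket, in dBm."""
    mean_power_watts = np.mean(np.square(bucket_volts)) / r_ohms
    return 10 * np.log10(mean_power_watts / 1e-3)

bucket = np.array([0.12, 0.35, 0.28, 0.90, 0.22, 0.15])  # same illustrative samples
print(f"{rms_power_dbm(bucket):.1f} dBm")
```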

The video filter is a low-pass filter located after the envelope detector and before the ADC. Video filtering determines the bandwidth of the video amplifier and is used to average or smooth the trace seen on the screen. By changing the video bandwidth (VBW) setting, the peak-to-peak variations of noise in the spectrum analyzer can be decreased, making signals otherwise obscured by noise easier to find.

Video averaging and trace averaging also can be used to smooth variations in the envelope-detected amplitude. With video averaging, when the video filter’s cutoff frequency is reduced, the video system no longer follows the more rapid variations of the envelope of the signal passing through the IF chain. The amount of smoothing that takes place is determined by the ratio of VBW to RBW. Ratios of 0.01 or less provide good smoothing.
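The video filter’s smoothing can be modeled as a single-pole low-pass applied to the video trace. This is only an illustrative approximation of the behavior, not how any specific analyzer implements it:

```python
import numpy as np

def video_filter(video_samples, vbw_hz, sample_rate_hz):
    """One-pole low-pass approximation of the post-detector video filter."""
    alpha = 1.0 - np.exp(-2.0 * np.pi * vbw_hz / sample_rate_hz)
    out = np.empty_like(video_samples, dtype=float)
    y = video_samples[0]
    for i, x in enumerate(video_samples):
        y += alpha * (x - y)   # follow only the slower envelope variations
        out[i] = y
    return out

# Noisy video trace (arbitrary): narrowing VBW reduces peak-to-peak noise
rng = np.random.default_rng(0)
trace = 1.0 + 0.2 * rng.standard_normal(1000)
smoothed = video_filter(trace, vbw_hz=100.0, sample_rate_hz=10_000.0)
print(np.ptp(trace), np.ptp(smoothed))   # smoothed trace shows far less variation
```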

Trace averaging averages over two or more sweeps on a point-by-point basis. At each display point, the new value is averaged in with the previously averaged data. The display gradually converges to an average over a number of sweeps. Trace averaging does not appreciably affect the sweep time.
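A common form of this point-by-point averaging is the running-average recurrence A_avg = ((n − 1)/n)·A_prev + (1/n)·A_new, where n is the sweep number. A minimal sketch:

```python
def update_trace_average(prev_avg, new_trace, sweep_number):
    """Point-by-point running average: A_avg = ((n-1)/n)*A_prev + (1/n)*A_new."""
    n = sweep_number
    return [((n - 1) / n) * p + (1 / n) * x for p, x in zip(prev_avg, new_trace)]

# Three sweeps of a 4-point trace (arbitrary dB values)
sweeps = [[-60.0, -30.2, -59.8, -60.1],
          [-59.5, -29.8, -60.3, -59.7],
          [-60.4, -30.1, -59.9, -60.2]]
avg = sweeps[0]
for n, trace in enumerate(sweeps[1:], start=2):
    avg = update_trace_average(avg, trace, n)
print(avg)   # converges toward the mean of the sweeps as n grows
```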

Understanding Analyzer Specifications

It is important to understand the spectrum analyzer’s specifications as they communicate the level of performance expected from a particular instrument. Specifications are helpful in predicting how an analyzer will perform in a specific measurement situation, as well as the accuracy of its results. Key spectrum analyzer specifications include:

  • Frequency range: Frequency range specifies the range of frequencies over which the analyzer will operate. It is important to choose a spectrum analyzer that covers not only the fundamental frequencies of your application, but also harmonics or spurious signals on the high end and baseband or IF signals on the low end.
  • Frequency accuracy: Frequency accuracy is often labeled as “frequency readout accuracy” and is usually specified as the sum of several error sources, including frequency-reference uncertainty, span error, and RBW center-frequency error (see the sketch after this list). Virtually every modern spectrum analyzer uses synthesized local oscillators, which are generally accurate to within a few hundred hertz.
  • Amplitude accuracy: Uncertainty in display fidelity and frequency response has a direct effect on an analyzer’s amplitude accuracy. The further a signal is from the reference level, the more the display fidelity will play a factor. To reduce the uncertainty, we must bring the measured signal up to the reference level, making the displayed amplitude of the signal the same as the calibrator. However, this introduces a new error in the measurement, namely IF-gain uncertainty. There are two options for handling this. In one case, we leave the signal in its place on screen, and the display fidelity error is simply accepted. In the other, we move the signal to the reference level, which causes a reference-level switching error. The best option depends on the spectrum analyzer. Some spectrum analyzers have larger display fidelity errors, while others have larger IF-gain uncertainties. 
  • Resolution: A spectrum analyzer’s resolution determines its ability to resolve signals of equal amplitude. A wider RBW may make two signals appear as one. In general, two equal-amplitude signals can be resolved if their separation is greater than or equal to the 3-dB bandwidth of the selected RBW filter. Generally, though, engineers look at signals of unequal amplitudes. Because both signals trace out the filter shape, it is possible to bury a smaller signal under the filter skirt of the larger one. The greater the amplitude difference, the more a lower signal is buried under the skirt of its neighbor’s response.
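To illustrate how the frequency-accuracy error terms combine, the worst-case readout uncertainty is simply their sum. The error fractions in the sketch below are hypothetical placeholders, not values from any datasheet:

```python
def freq_readout_uncertainty_hz(freq_hz, ref_accuracy, span_hz, span_err_frac,
                                rbw_hz, rbw_err_frac):
    """Sum the individual error terms for a worst-case frequency readout uncertainty."""
    return (freq_hz * ref_accuracy          # frequency-reference uncertainty
            + span_hz * span_err_frac       # span error
            + rbw_hz * rbw_err_frac)        # RBW center-frequency error

# Hypothetical numbers: 1-GHz signal, +/-1e-7 reference accuracy, 1-MHz span
# with 0.5% span error, 10-kHz RBW with 5% center-frequency error
print(freq_readout_uncertainty_hz(1e9, 1e-7, 1e6, 0.005, 1e4, 0.05))  # ~5600 Hz
```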

In one example of a two-tone test, signals are separated by 10 kHz (Fig. 2). With a 10-kHz RBW, resolving the equal-amplitude tones is not a problem, but a distortion product 60 dB down could be buried under the filter skirt. Even a 3-kHz RBW with a selectivity of 15:1 has a 60-dB bandwidth of 45 kHz, which is still too wide. Because two signals unequal in amplitude by 60 dB must be separated by at least one-half the 60-dB bandwidth to resolve the smaller signal, the required RBW for this measurement is 1 kHz, whose 60-dB bandwidth of 15 kHz meets that condition.

2. The greater the amplitude difference, the more a smaller signal is buried under the skirt of its larger neighbor’s response.

RBW selectivity (shape factor) and phase noise are important characteristics for determining the resolvability of signals of unequal amplitudes. One way to improve the spectrum analyzer’s resolution is to narrow the RBW, but it then takes longer to sweep across the spectrum because the RBW filters require a finite time to fully respond. Spectrum analyzers have auto-coupled sweep time that automatically chooses the fastest allowable sweep time based on selected span, RBW, and VBW. The analyzer type also affects sweep speed.
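A quick resolvability check follows from the shape factor: the 60-dB bandwidth is the shape factor times the 3-dB RBW, and a tone 60 dB down is resolvable only if it is separated from the larger tone by at least half that bandwidth. A sketch using the example numbers above:

```python
def can_resolve_60db(rbw_3db_hz, shape_factor, separation_hz):
    """A tone 60 dB down is resolvable if it sits at least half the
    filter's 60-dB bandwidth away from the larger tone."""
    bw_60db = shape_factor * rbw_3db_hz
    return separation_hz >= bw_60db / 2

# Tones 10 kHz apart, smaller one 60 dB down, filters with 15:1 selectivity
print(can_resolve_60db(3e3, 15, 10e3))   # False: 60-dB BW = 45 kHz
print(can_resolve_60db(1e3, 15, 10e3))   # True:  60-dB BW = 15 kHz
```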

  • Sensitivity: A receiver’s sensitivity is an indication of how well it measures small signals. All receivers (including those within spectrum analyzers) add some internally generated noise. Spectrum analyzers usually characterize this by specifying the displayed average noise level (DANL) in dBm, with the smallest RBW setting. DANL is another term for the instrument’s noise floor given a particular bandwidth and represents its best-case sensitivity. It is impossible to measure input signals below this noise level. Generally, sensitivity ranges from –135 dBm to –165 dBm. To achieve optimum sensitivity, try using the narrowest RBW possible, sufficient averaging, a minimum RF-input attenuation, and/or a preamplifier, although increasing sensitivity may conflict with other measurement requirements, such as minimizing distortion or maximizing dynamic range.
  • Distortion: Although distortion measurements such as third-order intermodulation and harmonic distortion are common measurements for characterizing devices, the spectrum analyzer itself also produces distortion products that may disturb measurements. If this internal distortion is comparable to the external distortion of the device-under-test (DUT) being measured, measurement errors can arise. In the worst-case scenario, it can completely cover up a device’s distortion products. The manufacturer may specify the analyzer’s distortion performance directly, or it may be lumped into a dynamic-range specification.

The trick is to determine whether the distortion caused by the analyzer will affect a measurement. Suppose, for example, that a test specifies two-tone (third-order) distortion products more than 50 dB below the fundamental and second-order (harmonic) distortion more than 40 dB below the fundamental. These values set the minimum levels the analyzer specifications must meet. To reduce measurement error caused by internal distortion, the analyzer’s own distortion would have to be much lower than the test specifications.

We may depict the distortion behavior for any nonlinear device, whether generated internally or generated by a DUT (Fig. 3). To determine if distortion is internally generated or caused by the DUT, an attenuator test can be performed. First, change the RF input attenuation by 10 dB. If the distortion product on the screen does not change amplitude, it is from the DUT. If the signal on the screen changes, the distortion is in part from somewhere inside the signal analyzer, and not totally from the DUT. 

3. Second-order distortion increases as the square of the fundamental signal amplitude. Third-order distortion increases as the cube of the fundamental amplitude. The third-order intercept (TOI) is a measure of the analyzer’s ability to handle large signals without distortion. Analyzers with higher TOI figures generally have improved dynamic range and accuracy for distortion and noise measurements.
  • Dynamic range: Dynamic range is traditionally defined as the ratio, in dB, of the largest to the smallest signals simultaneously present at the input of the spectrum analyzer that allows measurement of the smaller to a given degree of uncertainty (Fig. 4). An instrument’s dynamic range determines the amplitude range over which measurements can be reliably made. Dynamic range is often misunderstood and misinterpreted, because the instrument’s display range, measurement range, noise floor, phase noise, and spurious response all play important roles in its determination (Fig. 5).

4. This diagram graphically represents dynamic range. Here, the signal-to-noise and signal-to-distortion curves are both on one dynamic range graph. Maximum dynamic range occurs where the curves intersect, that is, when the internally generated distortion level equals the displayed average noise level (DANL). This point is also the optimum mixer level.

5. Dynamic range is often misunderstood and misinterpreted because the instrument’s display range, measurement range, noise floor, phase noise, and spurious response all play important roles in its determination. It is critical to know which definition is most appropriate for a given application.

To optimize dynamic range, choose the analyzer settings that provide optimum sensitivity: the narrowest RBW, minimal input attenuation, and sufficient averaging. Also, test the analyzer for distortion by increasing input attenuation and looking for signal-amplitude changes. Then, set the attenuator to the lowest setting without amplitude change.
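A common rule of thumb, consistent with the graph in Figure 4, is that the maximum third-order-limited dynamic range equals two-thirds of the difference between TOI and DANL. A sketch with illustrative numbers:

```python
def max_third_order_dynamic_range_db(toi_dbm, danl_dbm):
    """Maximum dynamic range limited by third-order distortion:
    DR = (2/3) * (TOI - DANL), reached when the internally generated
    distortion equals the displayed average noise level."""
    return (2.0 / 3.0) * (toi_dbm - danl_dbm)

# Illustrative figures: +20-dBm TOI and a -155-dBm DANL in the chosen RBW
print(max_third_order_dynamic_range_db(20.0, -155.0))   # ~116.7 dB
```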

Modern Signal Analyzers

In contrast to traditional spectrum analyzers, modern variants have emerged to accommodate digitally modulated radio systems, which require new types of tests (such as channel-power tests and demodulation measurements). With their multiple functions, support for a broad range of standards, and characteristics such as excellent amplitude accuracy, span and frequency accuracy, correction factors, and limit lines with margins and pass/fail indicators, modern spectrum analyzers are well equipped to perform these tests. Some even provide real-time signal acquisition to capture all relevant information about a signal as it varies in time.

In contrast to the traditional spectrum analyzer, the modern variant features different components, re-arranged blocks, and an ADC that is pushed further up the processing chain (Fig. 6). The analyzer’s all-digital IF can process signals in different ways and enables significant advances in accuracy, dynamic range, and speed. Digital signal processing (DSP) is also incorporated, which allows the analyzer to measure increasingly complex signal formats, improves dynamic range and accuracy, and increases sweep speed. It is possible to process signals via a swept-analysis mode when dynamic range is important, or in FFT-analysis mode when we require a faster sweep speed at narrow bandwidths.

6. Modern spectrum analyzers such as the Agilent X-Series signal analyzer feature different components, rearranged blocks, and an ADC pushed further up the processing chain compared to traditional spectrum analyzers.

Key features of the modern spectrum analyzer include its ability to make built-in, one-button power measurements (such as occupied bandwidth, channel power, and adjacent channel power), as well as its support for application-specific software enabling one-button measurements of general-purpose applications, flexible digital modulation analysis, and power/digital modulation measurements for wireless communications applications.

Enhanced display capabilities such as spectrograms enable analysis of time-varying signal spectra, while trace zoom allows users to easily zoom in on trace data. I/Q baseband inputs help bridge the gap between baseband and RF environments (Fig. 7). Moreover, the wide analysis bandwidth offered by modern spectrum analyzers is ideal for signal analysis in aerospace and defense, emerging communications, and cellular communications applications, which can require bandwidths in the hundreds of megahertz with high data rates.

7. Agilent’s PXA and MXA analyzers offer optional I/Q baseband inputs with 500-Msample deep capture standard memory.

Summary

Signal analyzers are useful tools for characterizing and analyzing a variety of devices and systems. Using them effectively—for making accurate measurements and properly interpreting and analyzing results—requires a basic understanding of how they work and their characteristics. Find more information on Agilent spectrum and signal analyzers at www.agilent.com/find/sa.

References

  1. Vector Signal Analysis Basics (42 pages)
  2. Spectrum Analysis Basics (120 pages)
  3. Spectrum Analysis Application Notes
