8 Hints for Making Better Spectrum Analyzer Measurements

The spectrum analyzer, like an oscilloscope, is a basic tool used for observing signals. Where the oscilloscope provides a window into the time domain, the spectrum analyzer provides a window into the frequency domain. These hints will give you some ways to make better spectrum analyzer measurements.

This Application Note was written by Agilent (Copyright). To read the full Application Note, use this link (Agilent web site):

Hint 1. Selecting the Best Resolution Bandwidth (RBW)

The resolution bandwidth (RBW) setting must be considered when separating spectral components, setting an appropriate noise floor, and demodulating a signal. When making demanding spectrum measurements, spectrum analyzers must be accurate, fast and have high dynamic range. In most cases, emphasis on one of these parameters adversely impacts the others. Oftentimes, these tradeoffs involve the RBW setting. One advantage of a narrow RBW is seen when measuring low-level signals: the displayed average noise level (DANL) of the spectrum analyzer is lowered, increasing the dynamic range and improving the sensitivity of the spectrum analyzer.

However, the narrowest RBW setting is not always ideal. For modulated signals, it is important to set the RBW wide enough to include the sidebands of the signal; neglecting to do so will make the measurement very inaccurate. A serious drawback of narrow RBW settings is slower sweep speed: a wider RBW setting allows a faster sweep across a given span than a narrower RBW setting.

It is important to know the fundamental tradeoffs involved in RBW selection, so that the user can decide which measurement parameter is most important to optimize. In cases where the tradeoffs cannot be avoided, the modern spectrum analyzer provides ways to soften or even remove them. By utilizing digital signal processing, the spectrum analyzer provides more accurate measurements while at the same time allowing faster measurements, even when using a narrow RBW.
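The noise-floor side of the RBW tradeoff reduces to simple arithmetic. The following sketch is illustrative only and not tied to any particular analyzer:

```python
import math

def danl_change_db(rbw_old_hz, rbw_new_hz):
    """Change in displayed average noise level between two RBW settings.
    A negative result means the noise floor drops (better sensitivity)."""
    return 10 * math.log10(rbw_new_hz / rbw_old_hz)

# Narrowing RBW from 10 kHz to 100 Hz lowers the noise floor by 20 dB,
# but the swept-mode sweep time grows roughly as 1/RBW^2, so the same
# change costs about a (10 kHz / 100 Hz)^2 = 10000x longer sweep.
print(danl_change_db(10e3, 100))   # -20.0
```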

Hint 2. Improving Measurement Accuracy

Before making any measurement, it is important to know that there are several techniques that can be used to improve both amplitude and frequency measurement accuracy. Available self-calibration routines will generate error coefficients (for example, amplitude changes versus resolution bandwidth) that the analyzer later uses to correct measured data, resulting in better amplitude measurements and giving you more freedom to change controls during the course of a measurement. Once the device under test (DUT) is connected to the calibrated analyzer, the signal delivery network may degrade or alter the signal of interest, and this effect must be canceled out of the measurement.

One method of accomplishing this is to use the analyzer’s built-in amplitude correction function in conjunction with a signal source and a power meter. To cancel out unwanted effects, measure the attenuation or gain of the signal delivery network at the troublesome frequency points in the measurement range. Amplitude correction takes a list of frequency and amplitude pairs, linearly connects the points to make a correction “waveform,” and then offsets the input signal according to these corrections.
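The correction described above, a list of frequency/amplitude pairs connected by straight lines, can be sketched in a few lines of Python. The cable-loss values below are hypothetical:

```python
def build_correction(points):
    """points: list of (freq_hz, correction_db) pairs, sorted by frequency.
    Returns a function that linearly interpolates the correction at any
    frequency, clamping to the end points outside the measured range."""
    def correct(freq_hz):
        if freq_hz <= points[0][0]:
            return points[0][1]
        if freq_hz >= points[-1][0]:
            return points[-1][1]
        for (f0, a0), (f1, a1) in zip(points, points[1:]):
            if f0 <= freq_hz <= f1:
                frac = (freq_hz - f0) / (f1 - f0)
                return a0 + frac * (a1 - a0)
    return correct

# Cable loss measured at three frequencies (hypothetical values)
corr = build_correction([(1e9, 1.2), (2e9, 1.8), (3e9, 2.9)])
reading_dbm = -40.0
# Offset the raw reading by the interpolated loss at the signal frequency
print(reading_dbm + corr(1.5e9))   # -38.5
```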

In the modern spectrum analyzer, you can also directly store different corrections for your antenna, cable and other equipment, so calibration will not be necessary every time a setting is changed.

One way to make more accurate frequency measurements is to use the frequency counter of a spectrum analyzer, which eliminates many of the sources of frequency uncertainty, such as span. The frequency counter counts the zero crossings in the IF signal and offsets that count by the known frequency offsets from local oscillators in the rest of the conversion chain.

Total measurement uncertainty involves adding up the different sources of uncertainty in the spectrum analyzer. If any controls can be left unchanged, such as the RF attenuator setting, resolution bandwidth, or reference level, all uncertainties associated with changing these controls drop out, and the total measurement uncertainty is minimized. This exemplifies why it is important to know your analyzer: for example, there is no added error when changing RBW in high-performance spectrum analyzers that digitize the IF, whereas in others there is.
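The uncertainty bookkeeping can be sketched numerically. The individual error terms below are hypothetical spec-sheet values, and whether to sum worst-case or root-sum-square the terms is a policy choice the note does not prescribe:

```python
import math

def total_uncertainty_db(terms, method="worst_case"):
    """Combine individual amplitude uncertainty terms (in dB).
    'worst_case' sums the magnitudes; 'rss' root-sum-squares them, a
    common practice when the error sources are independent."""
    if method == "worst_case":
        return sum(abs(t) for t in terms)
    return math.sqrt(sum(t * t for t in terms))

# Hypothetical terms: ref-level switching, RBW switching, display fidelity
terms = [0.3, 0.1, 0.5]
print(total_uncertainty_db(terms))            # 0.9 dB worst case
print(total_uncertainty_db(terms, "rss"))     # ~0.59 dB RSS
# Leaving RBW unchanged drops its term from the budget entirely:
print(total_uncertainty_db([0.3, 0.5]))       # 0.8 dB
```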

Hint 3. Optimize Sensitivity When Measuring Low-level Signals

A spectrum analyzer’s ability to measure low-level signals is limited by the noise generated inside the spectrum analyzer. This sensitivity to low-level signals is affected by the analyzer settings. To measure a low-level signal, the spectrum analyzer’s sensitivity must be improved by minimizing the input attenuation, narrowing the resolution bandwidth (RBW) filter, and using a preamplifier. These techniques effectively lower the displayed average noise level (DANL), revealing the low-level signal.

Increasing the input attenuator setting reduces the level of the signal at the input mixer. Because the spectrum analyzer’s noise is generated after the input attenuator, the attenuator setting affects the signal-to-noise ratio (SNR). If gain is coupled to the input attenuator to compensate for any attenuation changes, real signals remain stationary on the display. However, the displayed noise level changes with IF gain, reflecting the change in SNR that results from any change in input attenuator setting. Therefore, to lower the DANL, input attenuation must be minimized. An amplifier at the mixer’s output then amplifies the attenuated signal to keep the signal peak at the same point on the analyzer’s display. In addition to amplifying the input signal, this amplifier also amplifies the noise present in the analyzer, raising the DANL of the spectrum analyzer.

The amplified signal then passes through the RBW filter. By narrowing the width of the RBW filter, less noise energy is allowed to reach the envelope detector of the analyzer, lowering the DANL of the analyzer.

To achieve maximum sensitivity, a preamplifier with low noise and high gain must be used. If the gain of the amplifier is high enough (the noise displayed on the analyzer increases by at least 10 dB when the preamplifier is connected), the noise floor of the preamplifier and analyzer combination is determined by the noise figure of the amplifier.
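These effects can be estimated with the standard back-of-the-envelope relations: DANL is roughly the thermal floor (-174 dBm/Hz at 290 K) plus noise figure plus 10·log10(RBW), and a preamplifier's effect follows the Friis cascade formula. The 24 dB analyzer noise figure and the preamplifier specs below are hypothetical:

```python
import math

def danl_dbm(noise_figure_db, rbw_hz):
    """DANL ~ -174 dBm/Hz thermal floor + noise figure + 10*log10(RBW)."""
    return -174 + noise_figure_db + 10 * math.log10(rbw_hz)

def cascaded_nf_db(nf_pre_db, gain_pre_db, nf_sa_db):
    """Friis formula for a preamplifier in front of the analyzer (two stages)."""
    f_pre = 10 ** (nf_pre_db / 10)
    g_pre = 10 ** (gain_pre_db / 10)
    f_sa = 10 ** (nf_sa_db / 10)
    return 10 * math.log10(f_pre + (f_sa - 1) / g_pre)

# Analyzer alone: 24 dB noise figure, 1 kHz RBW
print(danl_dbm(24, 1e3))                 # -120.0 dBm
# With a hypothetical 5 dB NF / 20 dB gain preamplifier in front,
# the system noise figure drops toward that of the preamplifier:
nf_sys = cascaded_nf_db(5, 20, 24)
print(danl_dbm(nf_sys, 1e3))             # noise floor drops by ~16 dB
```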
In many situations, it is necessary to measure the spurious signals of the device under test to make sure that the signal carrier falls within a certain amplitude and frequency “mask.” Modern spectrum analyzers provide an electronic limit-line capability that compares the trace data to a set of amplitude and frequency (or time) parameters. When the signal of interest falls within the limit-line boundaries, a display indicating PASS MARGIN or PASS LIMIT (on Agilent analyzers) appears. If the signal falls outside the limit-line boundaries, FAIL MARGIN or FAIL LIMIT appears on the display.

Hint 4. Optimize Dynamic Range When Measuring Distortion

An issue that comes up when measuring signals is the ability to distinguish the larger fundamental tone signals from the smaller distortion products. The maximum range over which a spectrum analyzer can distinguish between signal and distortion, signal and noise, or signal and phase noise is specified as the spectrum analyzer’s dynamic range. When measuring signal and distortion, the mixer level dictates the dynamic range of the spectrum analyzer. The mixer level used to optimize dynamic range can be determined from the second-harmonic distortion, third-order intermodulation distortion, and displayed average noise level (DANL) specifications of the spectrum analyzer. From these specifications, a graph of internally generated distortion and noise versus mixer level can be made.

Since the plotted distortion is the difference between the fundamental and the distortion product, the second-order distortion is drawn with a slope of 1: for every 1 dB change in mixer level, the second harmonic changes 2 dB, but only 1 dB relative to the fundamental. Similarly, the third-order distortion is drawn with a slope of 2: for every 1 dB change in mixer level, 3rd-order products change 3 dB, or 2 dB in a relative sense. The maximum 2nd- and 3rd-order dynamic range can be achieved by setting the mixer at the level where the 2nd- and 3rd-order distortions, respectively, are equal to the noise floor; these mixer levels can be identified on the graph.

To increase dynamic range, a narrower resolution bandwidth must be used. The dynamic range increases when the RBW setting is decreased from 10 kHz to 1 kHz; note that the increase is 5 dB for 2nd-order and 6+ dB for 3rd-order distortion.

Lastly, dynamic range for intermodulation distortion can be affected by the phase noise of the spectrum analyzer, because the frequency spacing between the various spectral components (test tones and distortion products) is equal to the spacing between the test tones. For example, with test tones separated by 10 kHz and a 1 kHz resolution bandwidth, it is the phase noise at a 10 kHz offset that sets the applicable noise curve. If the phase noise at a 10 kHz offset is only –80 dBc, then 80 dB becomes the ultimate limit of dynamic range for this measurement, instead of a maximum 88 dB dynamic range.
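The optimum mixer levels follow directly from the slopes described above: the 2nd-order line (slope 1) and 3rd-order line (slope 2) each cross the noise floor at the level where internal distortion equals noise. The intercept-point and DANL figures below are hypothetical, not specs of any particular analyzer:

```python
def optimum_mixer_levels(shi_dbm, toi_dbm, danl_dbm):
    """Optimum mixer level and maximum dynamic range for 2nd- and 3rd-order
    distortion. SHI: second-harmonic intercept; TOI: third-order intercept;
    DANL: displayed average noise level, all in dBm."""
    # 2nd order: internal distortion (dBc) = ML - SHI; set equal to DANL - ML
    ml2 = (shi_dbm + danl_dbm) / 2
    dr2 = (shi_dbm - danl_dbm) / 2
    # 3rd order: internal distortion (dBc) = 2*(ML - TOI); set equal to DANL - ML
    ml3 = (2 * toi_dbm + danl_dbm) / 3
    dr3 = 2 * (toi_dbm - danl_dbm) / 3
    return ml2, dr2, ml3, dr3

# Hypothetical specs: SHI +45 dBm, TOI +15 dBm, DANL -120 dBm (1 kHz RBW)
ml2, dr2, ml3, dr3 = optimum_mixer_levels(45, 15, -120)
print(ml2, dr2)   # -37.5 dBm mixer level, 82.5 dB 2nd-order dynamic range
print(ml3, dr3)   # -30.0 dBm mixer level, 90.0 dB 3rd-order dynamic range
```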

Hint 5. Identifying Internal Distortion Products

High-level input signals may cause internal spectrum analyzer distortion products that could mask the real distortion on the input signal. Using dual traces and the analyzer’s RF attenuator, you can determine whether or not distortion generated within the analyzer has any effect on the measurement. To start, set the input attenuator so that the input signal level minus the attenuator setting is about –30 dBm. To identify these products, tune to the second harmonic of the input signal and set the input attenuator to 0 dB. Next, save the screen data in Trace B, select Trace A as the active trace, and activate Marker Δ. The spectrum analyzer now shows the stored data in Trace B and the measured data in Trace A, while Marker Δ shows the amplitude and frequency difference between the two traces. Finally, increase the RF attenuation by 10 dB and compare the response in Trace A to the response in Trace B. If the responses in Trace A and Trace B differ, the analyzer’s mixer is generating internal distortion products due to the high level of the input signal, and more attenuation is required. If there is no change in the signal level, the internally generated distortion has no effect on the measurement: the distortion that is displayed is present on the input signal.
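The decision step of this procedure is a simple comparison; the 1 dB tolerance below is an assumed threshold for "no change," not a value specified in the note:

```python
def distortion_is_internal(trace_a_dbm, trace_b_dbm, tolerance_db=1.0):
    """Compare the distortion-product amplitude before (Trace B) and after
    (Trace A) a 10 dB increase in RF attenuation. A change larger than the
    tolerance means the analyzer's mixer is contributing internal distortion."""
    return abs(trace_a_dbm - trace_b_dbm) > tolerance_db

# Product reading moved from -70 dBm to -64 dBm after adding attenuation:
print(distortion_is_internal(-64.0, -70.0))    # True -> add more attenuation
# Reading essentially unchanged: the distortion is really on the input signal
print(distortion_is_internal(-70.2, -70.0))    # False
```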

Hint 6. Optimize Measurement Speed When Measuring Transients

Fast sweeps are important for capturing transient signals and minimizing test time. To optimize spectrum analyzer performance for faster sweeps, the parameters that determine sweep time must be changed accordingly. Sweep time for a swept-tuned superheterodyne spectrum analyzer is approximately proportional to the span divided by the square of the resolution bandwidth (RBW). Because of this, RBW settings largely dictate the sweep time. Narrower RBW filters translate to longer sweep times, which creates a tradeoff between sweep speed and sensitivity: a 10x reduction in RBW gives roughly a 10 dB improvement in sensitivity, but a much longer sweep.

A good balance between time and sensitivity is to use the fast Fourier transform (FFT) capability available in modern high-performance spectrum analyzers. By using FFT, the analyzer is able to capture the entire span in one measurement cycle. When using FFT analysis, sweep time is dictated by the frequency span instead of the RBW setting. Therefore, FFT mode provides shorter sweep times than swept mode in narrow spans. The speed difference is more pronounced when a narrow RBW filter is used to measure low-level signals. In FFT mode, the sweep time for a 20 MHz span and 1 kHz RBW is 2.2 s, compared to 24.11 s for swept mode. For much wider spans and wide RBWs, swept mode is faster.
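Under the span/RBW² approximation, the swept-mode figure quoted above can be roughly reproduced. The proportionality constant k is analyzer-dependent; k = 1.2 below is an assumed value chosen only so the numbers line up with the example:

```python
def swept_time_s(span_hz, rbw_hz, k=1.2):
    """Swept-mode sweep time approximation: t = k * span / RBW^2.
    k is an assumed, analyzer-dependent proportionality constant."""
    return k * span_hz / rbw_hz ** 2

print(swept_time_s(20e6, 1e3))    # 24.0 s, close to the 24.11 s quoted above
print(swept_time_s(20e6, 10e3))   # 0.24 s: a 10x wider RBW sweeps 100x faster
```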

Hint 8. Measuring Burst Signals: Time Gated Spectrum Analysis

How do you analyze a signal that consists of a bursted (pulsed) RF carrier that carries modulation when pulsed on? If there is a problem, how do you separate the spectrum of the pulse from that of the modulation? Analyzing burst signals (pulses) with a spectrum analyzer is very challenging because in addition to displaying the information carried by the pulse, the analyzer displays the frequency content of the shape of the pulse (pulse envelope) as well. The sharp rise and fall times of the pulse envelope can create unwanted frequency components that add to the frequency content of the original signal. These unwanted frequency components might be so bad that they completely obscure the signal of interest.

The EDGE waveform modulation is almost completely hidden by the pulse spectrum. In a time-gated measurement, the analyzer senses when the burst starts, then triggers a delay so the resolution filter has time to react to the sharp rise time of the pulse, and finally stops the analysis before the burst ends. By doing this, only the information carried by the pulse is analyzed, as is shown in Figure 24. It is now clear that the pulse contained modulation. Other types of time gating available in the modern high-performance spectrum analyzer are gated-video, gated-LO and gated-FFT. Gated-LO sweeps the local oscillator during part of the pulsed signal so several trace points can be recorded for each occurrence of the signal, whereas gated-FFT takes an FFT of the digitized burst signal, removing the effect of the pulse spectrum. Both provide the advantage of increased speed.
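The gated-FFT idea can be sketched in pure Python with a naive DFT: the spectrum of a whole record containing a tone burst includes the sidelobes of the pulse envelope, while a DFT taken only over a gate inside the burst concentrates the energy back at the carrier. All signal parameters here are made up for illustration:

```python
import math, cmath

def dft_mag(x):
    """Magnitude of the DFT of a real sequence (naive O(N^2), demo only)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

fs, n = 1000.0, 500                  # 1 kHz sample rate, 0.5 s record
tone = 100.0                         # carrier frequency of the burst
burst = [math.sin(2 * math.pi * tone * t / fs) if 100 <= t < 400 else 0.0
         for t in range(n)]          # RF burst: tone on for samples 100..399

full = dft_mag(burst)                # whole record: pulse envelope included
gated = dft_mag(burst[150:350])      # gate inside the burst: envelope excluded

def fraction_at_peak(mag):
    """Share of total (half-spectrum) energy sitting in the peak bin."""
    half = mag[:len(mag) // 2]
    return max(half) / sum(half)

# Gating concentrates the energy at the carrier instead of envelope sidelobes
print(fraction_at_peak(gated) > fraction_at_peak(full))   # True
```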

© Agilent Technologies, Inc. 2012, Published in USA, September 7, 2009, 5965-7009E

Some material in this appendix is reproduced with permission from Agilent Technologies.
Reproduced with permission, courtesy of Keysight Technologies, Inc.