In AN 150-4, the following points are made about measuring the C/N of a CW signal:
Only the 10 kHz RBW is calibrated, and the actual (noise) bandwidth is 1.2 x 10 kHz = 12 kHz.
(Agilent) spectrum analyzers use envelope detectors rather than RMS detectors, which introduces an error of 1.05 dB.
Logarithmic shaping reduces the displayed level of the measured noise.
The combined effect of the envelope detector and log shaping is a 2.5 dB error in the measured noise level: the actual noise level is 2.5 dB higher than shown on the analyzer.
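The arithmetic behind these figures can be checked directly. A minimal sketch, assuming (per the app note) that the 2.5 dB total splits into 1.05 dB from envelope (voltage) detection of noise plus 1.45 dB from logarithmic shaping, with the bandwidth factor as a separate correction:

```python
# Check the arithmetic behind the AN 150-4 figures quoted above.
# Assumed breakdown: 1.05 dB (envelope detection of noise) plus
# 1.45 dB (logarithmic shaping) gives the 2.5 dB under-response.
import math

envelope_error_db = 1.05   # envelope detector under-response on noise
log_shaping_db = 1.45      # additional under-response from log shaping
total_db = envelope_error_db + log_shaping_db
print(f"total under-response: {total_db:.2f} dB")       # 2.50 dB

# The bandwidth factor is a separate correction: a noise power
# bandwidth of 1.2 x RBW adds 10*log10(1.2) dB to a noise
# density calculation.
bw_correction_db = 10 * math.log10(1.2)
print(f"bandwidth correction: {bw_correction_db:.2f} dB")  # ~0.79 dB
```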
Do these conditions still apply to current Agilent SAs?
User xmo is correct.
In the X-Series analyzers (the current family of analyzers offered by Agilent: http://www.agilent.com/find/x-series), the RBWs are calibrated according to the specifications guide for the instrument. It is not just the 10 kHz RBW.
For example, the N9020A MXA specifications guide (http://www.agilent.com/find/mxa_specifications) states the following for resolution bandwidth specs:
From 1 Hz to 750 kHz, the RBWs are measured for all center frequencies (CFs) and are guaranteed under warranty to have a 3 dB bandwidth accuracy within +/- 1.0%.
From 820 kHz to 1.2 MHz, the RBWs are measured for all CFs up to 3.6 GHz only and are guaranteed under warranty to have a 3 dB BW accuracy within +/- 2.0%.
Above 1.2 MHz RBW, the specifications are not under warranty and are not measured for every instrument. We have done some measurements, however, to give expected performance for the bandwidth accuracy. The results indicate the expected accuracy to be better than +/- 0.25 dB.
The 1.2x correction that you are referring to is not so much the "actual" bandwidth as it is the "noise power" bandwidth as defined in AN 150-4. This is a definition that still holds for noise-based measurements with current analyzers.
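The noise power bandwidth is defined as the integral of the filter's power response divided by its peak power response. As a sketch of that definition, assuming an ideal Gaussian RBW filter shape (an approximation; real filter shapes differ, and the 1.2 factor quoted above was typical of older analog filters):

```python
# Sketch: equivalent noise power bandwidth of an RBW filter,
# NBW = integral(|H(f)|^2 df) / max(|H(f)|^2), evaluated here
# for an idealized Gaussian filter shape (an assumption).
import numpy as np

rbw_3db = 10e3                                # 3 dB bandwidth, Hz
sigma = rbw_3db / (2 * np.sqrt(np.log(2)))    # Gaussian width from 3 dB BW
f = np.linspace(-10 * rbw_3db, 10 * rbw_3db, 200001)
h_power = np.exp(-(f / sigma) ** 2)           # power response, peak = 1

nbw = h_power.sum() * (f[1] - f[0])           # numeric integral, Hz
print(f"NBW / 3 dB BW = {nbw / rbw_3db:.4f}") # ~1.0645 for a Gaussian
```

For a Gaussian shape the ratio works out to about 1.065 rather than 1.2, which is why modern analyzers use the measured noise bandwidth of their actual filters instead of a fixed factor.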
Let me refer you to the application note Spectrum and Signal Analyzer Measurements and Noise, http://cp.literature.agilent.com/litweb/pdf/5966-4008E.pdf. This app note discusses the measurements you wish to make and the enhancements offered by the X-Series analyzers. For noise power measurements, refer to page 9. It specifically states that most analyzers have a noise marker to account for the difference in definition between the RBW and the noise power bandwidth, as well as the differences due to the detectors and averaging used (i.e. the offset errors due to envelope/voltage detection and log averaging).

In fact, the X-Series analyzers have a noise marker which accounts for all of those differences and reports the actual noise power in dBm/Hz (normalized to 1 Hz). For this type of measurement, it is best to use a sample or average detector, because peak detection will bias the results. The "marker noise" function in the X-Series analyzers (under the Marker Function hard key) will automatically change the detector to average when the function is selected, unless the user manually overrides the detector choice.
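To make the corrections concrete, here is a minimal sketch of the kind of arithmetic a noise marker performs, using the figures from this thread (2.5 dB detector/log under-response, noise power bandwidth = 1.2 x RBW); a real analyzer uses the measured noise bandwidth of its own filters, and the helper name is purely illustrative:

```python
# Illustrative noise-marker arithmetic (not analyzer firmware):
# add back the under-response, then normalize to a 1 Hz bandwidth.
import math

def noise_density_dbm_per_hz(displayed_dbm, rbw_hz, nbw_factor=1.2,
                             under_response_db=2.5):
    nbw_hz = nbw_factor * rbw_hz
    return displayed_dbm + under_response_db - 10 * math.log10(nbw_hz)

# Example: -70 dBm of noise displayed in a 10 kHz RBW
print(f"{noise_density_dbm_per_hz(-70, 10e3):.2f} dBm/Hz")   # -108.29 dBm/Hz
```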
With regard to the effects of envelope detection and log averaging, there have been many advances in spectrum analysis technology that make it much easier to avoid manual corrections to the displayed results. For example, the envelope detector (and the RBW filters) is now implemented digitally, which lets the user choose the type of detection to apply. On page 15, the app note states that the X-Series analyzers can measure on an average power (RMS) scale, without the effects of the envelope detector or log averaging.
To do this, we simply set the detector to average. Then the X-Series analyzers have an option for a user-determined average type, under the Meas Setup hard key. The user can select whether the results are displayed on a linear (voltage), power (RMS), or logarithmic averaging scale.
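The difference between those three scales is easy to demonstrate with a simulation (this is an illustration of the statistics, not analyzer code): for Gaussian noise, averaging the envelope voltage under-reads the true RMS power by about 1.05 dB, and averaging the log-scale values under-reads it by about 2.51 dB, matching the offsets discussed above.

```python
# Simulate envelope detection of Gaussian noise and compare the
# three averaging scales (power/RMS, linear/voltage, logarithmic).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
# complex baseband Gaussian noise; its envelope is Rayleigh distributed
env = np.abs(rng.normal(size=n) + 1j * rng.normal(size=n))

power_avg_db = 10 * np.log10(np.mean(env ** 2))   # true RMS power
voltage_avg_db = 20 * np.log10(np.mean(env))      # linear (voltage) scale
log_avg_db = np.mean(20 * np.log10(env))          # logarithmic scale

print(f"voltage-avg under-reads by {power_avg_db - voltage_avg_db:.2f} dB")  # ~1.05
print(f"log-avg under-reads by {power_avg_db - log_avg_db:.2f} dB")          # ~2.51
```

Only the power (RMS) scale reports the true noise power; the other two scales need the fixed offsets added back.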
Note that since the video filter is also implemented digitally, the video filter is automatically coupled with the averaging type that is selected by the user. This ensures that the video filter does not incorrectly bias the results.
Another way to perform averaging is to select the sample detector and use trace averaging. The trace averaging feature is also coupled to the averaging mode that is selected by the user.
Please let me know if I can be of additional assistance.
Best Regards,
Scott