We are using the N9020A to calibrate and test the power output of signal sources and have noticed an odd phenomenon. When we test a signal source with the same spectrum analyzer that was used to calibrate it, the results are centered around the nominal value as expected. However, when a different analyzer is used to test the source, the results are offset by roughly 0.5 to 0.7 dB (after compensating for line losses between the source and the analyzer).
Any idea what could cause this?
Very likely, what you are seeing is the difference in absolute amplitude accuracy between the two MXAs. It is difficult to say whether what you are seeing is within specification without knowing the nominal amplitude and frequency of the signals being measured. For example, if you are measuring a -20 dBm signal at 50 MHz, the absolute amplitude specification for the N9020A over a 20 to 30 degC range is +/-0.33 dB. This means that two analyzers measuring this same 50 MHz signal could theoretically show a difference of as much as 0.66 dB and both be within specification, although having two analyzers at opposite extremes of the spec is highly unlikely.
However, if this measurement were being made at 5 GHz, the absolute amplitude accuracy spec would be +/-0.33 dB plus +/-1.5 dB, for a total of +/-1.83 dB. Two MXAs measuring this same signal could show a difference of as much as 3.66 dB. Seeing a difference of between 0.5 dB and 0.7 dB at this frequency is entirely reasonable.
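To make the arithmetic explicit: the worst-case analyzer-to-analyzer difference is simply twice the absolute amplitude accuracy tolerance, since one unit could read at the top of the allowed band while the other reads at the bottom. Below is a minimal Python sketch of that calculation using the example tolerance figures quoted above; they are not a substitute for the numbers in the N9020A datasheet for your actual frequency, amplitude, and temperature range.

```python
# Worst-case reading difference between two analyzers that each meet the
# same absolute amplitude accuracy specification.
# The tolerance values below are the example figures from the discussion
# above (20-30 degC, -20 dBm input); check the N9020A datasheet for your
# own measurement conditions.

def worst_case_difference(tolerance_db: float) -> float:
    """One unit may read +tolerance and the other -tolerance, so the
    worst-case spread between their readings is twice the tolerance."""
    return 2.0 * tolerance_db

# 50 MHz example: spec of +/-0.33 dB
print(worst_case_difference(0.33))        # 0.66 dB

# 5 GHz example: +/-0.33 dB plus a +/-1.5 dB frequency response term
print(worst_case_difference(0.33 + 1.5))  # 3.66 dB
```

An observed offset of 0.5 to 0.7 dB therefore sits comfortably inside the worst-case spread at 5 GHz.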
Regards -