With a CXA N9000A, I'm measuring a narrow signal at 30 MHz with a strength of ~3 mV. I've attached two videos of the signal on a linear scale. The videos were taken under identical conditions except for the frequency span, which is 10 Hz in one and 60 MHz in the other.
I'm a little confused trying to interpret the behavior. To me, the noise level looks like about 100 µV or less in both. With the 10 Hz span, the signal stays very constant, within about 100 µV, as I would expect; this continues over a much longer time than shown here. But with the 60 MHz span, the reading jumps around wildly between 2 and 4 mV. I don't understand why it should vary so much when the SNR looks very good and the more sensitive measurement shows that the signal itself is not changing.
I know this is a vague question, but I'm hoping you might have some ideas. Does this kind of behavior make sense to you? Is there some context in which it would make sense?
As far as I know, the signal should be fairly simple: a sum of maybe three sinusoidal components at something like 30, 50, and 70 MHz.
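In case it's relevant, here's a quick numeric sketch of what I mean by the signal's composition. The equal 1 mV amplitudes and zero relative phases are pure guesses on my part; the point is just that the instantaneous amplitude of such a composite swings well above and below any single component's level, even though each component is perfectly constant:

```python
import numpy as np

# Hypothetical model: three equal-amplitude, in-phase tones at 30/50/70 MHz.
# The 1 mV per-component amplitude is a guess, not a measured value.
fs = 1e9                        # 1 GS/s simulation sample rate
t = np.arange(0, 1e-6, 1 / fs)  # 1 us of signal (10 periods of the 10 MHz beat)
freqs = [30e6, 50e6, 70e6]      # assumed component frequencies
x = sum(1e-3 * np.sin(2 * np.pi * f * t) for f in freqs)

peak_mV = np.max(np.abs(x)) * 1e3
print(f"peak instantaneous amplitude: {peak_mV:.2f} mV")
```

So a detector that responds to the composite envelope could legitimately read differently from one that resolves a single component, though I don't know if that's what's happening here.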
Any ideas or specific questions that might help me get to the bottom of this?