Hi guys, I found a strange thing. I have a DSO90254A (2.5 GHz bandwidth, 20 GSa/s). When I switched the sampling rate from 10 GSa/s to 20 GSa/s, the voltage range shown on the screen suddenly dropped to much smaller values (the scale per division was the same). I don't know why; that's really strange. Does that mean the 20 GSa/s sampling rate is unreliable? Please just tell me why, thank you!
I'm not sure I follow you. Do you have a waveform on screen that decreases in amplitude when you decrease the sampling rate to 10 GSa/s? If so, what is the data rate of the signal you are viewing? If you could post before and after screenshots that would be very helpful.