Hi,
I'm trying to figure out the measurement uncertainty for the 53230A Universal Frequency Counter/Timer
The Data Sheet at:
http://www.home.agilent.com/agilent/redirector.jspx?action=obs&nid=959910.3.00&lc=eng&cc=US&ckey=1942617&pubno=5990-6283EN&ltype=LitStation&ctype=AGILENT_EDITORIAL&ml=eng
has a complicated spec starting on page 15. The specification is quite difficult; fortunately, there's an example worked out on page 19. The only problem is that it seems very, very wrong. I was trying to estimate the frequency error for a 100 MHz signal. If I use that example, it says the 90% confidence interval is an error of +/- 0.1176 s. So, for a 100 MHz signal, that puts my signal period between -0.1176 and +0.1176 seconds. In frequency, that says my signal is between -8.503 and +8.503 Hz.
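Just to show how I'm reading that example, here's the arithmetic as a quick Python sketch (the +/- 0.1176 s figure is my reading of the page-19 example):

# Sanity check of the worked example on page 19: if the error really were
# +/- 0.1176 s on the period of a 100 MHz signal, the implied bounds are nonsense.
f_in = 100e6              # input frequency, Hz
t_period = 1.0 / f_in     # nominal period = 10 ns
err = 0.1176              # error from the worked example, seconds

t_hi = t_period + err     # ~ +0.1176 s
t_lo = t_period - err     # ~ -0.1176 s (a negative period!)
print(1.0 / t_hi)         # ~ +8.503 Hz
print(1.0 / t_lo)         # ~ -8.503 Hz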
It looks like the timebase uncertainty is computed incorrectly. I think it should be (1/10 MHz) * 1.7 ppm, which is 0.17 ps. That would shift the basic accuracy at the 90% confidence interval to +/- 50.94 ps. In frequency, that puts the interval between 99.493181 MHz and 100.512008 MHz. That's closer, but it still puts my error at 5120 ppm. Is this correct? I thought I should be able to get within a few ppm.
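And here is the corrected calculation as I understand it, again just a sketch using my own numbers (the 1.7 ppm timebase error and the resulting +/- 50.94 ps at 90% confidence):

# Timebase term: one period of the 10 MHz timebase times 1.7 ppm.
f_tb = 10e6
tb_err = (1.0 / f_tb) * 1.7e-6     # = 0.17 ps

# Total period error at 90% confidence (my corrected figure).
f_in = 100e6
t_period = 1.0 / f_in              # 10 ns nominal period
err = 50.94e-12                    # +/- 50.94 ps

f_lo = 1.0 / (t_period + err)      # ~ 99.493181 MHz
f_hi = 1.0 / (t_period - err)      # ~ 100.512008 MHz
ppm = (f_hi - f_in) / f_in * 1e6   # ~ 5120 ppm
print(f_lo, f_hi, ppm)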
Which counter can I use to get less than 1 ppm measurement uncertainty at 100 MHz? I'll use a rock-solid reference oscillator.
Thanks,
Steve
By the way, I didn't see any reply to my message, nor any revision notes on the data sheet. That makes me nervous, since the data sheet seems to be the only place the accuracy specs are published; it seems Agilent can change the specs at any time without letting customers know. Perhaps some notes about revisions would be helpful.
Thanks,
Steve