
Tracking Errors and Calibration Methods

Question asked by SOLT_guy on May 5, 2011
Latest reply on May 8, 2011 by SOLT_guy
My network analyzer manual defines tracking error as:

"the vector sum of all test setup variations in which magnitude and phase change as a function of frequency"

A response calibration normalizes all measurements with respect to a reference measurement. Can you explain how the "division" operation with respect to the reference trace measurement has the operative effect of removing the setup's magnitude and phase variation as a function of frequency?
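
Here is how I currently picture it (please correct me if this model is wrong). Writing T(f) for the combined tracking term of the test setup (my own label, not the manual's):

```latex
M_{\mathrm{ref}}(f) = T(f), \qquad
M_{\mathrm{dut}}(f) = T(f)\, S_{21}(f)
\quad\Longrightarrow\quad
\frac{M_{\mathrm{dut}}(f)}{M_{\mathrm{ref}}(f)} = S_{21}(f)
```

so the complex division cancels T(f) point by point, and whatever magnitude and phase variation the setup adds versus frequency drops out. Is this the right way to see it?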

Can you give me a numerical example, at a single frequency, of how a response cal can correct for tracking error in magnitude and phase?
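
In case it helps, here is the kind of example I mean, sketched in Python with invented numbers at one frequency (values chosen only for illustration):

```python
import cmath, math

# Invented tracking error at one frequency: magnitude 1.2, phase +15 deg.
tracking = cmath.rect(1.2, math.radians(15))

# True DUT response: magnitude 0.5 (-6.02 dB), phase -30 deg.
s21_true = cmath.rect(0.5, math.radians(-30))

raw_ref = tracking              # reference trace measures the tracking term itself
raw_dut = tracking * s21_true   # raw DUT trace: magnitude 0.6, phase -15 deg

corrected = raw_dut / raw_ref   # 0.6 / 1.2 = 0.5 ; (-15) - (+15) = -30 deg
print(abs(corrected), math.degrees(cmath.phase(corrected)))   # 0.5  -30.0
```

Is that division (0.6 at -15 deg divided by 1.2 at +15 deg, giving 0.5 at -30 deg) the whole story, or is there more to it?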

With respect to a one-port calibration:

How can the tracking error terms e01 and e10 be directly measured and calculated in isolation? If the calculation involves a group delay term, please show the relation between group delay and e01 and e10.
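
For reference, here is my current understanding of the one-port model, sketched in Python with invented readings (e00 = directivity, e11 = source match, and the product e10*e01 = reflection tracking). As far as I can tell, the three standards determine only the product e10*e01, not e01 and e10 separately:

```python
# Invented raw reflection readings of the three standards at one frequency,
# assuming ideal standards: load = 0, short = -1, open = +1.
gm_load  =  0.02 + 0.01j
gm_short = -0.95 + 0.08j
gm_open  =  0.97 + 0.12j

# One-port error model: Gamma_m = e00 + (e10*e01*Gamma_a) / (1 - e11*Gamma_a)
e00 = gm_load              # load (Gamma_a = 0) reads directivity directly
A = gm_short - e00         # = -(e10*e01) / (1 + e11), from Gamma_a = -1
B = gm_open  - e00         # = +(e10*e01) / (1 - e11), from Gamma_a = +1
e11 = (A + B) / (B - A)    # source match
tracking = B * (1 - e11)   # reflection tracking: the PRODUCT e10*e01 only

print("e00     =", e00)
print("e11     =", e11)
print("e10*e01 =", tracking)
```

If that is right, then separating e01 from e10 needs an extra assumption, e.g. reciprocity (e10 = e01) and taking the square root of e10*e01, with the sign of the root chosen so that the resulting phase versus frequency is consistent with an estimated group delay tau_g = -(1/(2*pi)) * d(phase)/df. Is that the group delay connection you would use, or is there a more direct one?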
