Can anyone point me to scholarly articles on the effect of a non-standard load (i.e., not equal to 50 + j0 ohms) on the calibration process for vector network analyzers? How does a non-standard load skew the resulting calibration? I know that by working through the math of the signal flow graphs one can determine the result, but I'm not up to solving those equations right now and am wondering if anyone has written a paper or two describing the effects.

Regards,

Karin

I cover these cases in section 3.9 of my book, where you will discover that, if all other errors are small, the error in the load also becomes the error in the source match and the error in the load match. This affects return loss as described in equation 3.86, where the load error becomes EDF_R and ESF_R, with ERF being small. It will affect transmission uncertainty as described in equations 3.99 through 3.101; in this case the only significant errors are ESF_R and ELF_R.

The EDF_R is a measure of the error in the load: it is the magnitude of the load's reflection coefficient (linear return loss). So if you had a 45 ohm load on a 50 ohm system, your error would be

EDF_R = |(45 - 50)/(45 + 50)| = 0.053 linear, or about 26 dB residual load match.
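For anyone who wants to plug in their own load values, here is a small sketch of that arithmetic (my own illustration, not code from the book):

```python
# A quick sketch: residual load match implied by an imperfect
# calibration load's resistance, per Gamma = (Z - Z0)/(Z + Z0).
import math

def residual_load_match(z_load, z0=50.0):
    """Return (|Gamma| linear, return loss in dB) for a load impedance."""
    gamma = abs((z_load - z0) / (z_load + z0))
    return gamma, -20 * math.log10(gamma)

gamma, rl_db = residual_load_match(45.0)
print(f"{gamma:.3f} linear, {rl_db:.1f} dB")  # 0.053 linear, 25.6 dB
```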

Note that in general the load impedance is complex, and the error will often be larger due to parasitic L or C than the error in the real (DC) value.
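To illustrate that point (with values I have assumed for the example, not taken from the book): a load that measures exactly 50 ohms at DC but carries a small series parasitic inductance can show a larger reflection at microwave frequencies than the 45 ohm DC error above:

```python
# Sketch with assumed values (0.1 nH series parasitic, 10 GHz) showing
# how a DC-perfect 50 ohm load degrades with frequency.
import math

def gamma_mag(z_load, z0=50.0):
    """Magnitude of the reflection coefficient for a (complex) load."""
    return abs((z_load - z0) / (z_load + z0))

f = 10e9                                  # test frequency (assumed)
L = 0.1e-9                                # parasitic inductance (assumed)
zl = 50.0 + 1j * 2 * math.pi * f * L      # Z = 50 ohms + j*omega*L
print(f"|Gamma| = {gamma_mag(zl):.3f}")   # 0.063, worse than the 45 ohm DC case
```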

REF: Handbook of Microwave Component Measurements, http://www.wiley.com/WileyCDA/WileyTitle/productCd-1119979552.html