
Do modest phase errors of cal standards affect scalar measurements?

Question asked by drkirkby on Mar 26, 2013
Latest reply on May 23, 2013 by kenwong
If one knows the properties of an open standard *reasonably* well (say, capacitance to ±10%), would using such a standard cause an error in |S11| or |S21|? I read somewhere that scalar measurements would not be affected by inaccuracies in the standards, although I am sure there are limits to that.

I need to make a return loss measurement at 10.368 GHz on a device with a female N connector, but I only have an N cal kit specified to 6 GHz. The polynomial describing the fringing capacitance of the open indicates it is 119 fF at DC and 121 fF at 6 GHz.
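As a rough sanity check on how far off a constant-capacitance assumption might be (my own sketch, not part of the kit definition), one can linearly extrapolate between the two known values, 119 fF at DC and 121 fF at 6 GHz, out to 10.368 GHz:

```python
# Linear extrapolation of the open's fringing capacitance.
# The two anchor points (119 fF at DC, 121 fF at 6 GHz) come from the
# cal kit's polynomial; the straight-line extension is my assumption.
f_lo, c_lo = 0.0, 119e-15    # DC value, farads
f_hi, c_hi = 6e9, 121e-15    # 6 GHz value, farads
f_x = 10.368e9               # target frequency, Hz

slope = (c_hi - c_lo) / (f_hi - f_lo)
c_x = c_lo + slope * f_x
print(f"extrapolated C at 10.368 GHz: {c_x * 1e15:.1f} fF")
# roughly 122.5 fF, i.e. only about 1% above the constant 121 fF guess
```

If the capacitance keeps growing at anything like the 0–6 GHz rate, assuming a constant 121 fF looks like an error of order 1%, well inside the ±10% being contemplated.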

I was thinking of just assuming the properties at 10.368 GHz were the same as those at 6 GHz, so defining a user cal kit with:

C0=121, C1=0, C2=0, C3=0

I very much doubt the error will be as much as 10% doing this, so I'm wondering what (if any) effect this would have on a return loss measurement. 
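For what it's worth, here is a quick sketch of how a capacitance error maps into the open standard's reflection coefficient, assuming a 50 Ω system and a lossless open modelled as a shunt fringing capacitance (standard model, but the specific numbers below are mine):

```python
import cmath
import math

Z0 = 50.0          # reference impedance, ohms (assumed)
f = 10.368e9       # measurement frequency, Hz

def open_gamma(c):
    """Reflection coefficient of a lossless open modelled as shunt C."""
    zc = 1.0 / (1j * 2 * math.pi * f * c)   # capacitor impedance
    return (zc - Z0) / (zc + Z0)

c_nom = 121e-15
g_nom = open_gamma(c_nom)         # nominal capacitance
g_hi = open_gamma(1.1 * c_nom)    # capacitance 10% high

for label, g in (("nominal", g_nom), ("+10%", g_hi)):
    print(f"{label}: |Gamma| = {abs(g):.4f}, "
          f"phase = {math.degrees(cmath.phase(g)):.1f} deg")
# nominal phase is about -43 deg; a 10% capacitance error shifts the
# phase by roughly 4 deg, while |Gamma| stays exactly 1 (lossless model)
```

The point being: in this model a capacitance error leaves the open's reflection *magnitude* untouched and only rotates its phase, which is presumably why scalar measurements are said to be relatively tolerant of it, though the miscalculated error terms can still leak into |S11| to some degree.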

Buying an 85054B cal kit is just not an option financially. Calibrating with a 3.5 mm or APC-7 connector and using an adapter is an option, but I don't know if that would be worse than just using sub-optimal standards.