
How does calibration take transmission line losses into account? And what do the parameters of the standards mean?

Question asked by Pavel_007 on Oct 1, 2017
Latest reply on Oct 3, 2017 by Pavel_007

Hello!

The model in use is the ENA Series Network Analyzer E5061B (300 kHz - 3 GHz).

First of all, I am interested in some details about changing the Calibration Kit Definition.

I've read all about it in the user's guide, but that hasn't answered all my questions; at least, I don't have a full understanding.


1) I can't figure out whether or not calibration takes into account the losses of the coaxial cables that connect the ports to the DUT, or to the standard in the case of calibration. I usually perform calibration with a pair of coaxial cables, each about 1 meter long, but sometimes I have to replace one of them with a slightly longer one. In that case I also do a full 2-port calibration with the new cable, but something still confuses me. Will the calibration be wrong or differ in some way? Will it affect my measurements? Do I need to change anything to execute the calibration correctly?

             I have some thoughts in mind; I hope they help you give me a good answer. Let's suppose the calibration won't be wrong and won't differ. Then what do we have? We swapped in a cable that introduces additional losses, and now those losses affect the measurements. If we measure S21, the signal coming through the longer cable will be attenuated more, so we'll get a smaller S21 than before we removed the shorter 1-meter cable, and the same applies to the other S-parameters. (A rough numeric sketch of this worry follows below.)
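Here is the arithmetic behind my worry, in case it makes the question clearer. The 0.5 dB/m cable loss and the -3 dB DUT are made-up example numbers, not real specifications; this is what an uncorrected reading would do:

```python
# Made-up numbers: 0.5 dB/m cable loss, a DUT with a true |S21| of -3 dB,
# a fixed 1 m cable on port 1 and a swappable cable on port 2.
loss_db_per_m = 0.5
dut_s21_db = -3.0

for port2_cable_m in (1.0, 2.0):
    # Each cable attenuates the signal once on its way through.
    raw_s21_db = dut_s21_db - loss_db_per_m * (1.0 + port2_cable_m)
    print(f"port-2 cable = {port2_cable_m:.0f} m -> raw |S21| = {raw_s21_db:.2f} dB")
```

So without any correction, the longer cable would make the DUT look 0.5 dB lossier than it really is.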

            Before vector network analyzers came out, you basically had to calculate the S-parameters yourself, so you could easily account for the cable losses; to do that, you first had to measure the cable's attenuation. I actually did that once at university. Now the network analyzer calculates the S-parameters, and there is no straightforward way to change this procedure. However, it seems necessary to me that the losses are taken into account somehow; otherwise it would be impossible to get a correct measurement. (I sketch what I mean below.)
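To make the question concrete, here is how I imagine one-port error correction might work. This is only my own sketch with invented error terms, not the instrument's actual internal model; the point is that the cable's loss and delay would end up inside the solved error terms and thus be removed from the corrected result:

```python
import numpy as np

# Pretend "true" error box: a lossy test cable plus an imperfect coupler.
s   = 0.9 * np.exp(-1j * 2.0)   # one-way cable transmission (loss + delay), invented
e00 = 0.02 + 0.01j              # directivity leakage, invented
e11 = 0.05 - 0.02j              # source match, invented
tr  = s * s                     # reflection tracking: the signal crosses the cable twice

def measured(gamma_actual):
    """Raw reflection the receiver would see through the error box."""
    return e00 + tr * gamma_actual / (1 - e11 * gamma_actual)

# Calibration step: "measure" three known standards (ideal definitions here).
A, b = [], []
for g_a in (1.0, -1.0, 0.0):            # open, short, load
    g_m = measured(g_a)
    # Linearized model: e00 + (g_a*g_m)*e11 - g_a*Delta = g_m,
    # where Delta = e00*e11 - tracking.
    A.append([1.0, g_a * g_m, -g_a])
    b.append(g_m)
E00, E11, Delta = np.linalg.solve(np.array(A), np.array(b))

# Measurement step: correct an arbitrary DUT reflection.
gamma_true = 0.3 * np.exp(0.7j)
g_m = measured(gamma_true)
gamma_corrected = (g_m - E00) / (g_m * E11 - Delta)
print(np.allclose(gamma_corrected, gamma_true))   # True: the cable drops out
```

If something like this happens on both ports in the full 2-port case, then redoing the calibration with the longer cable should absorb its extra loss as well. Is that right?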


2) Can you please explain what the parameters for the Standards really mean? For example, take a look at the reflection standard model provided in the User's Manual:

In that model there is a parameter called "t delay". The Guide says: "The delay occurs depending on the length of the transmission line between the standard to be defined and the actual measurement plane". But the length of the transmission line actually used (basically the length of the cable before the standard, I think) doesn't depend on the standard. I mean, a transmission line of any length can be used in the calibration, yet "t delay" stays the same in every case. It seems to me that a special calibration cable would have to be provided as part of the calibration kit. (I sketch my current reading of this parameter below.)
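My best guess is that the delay describes a short line built into the standard itself, between its connector interface and the actual short/open element, rather than my test cable, but I may be wrong. If so, I would expect the defined reflection coefficient of a SHORT to rotate with frequency like this (the 31.8 ps value is just an example, not from a real kit, and losses are ignored):

```python
import numpy as np

# One-way offset delay of a hypothetical SHORT standard (example value only).
t_delay = 31.8e-12   # seconds

for f in (0.3e9, 1.0e9, 3.0e9):
    w = 2 * np.pi * f
    # The wave crosses the built-in offset line twice, hence the factor of 2.
    gamma = -1.0 * np.exp(-2j * w * t_delay)
    print(f"{f/1e9:4.1f} GHz: angle(S11) = {np.degrees(np.angle(gamma)):7.2f} deg")
```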

            Most likely I've got something wrong, but frankly, that's why I am asking here.

            There are also a couple of other points. For example, there is no offset delay for the LOAD standard in the pre-defined calibration kits, but there is one for the SHORT and the OPEN. In the picture there is a parameter RL, but no such parameter anywhere else. And it seems a little strange to have L and C as parameters but not R. (Below is a sketch of how I read the C coefficients.)
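For instance, my reading of the C0...C3 coefficients is that they define a frequency-dependent fringing capacitance for the OPEN, which then fixes its defined reflection. The coefficient values and unit scaling below are assumed for illustration, not taken from a real kit:

```python
import numpy as np

Z0 = 50.0
# Hypothetical polynomial coefficients for the OPEN's fringing capacitance,
# in the usual kit-definition scaling: C0 in 1e-15 F, C1 in 1e-27 F/Hz,
# C2 in 1e-36 F/Hz^2, C3 in 1e-45 F/Hz^3 (values invented).
C0, C1, C2, C3 = 50e-15, 100e-27, -30e-36, 2e-45

for f in (0.3e9, 1.0e9, 3.0e9):
    C = C0 + C1*f + C2*f**2 + C3*f**3          # capacitance at this frequency
    zc = 1.0 / (2j * np.pi * f * C)            # impedance of that capacitance
    gamma = (zc - Z0) / (zc + Z0)              # defined reflection of the OPEN
    print(f"{f/1e9:4.1f} GHz: |S11| = {abs(gamma):.4f}, "
          f"angle = {np.degrees(np.angle(gamma)):6.2f} deg")
```

If that is right, it would explain why the OPEN carries C coefficients and the SHORT carries L coefficients, but I still don't see where R fits in.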
