Hi, all,


I'm doing a yield analysis. As a first attempt, I chose 'discrete type' in the Tune/Opt/**Stat**/DOE Setup, where there is a parameter 'step value'. On the other hand, the 'Yield' controller has another parameter, 'NumsIters'. I ran a couple of trials on a test circuit with a capacitor as the only statistics-enabled component. For example, the discrete-type setup is as follows: Cmin=72 fF, Cmax=88 fF, and 'step value'=8 fF, so there are only 3 data points for the capacitor, while 'NumsIters' is set to 250. How do these 250 trials work when only 3 data points are available? Does the software do any sort of automatic data interpolation beyond the user-specified 'step value'?
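(For anyone puzzling over the same question: I don't know the ADS internals, but a plausible model is that each iteration draws a value *with replacement* from the discrete grid, so 250 trials simply revisit the 3 points many times and no interpolation is needed. A minimal sketch of that assumption:)

```python
import random

# Hypothetical sketch (not the actual ADS algorithm): treat each discrete
# yield iteration as an independent draw from the 3-point grid.
# Grid from the post: Cmin=72 fF, Cmax=88 fF, step=8 fF -> [72, 80, 88].
grid_fF = [72 + 8 * k for k in range(3)]

# 250 iterations sample with replacement, so repeats are expected.
trials = [random.choice(grid_fF) for _ in range(250)]

counts = {v: trials.count(v) for v in grid_fF}
print(counts)  # roughly 250/3 ~ 83 hits per grid point
```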

I have a further question about the discrete-type yield analysis. Suppose I have three components with statistics enabled, e.g.:

1. Inductor (L1), 3 data points;

2. Capacitor (C1), 5 data points;

3. Capacitor (C2), 10 data points;

And suppose the number of iterations ('NumsIters') is 50. Will the simulator then randomly pick 50 combinations out of the 150 (i.e., 3x5x10) possibilities?
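(A small sketch of what I *assume* is happening with those three components: each iteration picks one value per component independently, i.e., a draw with replacement from the 3x5x10 = 150-point grid, so the 50 draws are not guaranteed to be 50 distinct combinations. The component values below are made up for illustration.)

```python
import random

# Assumed model of discrete-type sampling over three components.
# Placeholder value grids (arbitrary numbers, only the counts matter):
L1_vals = [1.0 + 0.1 * k for k in range(3)]   # inductor L1: 3 points
C1_vals = [10 + k for k in range(5)]          # capacitor C1: 5 points
C2_vals = [20 + k for k in range(10)]         # capacitor C2: 10 points

iters = 50
samples = [(random.choice(L1_vals),
            random.choice(C1_vals),
            random.choice(C2_vals)) for _ in range(iters)]

# 50 draws from the 150-point grid; duplicates are possible.
print(len(samples), len(set(samples)))
```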

Best Regards,

yanyujin

I tried the yield simulation (whether discrete-type or Gaussian-type, with small or large samples) at different times, and it always gives different results (e.g., different histograms), although the trend does not change significantly. In practice, then, what is the confidence level of this yield analysis? Should we also average over multiple runs for a robust result?

Best Regards,

yanyujin

There are all kinds of guidelines for setting this number in the literature, and the ADS manual has some too (you may want to check this out). As a crude way of setting this number, try a value you feel comfortable with based on your expectation of what the yield will approximately be; if you have no feel for it, just run a baseline simulation first. Once you know the estimated yield, allow for, say, 32 failures to make the sample of failures statistically meaningful. If the estimated yield is 95%, then run at least 32/(1-Yield) = 32/(1-0.95) = 640 samples.
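That rule of thumb is a one-liner; `min_samples` below is just a name I've given it for illustration:

```python
# Rule of thumb from above: run enough samples that the expected number
# of failures reaches some target (32 here), i.e.
#   N >= target_failures / (1 - estimated_yield).
def min_samples(estimated_yield, target_failures=32):
    return target_failures / (1.0 - estimated_yield)

print(min_samples(0.95))  # about 640 samples for a 95% estimated yield
```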

While this is not very scientific, in my experience it works well, especially if the simulation runs slowly due to circuit complexity. However, if you have a fast-running simulation, thousands of samples is truly the way to go.

Also, keep in mind you can always increase the number of samples to improve on the initial estimate. Personally, I like to run a second analysis in which I double the number of iterations; if the yield moves very little, that means the lower number of iterations had already converged on an accurate yield number. If the yield changes significantly, I double again and check. I keep repeating until I am sure the number is not changing significantly; I usually hit the mark by the second or third attempt.
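The doubling procedure can be sketched as a loop; `run_yield_trial` below is a stand-in I made up for the actual simulator pass/fail check, with an assumed 95% true yield:

```python
import random

# Stand-in for one simulator iteration (assumed ~95% true yield).
def run_yield_trial():
    return random.random() < 0.95

def estimate_yield(n_iters):
    passes = sum(run_yield_trial() for _ in range(n_iters))
    return passes / n_iters

# Doubling strategy: rerun with 2x the iterations until the yield
# estimate stops moving by more than a chosen tolerance.
def converged_yield(start_iters=250, tol=0.01, max_rounds=5):
    n = start_iters
    prev = estimate_yield(n)
    for _ in range(max_rounds):
        n *= 2
        cur = estimate_yield(n)
        if abs(cur - prev) < tol:
            return cur, n
        prev = cur
    return prev, n

y, n = converged_yield()
print(f"yield ~ {y:.3f} after {n} iterations")
```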