
Problems with GainRF compression settings

Question asked by anna.miskiewicz on Jan 11, 2008
Latest reply on Jan 11, 2008 by bafisher
Hello All,

I'm running the following simulation:
I need to determine the measurement time on a test chip required to estimate the S/N within a maximum acceptable error (say 0.5 dB).
The setup is as follows:
At the input there is one N_Tones block producing a tone of power Pin [dBm] at a given frequency; then I add noise with AddNDensity and filter the signal with a band-pass filter, BPF_ButterworthTimed.
Next there is a simple AGC in the form of a GainRF block (no compression), and the signal enters ADC_Timed.
After the ADC I save the signal in a Timed Sink and calculate the S/N in DataDisplay.
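
To check whether a hard amplitude limit somewhere in the chain could explain what I see, I reproduced the symptom outside the simulator with a minimal numerical sketch. Everything in it (sample rate, tone frequency, the Vref and bit values, the 50-ohm reference) is my own assumption, not taken from the ADS blocks; it only shows that an ideal quantizer with a fixed full-scale range produces exactly this kind of output ceiling:

import numpy as np

# Stand-in for the chain: tone -> clipping quantizer -> output power.
# All parameter values below are illustrative assumptions.
fs = 1.0e6            # sample rate [Hz]
f0 = 100.0e3          # tone frequency [Hz]
n = 2**14
t = np.arange(n) / fs

vref = 1.0            # assumed ADC full-scale voltage (the hard limit under test)
bits = 10             # assumed ADC resolution

def adc(x, vref, bits):
    # Ideal quantizer that clips at +/- vref.
    x = np.clip(x, -vref, vref)
    lsb = 2.0 * vref / 2**bits
    return np.round(x / lsb) * lsb

for pin_dbm in range(-20, 21, 5):
    amp = np.sqrt(2 * 50 * 1e-3 * 10**(pin_dbm / 10.0))   # peak volts into 50 ohm
    y = adc(amp * np.sin(2 * np.pi * f0 * t), vref, bits)
    pout_dbm = 10 * np.log10(np.mean(y**2) / (50 * 1e-3)) # output power [dBm]
    print(pin_dbm, "dBm in ->", round(pout_dbm, 2), "dBm out")

Above the assumed full scale (about +10 dBm into 50 ohm here) the output power stops following the input, which looks very much like the compression I observe.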
The problem I encounter is the following:
Although none of the blocks should introduce any compression (it is deactivated for GainRF), I see a compression effect once the signal exceeds a certain level, and that level is constant across different simulation times.
To me this suggests some type of compression, but I cannot tell which block is responsible.
Do you have any idea where the problem is, or how I could simulate the measurement-time question described above?
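
Regarding the measurement-time part, my current reasoning is that the S/N error is dominated by the statistical error of the noise-power estimate, whose standard deviation in dB should shrink roughly as 1/sqrt(N) with the number of independent samples N averaged. A quick Monte Carlo sketch (again, all values are assumptions on my part) seems to confirm this:

import numpy as np

# Error of a noise-power (hence S/N) estimate vs. number of samples averaged.
# White unit-power Gaussian noise; all parameters are illustrative assumptions.
rng = np.random.default_rng(0)
trials = 1000

for n in (100, 1000, 10000):
    p_hat = np.array([rng.standard_normal(n).var() for _ in range(trials)])
    err_db_std = (10 * np.log10(p_hat)).std()       # true power is 0 dB
    theory = 10 / np.log(10) * np.sqrt(2.0 / n)     # small-error approximation
    print(n, "samples:", round(err_db_std, 3), "dB std (theory", round(theory, 3), "dB)")

By this reasoning, a 0.5 dB (one-sigma) error needs N of about 2*(4.34/0.5)^2, roughly 150 independent samples; with a noise bandwidth B, independent samples accrue at roughly 2B per second, so the measurement time would be on the order of 150/(2B). Does this look like a reasonable way to size the measurement time?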
