I'm using a 33600A Series waveform generator to produce a carrier that is frequency-modulated with Gaussian noise (i.e. Gaussian noise applied to the instantaneous frequency).
There are two parameters attached to this Gaussian noise: the frequency deviation and the bandwidth. I'm not sure I understand the meaning of those terms, and I cannot find a clear mathematical definition of this Gaussian noise in the docs. For my application I need to know exactly how it is defined.
I need 'real' Gaussian noise. By 'real' I mean that the RF spectrum is Gaussian even if I sample it during a very short time window (typically 100 fs).
I have the feeling that the arbitrary function generator does not act this way, but instead picks one frequency from a Gaussian distribution (with a width related to the frequency deviation) and switches to another one after a short time (related to the bandwidth). As a consequence, the RF spectrum would be Gaussian only when integrated over a sufficiently long time.
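To make my hypothesis concrete, here is a quick NumPy sketch of the mechanism I suspect (this is only my guess at what the instrument does, and all parameter values are made up, not taken from the 33600A docs): the instantaneous frequency is held constant for a dwell time of roughly 1/bandwidth, then a new value is drawn from a Gaussian whose standard deviation is the frequency deviation.

```python
import numpy as np

# Hypothetical parameters (assumptions for illustration, not from the docs):
fs = 1e6          # simulation sample rate, Hz
duration = 0.1    # total simulated time, s
fc = 100e3        # carrier frequency, Hz
f_dev = 10e3      # frequency deviation = std of the Gaussian, Hz
bandwidth = 10e3  # noise bandwidth -> dwell time ~ 1/bandwidth

rng = np.random.default_rng(0)
n = int(fs * duration)
dwell = int(fs / bandwidth)            # samples per frequency "step"
n_steps = n // dwell + 1

# Hypothesized mechanism: piecewise-constant frequency offset drawn
# from a Gaussian, switched every 1/bandwidth
f_inst = np.repeat(rng.normal(0.0, f_dev, n_steps), dwell)[:n]

# Integrate the instantaneous frequency to get the phase, then the signal
phase = 2 * np.pi * np.cumsum(fc + f_inst) / fs
signal = np.cos(phase)

# Spectrum over a single dwell window: just one tone at fc + f_inst[0],
# not a Gaussian-shaped spectrum
win = signal[:dwell]
spec = np.abs(np.fft.rfft(win))
f_peak = np.fft.rfftfreq(dwell, 1 / fs)[np.argmax(spec)]
```

Under this model, the histogram of the instantaneous frequency over a long record is Gaussian with width `f_dev`, but the spectrum of any window shorter than the dwell time collapses to a single line, which is exactly the behaviour I want to avoid.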
Any ideas?