

Question asked by miyamky on May 14, 2012
Latest reply on May 17, 2012 by tabbott
During a recent code review of a colleague's code, I noticed that for signals at frequencies > 3.2 GHz and power > -60 dBm, they issued a PRESEL CENTER command and re-read the measurement. I asked them about it (since my organization isn't doing this, and it slows down reading time significantly), and they indicated that Agilent told them this was a required setting under those conditions. So I set my sig gen to 5 GHz at a power of -50 dBm and measured. Then I hit the PRESEL CENTER command and noticed a -4.6 MHz adjustment and a reading higher by about 0.5 dB. Interestingly, the adjustment at 18 GHz was only off in the kHz range.
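For anyone scripting this, here is a minimal sketch of how the conditional centering described above could be wrapped in a helper that builds the command sequence. The 3.2 GHz / -60 dBm thresholds come from this thread; the specific SCPI mnemonics (`:POW:PCEN` for preselector centering, `:CALC:MARK1:X`/`:CALC:MARK1:Y?` for the marker) are my assumptions based on common PSA/X-series conventions — verify them against your analyzer's programming guide before relying on them.

```python
def needs_presel_center(freq_hz: float, power_dbm: float) -> bool:
    """Per the advice reported in this thread: above ~3.2 GHz the YIG-tuned
    preselector is in the signal path, and for signals above -60 dBm it
    should be centered before the marker is read."""
    return freq_hz > 3.2e9 and power_dbm > -60.0

def marker_read_sequence(freq_hz: float, expected_power_dbm: float) -> list[str]:
    """Build the SCPI command list for one marker amplitude reading.

    Mnemonics below are assumed, not taken from this thread -- check your
    instrument's programming guide.
    """
    cmds = [f":CALC:MARK1:X {freq_hz:.0f}"]        # park marker on the signal
    if needs_presel_center(freq_hz, expected_power_dbm):
        cmds.append(":POW:PCEN")                    # center the preselector (assumed mnemonic)
        cmds.append("*OPC?")                        # block until centering completes
    cmds.append(":CALC:MARK1:Y?")                   # read marker amplitude
    return cmds
```

With a VISA session open (e.g. via PyVISA), you would `write()` each command and `query()` the ones ending in `?`. The OP's 5 GHz / -50 dBm case triggers the centering step; an 18 GHz low-band... rather, a sub-3.2 GHz or very low-level signal skips it, which is why skipping it wholesale only bites at the higher frequencies and levels.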

So my question is: why is this setting optional, and how can I find other commands/settings that have additional requirements for the measurement to be accurate? Also, can someone please explain this command more clearly than the documentation does? I'll need to explain to my design engineers and upper management that the data we've been taking for the past few years is not necessarily accurate.