
Measuring peak of large signal at 20mV/Div setting

Question asked by RonWierckx on Nov 28, 2008
Latest reply on Dec 3, 2008 by RonWierckx
I am measuring the output of a D/A converter (AD669AR) using an Agilent DSO8104A.  The D/A was calibrated for offset and gain using DC values, and the calibration is confirmed correct when checked with a DC meter and the scope.  I am examining the peak of the waves being created by the D/A; I do this by selecting 20mV/div and offsetting the channel by 10 volts.  This allows me to see the uppermost part of the wave.  When the D/A is fed a 60Hz square wave or triangle wave, the output as measured on the scope is correct: it peaks at +10.00 volts.  The same is true if I examine the bottom part of the signal.  But if I feed the D/A a sine wave of the same magnitude, the measured peak is down by 40mV.  Increasing the frequency drops the measured peak further, and decreasing the frequency makes it rise; as I approach DC, it comes up to +10.00 volts.
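One thing that may matter here (rough back-of-envelope, my numbers, not anything measured): unlike a square wave, a sine only dwells near its peak briefly, so any bandwidth or averaging effect hits the sine peak harder. Here's how long a 10 V, 60 Hz sine actually stays within 40 mV of its peak each cycle:

```python
import math

# Back-of-envelope: time per cycle a 10 V-peak, 60 Hz sine spends
# within 40 mV of its peak (vs. a square wave, whose top is flat).
f = 60.0          # fundamental frequency, Hz
vpeak = 10.0      # D/A full-scale peak, V
dv = 0.040        # size of the observed error, V

theta = math.acos((vpeak - dv) / vpeak)   # half-angle where v(t) >= vpeak - dv
dwell = 2 * theta / (2 * math.pi * f)     # time per cycle spent near the peak
print(f"{dwell * 1e6:.0f} us per cycle")  # ~474 us out of a 16.7 ms period
```

So the sine is within 40 mV of +10 V for only about 3% of each cycle, which is why a sine peak is a much more delicate measurement than a square-wave top.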

My first thought was the D/A converter, but I measured the digital inputs and they are correct for all the wave types.  It is almost as if there is a low-pass filter on the output of the D/A converter, even though I am measuring right at the pin, which goes directly to a test point with no passives or anything else in between.  I've tried other boards of the same type, as well as boards of a different type that use the same D/A converter, with identical results.
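To put a number on that low-pass hunch (again just my own arithmetic, assuming a hypothetical single-pole filter): if an unintended RC pole were causing the 40 mV droop at 60 Hz, its cutoff would have to be surprisingly low, which would be easy to confirm or rule out with a frequency sweep:

```python
import math

# Hypothetical single-pole low-pass: |H(f)| = 1 / sqrt(1 + (f/fc)^2).
# Solve for the cutoff fc that would give a 40 mV droop on 10 V at 60 Hz.
f = 60.0                                  # test frequency, Hz
gain = (10.0 - 0.040) / 10.0              # observed gain, 0.996
fc = f / math.sqrt(1.0 / gain**2 - 1.0)   # required cutoff frequency
print(f"fc ~= {fc:.0f} Hz")               # roughly 670 Hz
```

A pole that low (a few hundred Hz) on a direct pin-to-testpoint trace seems implausible, which is part of why I suspect the measurement rather than the board.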

Am I using the scope incorrectly?  Is it a valid measurement to offset the input by 10V and set the scope to a small V/div?  We also have a LeCroy DSO, and it also shows an incorrect value, but in the other direction: about 100mV above what it should be (for the same signal the Agilent shows 40mV below).  Again, bringing the wave down from 60Hz toward DC makes the two scopes converge on the correct value.

I'm at wit's end.  Some people think it's the scope, some the D/A, some the measuring method.