Is there any detailed technical information available on how the 3070 tests capacitors, e.g. what the 'parallel' and 'serial' models are (and hence their differences), and what phase measurements are taken and how these are used to deduce the actual capacitance?
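For reference, here is how I understand the two models in standard LCR-meter terms (I'm assuming the 3070 follows the usual convention; I can't confirm this is exactly what it implements, and the function names below are just mine). The series model interprets the measured complex impedance as Rs + 1/(jwCs); the parallel model interprets its reciprocal (the admittance) as 1/Rp + jwCp. With a pure capacitor both give the same C; with any real resistance present they diverge, which may be part of what I'm seeing:

```python
import math

def series_model(z, f):
    """Interpret complex impedance z as Rs in series with Cs."""
    w = 2 * math.pi * f
    rs = z.real
    cs = -1.0 / (w * z.imag)   # z.imag is negative for a capacitive load
    return rs, cs

def parallel_model(z, f):
    """Interpret complex impedance z as Rp in parallel with Cp."""
    w = 2 * math.pi * f
    y = 1 / z                  # admittance: 1/Rp + jwCp
    rp = 1 / y.real
    cp = y.imag / w
    return rp, cp

# Example: 10uF in parallel with 270 ohm, measured at 128Hz
f, c, r = 128.0, 10e-6, 270.0
w = 2 * math.pi * f
z = 1 / (1/r + 1j * w * c)

print(parallel_model(z, f))  # recovers Rp = 270, Cp = 10uF exactly
print(series_model(z, f))    # reports Rs ~ 47 ohm, Cs ~ 12.1uF
```

Note how the series model over-reads the capacitance by some 20% on this network even with perfect measurements, simply because the model doesn't match the circuit topology.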
I have a tricky arrangement as a result of a design modification: a cap in series with a resistor, both in parallel with a second resistor, which of necessity must all be tested as a single capacitor. This mostly works, but occasionally I get unexplained behaviour. To help diagnose the problem I turned to the phase and magnitude readings shown by 'display MOA', but they neither made much sense nor helped me understand or cure the problem. So I decided to investigate further: I found a location on a board that is completely isolated from any other circuitry (to eliminate any 'outside interference') and placed a cap there to see if I could understand what the 3070 does. What I found was:
1. The measurements of the capacitor are way outside the specified accuracy: the best I got with a 10uF cap, using either model, was 15% from the nominal value, as opposed to the specified 2% (it's a 10% part). Even performing the calculations using the cap value as measured on a DVM, the best accuracy is still only about 12%.
2. The phase value shown in 'display MOA' has clearly been 'processed' in some way: adding a 270 ohm resistor in parallel with the 10uF cap (so the resistance and capacitive reactance are of the same order of magnitude at 128Hz) still results in a phase reading of -90 degrees!? [edit: use 'en' to see true phase] Without knowing what this phase processing is, so that I can account for it, the MOA phase display is to my mind almost worthless.
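To show why I say the -90 degree reading can't be the raw phase: the expected impedance phase of 270 ohm in parallel with 10uF at 128Hz is easy to work out from first principles (this is just textbook AC analysis, nothing 3070-specific):

```python
import cmath, math

f = 128.0            # test frequency quoted above
r, c = 270.0, 10e-6  # parallel resistor and capacitor

w = 2 * math.pi * f
z = 1 / (1/r + 1j * w * c)   # impedance of R || C

print(abs(z))                       # ~ 113 ohm
print(math.degrees(cmath.phase(z))) # ~ -65 degrees, nowhere near -90
```

So whatever 'display MOA' is showing, a true phase of about -65 degrees is coming out as -90, which is what makes me think the phase is normalised or otherwise processed before display.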
(The machine is regularly calibrated, and running 'autoadjust' didn't make any significant difference.)
Any help or input appreciated!