I need to measure the energy capacity of some AA batteries using a fixed 100 ohm resistor as the load. I'll just be measuring the voltage across the resistor, with a reading taken every second until the AA battery drops to 800 mV, at which point I need the meter to stop taking readings.
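For reference, this is roughly how I plan to turn the logged readings into capacity numbers afterwards; a minimal sketch, assuming one voltage sample per second exported as a plain list (names here are just placeholders, not anything from the meter):

```python
# Post-processing sketch: turn one-per-second voltage samples into
# charge and energy figures, assuming a fixed 100 ohm load.
R_LOAD = 100.0   # ohms, the fixed load resistor
DT = 1.0         # seconds between samples (one reading per second)

def capacity_from_log(voltages):
    """voltages: battery terminal voltages in volts, one sample per second."""
    charge_c = sum(v / R_LOAD * DT for v in voltages)      # coulombs  (I = V/R)
    energy_j = sum(v * v / R_LOAD * DT for v in voltages)  # joules    (P = V^2/R)
    return charge_c / 3.6, energy_j / 3.6                  # convert to mAh, mWh

# Example with made-up readings:
mah, mwh = capacity_from_log([1.504, 1.498, 1.493])
print(f"{mah:.4f} mAh, {mwh:.4f} mWh")
```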
I need to know the following:
1) Do I need the DIG upgrade?
2) If so, can I set up a meter trigger event so that at 800 mV it stops taking any more readings? (See the sketch after this list for the kind of behaviour I have in mind.)
3) What exactly is meant by readings/sec when logging data from a source? Does it read for one second and then turn off for one second, or does it read for one second and then immediately try to read for another second with as little dead time as possible between reads (i.e. not a 50% duty cycle of read time/no-read time)?
4) If it does read for one second by default, and the voltage drops from, say, 1.5040 V to 1.4980 V as the battery is drained by the resistor, what number does it put in the log? The average, min, max, or an integrated value?
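For question 2, if the meter can't stop itself with a built-in limit/trigger event, I assume the same thing could be done from a PC over the remote interface. A rough sketch of the polling approach I mean, with `read_voltage()` standing in for whatever query the instrument actually uses (hypothetical, not a real driver call):

```python
import time

CUTOFF_V = 0.800        # stop once the battery sags to 800 mV
SAMPLE_PERIOD_S = 1.0   # one reading per second

def read_voltage():
    """Placeholder for the actual instrument query (e.g. a SCPI read over
    the remote interface); hypothetical, the real command depends on the meter."""
    raise NotImplementedError

def log_until_cutoff():
    log = []
    start = time.monotonic()
    while True:
        v = read_voltage()
        log.append((time.monotonic() - start, v))
        if v <= CUTOFF_V:           # terminate logging at/below 800 mV
            break
        time.sleep(SAMPLE_PERIOD_S)
    return log
```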
I think I confused people on point 3. What I am trying to say is that if I set my meter to, for example, 10 readings per second, that's one reading every 1/10 of a second, but there will be dead time before the instrument can take the next reading: the analog signal has to be digitized by the ADC, written to a memory buffer, and displayed on the screen, all of which adds delay before another reading can start. This means some of the analog signal will be missed.
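To put rough numbers on what I mean, a toy calculation; the aperture (ADC integration time) and per-reading overhead below are just assumptions for illustration, since the real values depend on the meter and its NPLC setting:

```python
# Toy dead-time calculation: how much of the signal is actually observed
# at a given reading rate? All timing numbers below are assumptions.
READINGS_PER_SEC = 10
SAMPLE_PERIOD_MS = 1000 / READINGS_PER_SEC   # 100 ms between reading starts
APERTURE_MS = 16.7                           # assumed: ~1 power-line cycle at 60 Hz
OVERHEAD_MS = 2.0                            # assumed: digitize/buffer/display time

busy_ms = APERTURE_MS + OVERHEAD_MS          # time the meter is occupied per reading
idle_ms = SAMPLE_PERIOD_MS - busy_ms         # time it waits for the next reading
coverage = APERTURE_MS / SAMPLE_PERIOD_MS    # fraction of the signal actually integrated

print(f"coverage {coverage:.0%}, busy {busy_ms:.1f} ms, idle {idle_ms:.1f} ms per reading")
# -> with these assumed numbers, only ~17% of the analog signal is integrated,
#    the rest falls in the gaps between readings
```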