To verify the "flatness" of a levelled sine signal source, I'm considering the E9304A power sensor (with an E4419B power meter) as the measurement tool. The measurement setup would be:

1. Measure the power level at 1 MHz and take that reading as the reference, e.g. 0.01 dBm.
2. Increase the frequency to 50 MHz, 100 MHz, 200 MHz ... 1 GHz, recording the measured power level at each point.
3. Compare the measured level at each frequency set point against the reference level; the differences give the flatness across the frequency range.

To evaluate the measurement uncertainty contributed by the sensor, I can think of only two factors: one is sensor noise, i.e. the short-term drift of the sensor reading (which could probably be ignored); the other is the frequency response of the sensor. Because the absolute accuracy of the sensor is not a concern in this measurement, what matters is how the sensor reading differs between frequency set points when a constant power is applied. The published specifications don't say much about this, but I guess it is the Calibration Factor of the sensor head that is used to correct the frequency response. Am I correct?

I've reached out to local Keysight technical support, but the outcome wasn't clear on my question; they suggested the answer is in the calibration report. I had the E9304A calibrated at Keysight recently, and the Calibration Factor is among the reported items. But how do I derive the uncertainty from the report for my particular measurement? Or do I just use the Calibration Factor spec in the user manual off the shelf?
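For what it's worth, here is a minimal sketch of how I picture the flatness calculation and the sensor's contribution to its uncertainty. All numbers (readings, frequencies, per-frequency cal-factor uncertainties) are made-up placeholders; the real uncertainty values would come from the calibration report or data sheet. Since flatness is a difference between two corrected readings, I've assumed the cal-factor uncertainty enters at both frequencies and combined them in root-sum-of-squares, which may be conservative if the errors are partly correlated:

```python
import math

# Hypothetical measured power levels (dBm) at each frequency set point.
# The 1 MHz reading serves as the reference. Values are illustrative only.
readings_dbm = {
    1e6:   0.01,   # reference point
    50e6:  0.00,
    100e6: -0.02,
    200e6: -0.05,
    1e9:  -0.12,
}

ref = readings_dbm[1e6]

# Flatness: deviation of each reading from the 1 MHz reference, in dB.
# (Zero at the reference frequency by construction.)
flatness_db = {f: p - ref for f, p in readings_dbm.items()}

# Placeholder per-frequency calibration-factor uncertainties, in dB.
# In practice these would be taken from the cal report (or the spec).
cf_unc_db = {1e6: 0.02, 50e6: 0.03, 100e6: 0.03, 200e6: 0.04, 1e9: 0.05}

# RSS combination of the reference-frequency and test-frequency terms.
flatness_unc_db = {f: math.hypot(cf_unc_db[1e6], cf_unc_db[f])
                   for f in readings_dbm}

for f in sorted(readings_dbm):
    print(f"{f/1e6:7.0f} MHz: flatness {flatness_db[f]:+.3f} dB "
          f"+/- {flatness_unc_db[f]:.3f} dB")
```

This is only how I currently understand the error model; whether the RSS combination (or some correlated treatment) is the right one is part of my question.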