
Input Impedance Calibration

Question asked by markyo on May 1, 2009
Latest reply on May 3, 2009 by markyo
I have a question which may be straightforward, but I would appreciate your comments.
I need to make a relatively broadband transmission measurement (1 MHz to 250 MHz) on a device whose input port impedance is not even close to 50 ohms. It looks like 150 pF up to around 50 MHz, and is a mess thereafter.
The output port isn't a problem; it is a good 50 ohm match.
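To put rough numbers on the problem (my own simplification: an ideal 150 pF input driven from a 50 ohm source, which the post says only holds up to about 50 MHz), the voltage actually appearing at the device input scales as |1 + Gamma|, where Gamma = (Z - Z0)/(Z + Z0):

```python
import math

def input_voltage_ratio(f_hz, c_farads=150e-12, z0=50.0):
    """Return (|Z| in ohms, |1 + Gamma|) for a purely capacitive input.

    |1 + Gamma| is the ratio of the voltage at the device input to the
    voltage an ideal 50 ohm load would see from the same source.
    """
    # Ideal capacitor: Z = -j / (2*pi*f*C)
    z = complex(0.0, -1.0 / (2.0 * math.pi * f_hz * c_farads))
    gamma = (z - z0) / (z + z0)  # reflection coefficient vs 50 ohms
    return abs(z), abs(1.0 + gamma)

for f in (1e6, 10e6, 50e6, 250e6):
    mag_z, v_ratio = input_voltage_ratio(f)
    print(f"{f / 1e6:6.0f} MHz: |Z| = {mag_z:8.1f} ohm, |1+Gamma| = {v_ratio:.3f}")
```

So even before the impedance "gets messy", the input voltage swings by more than a factor of ten across the band, which is why I can't just do a plain response cal.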
To do this transmission measurement, I want to calibrate out the device's input impedance, so that the voltage amplitude and phase at the device input would remain constant if measured during the frequency sweep.
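One post-processing idea I've been toying with (just a sketch; the helper name is my own, and I'd welcome corrections): after a full 2-port calibration, save S11 and S21 at the same frequency points. The node voltage at the device input is proportional to the incident wave times (1 + S11), so dividing the measured S21 by (1 + S11) re-references the transfer function to the actual input voltage rather than to the incident wave:

```python
def renormalize_to_input_voltage(s21, s11):
    """s21, s11: equal-length lists of complex S-parameters,
    one entry per frequency point from the same calibrated sweep."""
    # V_in is proportional to a1 * (1 + S11), so dividing S21 by (1 + S11)
    # gives the response referred to the voltage at the device input.
    return [t / (1.0 + g) for t, g in zip(s21, s11)]

# Toy example: a well-matched point (S11 = 0) is unchanged; a reflective
# point (S11 = -0.5) has its apparent gain raised by 1/|1 + S11| = 2x.
s11 = [0.0 + 0.0j, -0.5 + 0.0j]
s21 = [0.1 + 0.0j, 0.1 + 0.0j]
print(renormalize_to_input_voltage(s21, s11))
```

This only fixes things in post-processing, of course; it doesn't keep the actual drive level at the device input constant during the sweep.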

I have several ideas, but not being an expert I don't want to reinvent the wheel. What is the best setup and calibration procedure to achieve this?
I am currently using an old HP 8753D, but am open to buying new kit to solve this if required.