Originally posted Sept 10, 2014
Is technology repeating itself or just rhyming?
“History does not repeat itself, but it rhymes” is one of the most popular quotes attributed to Mark Twain. Though there is no clear evidence that he ever said this, it certainly feels like one of his. It says so much in a few words, and reflects his fascination with history and the behavior of people and institutions.
Recently, history rhymed for me while making some OFDM demodulation measurements and looking at the spectrum of the error vector signal. It brought to mind the first time I looked beyond simple error vector magnitude (EVM) measurements to the full error vector signal and understood the extra insight it could provide in both spectrum and time-domain forms.
The rhyme, in this case, was that error vector measurements are at heart residual error or distortion measurements, and it took me all the way back to the first distortion measurements I made with a simple analog distortion analyzer. Variations on that method are still used today, and the approach is summarized below.
A simple distortion analyzer uses a notch filter to remove the fundamental of a signal and a power meter to measure the rest. This is a measurement of the signal’s residual components, which can also be analyzed in other ways to better understand the distortion.
The basic distortion analyzer approach uses a power meter and a switchable band-reject or notch filter. First, the full signal is measured to provide a power reference, and then the filter is switched in to remove the fundamental.
The signal that remains is a residual, containing distortion and noise, and can be measured with great sensitivity because it’s so much smaller than the full signal. That’s a big benefit of this technique, and why filters—including lowpass and highpass—are still used to improve the sensitivity and accuracy of signal measurements. Those basic distortion analyzers usually had a post-filter output that could be connected to an oscilloscope to see if the distortion could be further characterized.
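The notch-and-measure technique is easy to sketch in a few lines of code. Here's a minimal simulation, assuming a made-up test signal (a 1 kHz fundamental with a third harmonic 40 dB down) and an idealized notch implemented by zeroing the fundamental's FFT bin; a real analyzer's analog filter is of course less perfect than this.

```python
# Sketch of the classic distortion-analyzer measurement: read total power,
# notch out the fundamental, then read the power of what remains.
# The signal and levels here are illustrative, not from any real instrument.
import numpy as np

fs = 48_000  # sample rate, Hz
f0 = 1_000   # fundamental, Hz
n = fs       # one second of samples, so FFT bin spacing is exactly 1 Hz

t = np.arange(n) / fs
# Test signal: 1 kHz fundamental plus a small 3rd harmonic (-40 dBc)
signal = np.sin(2 * np.pi * f0 * t) + 0.01 * np.sin(2 * np.pi * 3 * f0 * t)

# "Power meter" reading of the full signal (the power reference)
total_power = np.mean(signal ** 2)

# Idealized notch: zero the fundamental's FFT bin
spectrum = np.fft.rfft(signal)
spectrum[f0] = 0
residual = np.fft.irfft(spectrum, n)

# "Power meter" reading of the residual (distortion + noise)
residual_power = np.mean(residual ** 2)
residual_db = 10 * np.log10(residual_power / total_power)
print(f"residual is {residual_db:.1f} dB below the full signal")  # ≈ -40.0 dB
```

Because the fundamental carries nearly all the power, the residual can be measured on a far more sensitive scale than the full signal, which is exactly the benefit described above.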
To complete the rhyme, today’s digital demodulation measurements and quality metrics such as EVM or modulation error ratio (MER) are also residual measurements. Signal analyzers and VSAs first demodulate the incoming signal to recover the physical-layer data. They then use this data and fast math to generate a perfect version of the input signal. The perfect or reference signal is subtracted from the input signal to yield a residual, also called the error vector. This subtraction does the job that the notch filter did previously.
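The demodulate-reconstruct-subtract sequence can also be sketched briefly. Here's a simplified QPSK version, with impairment levels chosen arbitrarily for illustration; a real VSA also handles synchronization, equalization, and filtering before this step.

```python
# Minimal sketch of the residual technique for QPSK: demodulate to recover
# the data, regenerate a perfect reference, subtract to get the error
# vector, and summarize it as EVM. Noise level is made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_sym = 4096

# Transmitted QPSK symbols (unit-power constellation points)
bits = rng.integers(0, 2, size=(n_sym, 2))
ideal = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# Received signal: ideal symbols plus a small amount of complex noise
received = ideal + 0.02 * (rng.standard_normal(n_sym)
                           + 1j * rng.standard_normal(n_sym)) / np.sqrt(2)

# Demodulate: hard decisions recover the physical-layer data...
reference = (np.sign(received.real) + 1j * np.sign(received.imag)) / np.sqrt(2)

# ...the "fast math" regenerates the perfect reference, and subtraction
# yields the residual: the error vector
error_vector = received - reference

# EVM: RMS error relative to RMS reference power
evm = np.sqrt(np.mean(np.abs(error_vector) ** 2)
              / np.mean(np.abs(reference) ** 2))
print(f"EVM = {100 * evm:.2f} %")  # ≈ 2 %
```

The subtraction plays the role of the notch filter: it strips away the part of the signal you already understand, leaving only the errors.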
The residual can be summarized in simple terms such as EVM or MER. But if you want to understand the nature or cause of a problem and not just its magnitude, you can look at error vector time, spectrum, phase, etc. Here’s an example of measurements on a simple QPSK signal containing a spurious signal with power 36 dB lower.
A QPSK signal in blue contains a spurious signal 36 dB lower. The green trace is error vector spectrum, revealing the spur. A close look at a constellation point (upper left) shows repeating equal-amplitude errors that indicate that the spur is harmonically related to the modulation frequency.
Demodulation and subtraction remove the desirable part of the signal, providing more sensitivity and a tighter focus on distortion or interference. Because all these operations and displays are performed within the signal analyzer application or VSA, you need just one tool to help you understand both the magnitude and cause of problems.
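A short sketch shows why the spur pops out of the error vector spectrum. Here the spur frequency and the use of known symbols in place of full demodulation are simplifying assumptions; at -36 dBc the spur is far too small to cause symbol decision errors, so the residual is essentially the spur itself.

```python
# Sketch of the error vector spectrum idea: a spur 36 dB below a QPSK
# signal is buried in the signal spectrum, but dominates the spectrum
# of the residual. Frequencies and levels are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 8192  # symbols, one complex sample per symbol for simplicity

# QPSK symbols plus a spur 36 dB below the signal power
qpsk = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
spur_amp = 10 ** (-36 / 20)  # -36 dBc
spur = spur_amp * np.exp(2j * np.pi * 0.123 * np.arange(n))
received = qpsk + spur

# Demodulate (hard decisions) and subtract the regenerated reference;
# the spur is too small to cause decision errors, so the residual is
# dominated by the spur rather than the modulation
reference = (np.sign(received.real) + 1j * np.sign(received.imag)) / np.sqrt(2)
error_vector = received - reference

# The spectrum of the error vector reveals the spur as a clear peak
spec = np.abs(np.fft.fft(error_vector)) / n
peak_bin = int(np.argmax(spec))
print(f"spur found at normalized frequency {peak_bin / n:.3f}")  # ≈ 0.123
```

In the full-signal spectrum the same peak would sit 36 dB below a broadband modulated signal; removing the modulation first is what makes it obvious.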
At this point you may also be thinking that demodulation and subtraction could be a way to recover one signal deliberately hidden inside another. They can! I’ve experimented with that very interesting technique, and will explain more in a future post.
To make these explanations clearer, I've focused here on single-carrier modulation. These approaches to residual analysis work well for OFDM signals too, and you can see examples in my previous posts The Right View Makes an Obscure Problem Obvious and A Different View Makes a Different Problem Obvious.