
Phase Noise and OFDM: Adding the Right Amount in the Right Place

Blog Post created by benz on Sep 20, 2016

Originally posted Dec 10, 2013

 

“No noise” is not always the right goal when you’re generating test signals

Minimizing noise is so often essential to better measurements that it’s easy to assume that maximum SNR is always the best test condition. In this post and one or two to come, I’ll discuss some examples and offer practical advice for situations in which some amount of noise—the precisely correct amount, of course!—will make your design or test task easier and your results more reliable.

As readers of this blog know, I’m generally not a fan of noise. It can represent disorder, entropy, poor performance, some degree of engineering failure, or maybe just bad luck. Excess noise instinctively feels like an affront to RF engineers and most of us would rather see perfectly pure sine waves or a well-constructed digitally-modulated signal, no matter how complex. Of course, many complex digitally-modulated signals look like band-limited noise, but that’s a sign of success.

Let’s focus on desirable noise in the form of controlled phase noise in test signals for orthogonal frequency-division multiplexed (OFDM) systems. OFDM is sensitive to phase noise because the noise causes the closely spaced subcarriers to interfere with each other, degrading the orthogonality that is essential to the proper functioning of the system.

The obvious way to create OFDM test signals is to define them with near-zero phase noise; however, in the real world of user equipment this is neither realistic nor necessary. It isn’t realistic because creating signals with extremely low phase noise is expensive. It isn’t necessary because OFDM demodulators continuously track known “pilot” subcarriers and symbols in transmitted signals, and this tracking can compensate for some amount of phase noise. How much phase noise is OK? At what carrier offsets? Ah, that’s the signal generation test challenge, and the engineer’s chance to shine. The goal is to understand the maximum amount of allowable phase noise and to optimize design performance—and cost—by generating test signals that operate around this limit.

Thus, there are two major elements in testing to optimize phase noise performance in OFDM systems. First, generating signals with the appropriate amount and distribution of phase noise. Second, understanding how this phase noise will affect modulation quality measures such as error vector magnitude (EVM).

For the generation of OFDM signals with precisely impaired phase noise performance, the Agilent N5182B MXG X-Series signal generators use real-time baseband processing to provide a “phase noise injection” capability. The user need only specify a phase noise pedestal level and the frequency break points for the beginning and end of the pedestal. The figure below shows the relevant signal generator configuration screen with its target curve and the corresponding measurement by an Agilent X-Series signal analyzer with a phase noise measurement application.

The configuration screen for specifying added phase noise in the Agilent N5182B MXG signal generator is shown at left, including the anticipated phase noise curve. The corresponding measurement of actual phase noise by a signal analyzer measurement application is shown at right.
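If you want to experiment with the pedestal shape away from the instrument, a rough numerical model is handy. Below is a minimal Python sketch of a pedestal-shaped SSB phase noise curve; the −95 dBc/Hz level, the 10 kHz and 10 MHz break points, and the simple 20 dB/decade close-in roll-off are placeholder assumptions for illustration, not values read from the figure.

```python
import numpy as np

def pedestal_ssb_phase_noise(f_offset_hz, pedestal_dbc_hz, f_start_hz, f_stop_hz):
    """Idealized SSB phase noise curve L(f) in dBc/Hz for a flat pedestal.

    Flat at pedestal_dbc_hz between the two break frequencies, rising at
    20 dB/decade toward the carrier below f_start_hz, and treated as
    negligible above f_stop_hz. Illustrative model only.
    """
    f = np.asarray(f_offset_hz, dtype=float)
    L = np.full_like(f, -np.inf)                     # negligible outside the pedestal
    pedestal = (f >= f_start_hz) & (f <= f_stop_hz)
    L[pedestal] = pedestal_dbc_hz
    close_in = (f > 0) & (f < f_start_hz)
    L[close_in] = pedestal_dbc_hz + 20.0 * np.log10(f_start_hz / f[close_in])
    return L

# Placeholder example: a -95 dBc/Hz pedestal from 10 kHz to 10 MHz offset
offsets_hz = np.logspace(3, 7, 401)                  # 1 kHz to 10 MHz
curve_dbc_hz = pedestal_ssb_phase_noise(offsets_hz, -95.0, 10e3, 10e6)
```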


As shown above, the RF signal generator neatly solves the problem of adding realistic phase noise at desired carrier offsets. The other major element of testing and system optimization is to verify the effect of this phase noise on modulation quality as the receiver sees it.

A common linear measurement of signal impairment is EVM, and for the purpose of optimizing phase noise it’s useful to assume the signal impairment is dominated by phase noise. Then it’s straightforward to use the rule-of-thumb that the pilot tracking effectively “tracks out” phase noise at offsets up to about 10 percent of the OFDM subcarrier spacing. That works out to about 31 kHz for the 312.5 kHz subcarrier spacing in this WLAN example.

EVM can then be estimated by integrating the single-sideband (SSB) phase noise power at offsets greater than 10 percent of the subcarrier spacing and less than the channel bandwidth, and adding 3 dB to convert SSB power to double sideband (DSB) or total power. In the display above, the power integration is performed by band-power markers in the measurement application and the result is -26.35 dBc when 3 dB is added to the -29.35 dBc marker reading.
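If you’d rather do the bookkeeping offline than with band-power markers, the same estimate can be computed from a measured or modeled phase noise trace. Here’s a minimal sketch, assuming the trace is available as offset/level pairs in dBc/Hz; the flat −95 dBc/Hz trace and the 20 MHz channel bandwidth below are illustrative stand-ins (only the 312.5 kHz subcarrier spacing comes from the WLAN example), so the number it prints is not the one from the display above.

```python
import numpy as np

def estimate_evm_db(offsets_hz, ssb_dbc_hz, subcarrier_spacing_hz, channel_bw_hz):
    """Estimate EVM (dB) from an SSB phase noise trace.

    Integrates the SSB phase noise power from 10% of the subcarrier spacing
    (the approximate limit of pilot tracking) up to the channel bandwidth,
    then adds 3 dB to convert SSB power to total (DSB) power.
    """
    f = np.asarray(offsets_hz, dtype=float)
    L_lin = 10.0 ** (np.asarray(ssb_dbc_hz, dtype=float) / 10.0)  # dBc/Hz -> linear ratio
    band = (f >= 0.10 * subcarrier_spacing_hz) & (f <= channel_bw_hz)
    ssb_power = np.trapz(L_lin[band], f[band])                    # integrated SSB power
    return 10.0 * np.log10(ssb_power) + 3.0                       # SSB -> DSB

# Placeholder trace: flat -95 dBc/Hz across the band of interest
offsets = np.linspace(10e3, 20e6, 2001)
trace = np.full_like(offsets, -95.0)
print(estimate_evm_db(offsets, trace,
                      subcarrier_spacing_hz=312.5e3,   # WLAN example from the post
                      channel_bw_hz=20e6))             # assumed 20 MHz channel
```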

The “10 percent of subcarrier spacing” rule-of-thumb is thought to be slightly conservative, and later measurements with a vector signal analyzer bore this out: measured EVM was a fraction of a dB better than predicted.

The loop-closing process of generating impaired signals and verifying their impact on receivers is a powerful tool for optimizing OFDM systems and other complex designs. More detail on this approach to understanding phase noise and OFDM is available in an article I wrote last year entitled “Optimize OFDM Via Phase-Noise Injection.” You’ll find it in the October 2012 issue of Microwaves & RF magazine.

Beware, however, that the article contains an error in the formula for calculating EVM from integrated phase noise. While the figures in the article’s examples are correct, my efforts to clarify the formula and its description went awry. I’m sorry about that(!) and here’s a better formula:

EVM (dB) ≈ [SSB phase noise integrated from 10% of the subcarrier spacing to the channel bandwidth, in dBc] + 3 dB
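As a quick sanity check using the band-power marker reading from the display above, the formula reduces to simple arithmetic; the conversion to percent (20 log scale) is something I’ve added here for convenience, not part of the original article.

```python
integrated_ssb_dbc = -29.35            # band-power marker reading from the display above
evm_db = integrated_ssb_dbc + 3.0      # add 3 dB: SSB -> total (DSB) phase noise power
evm_pct = 100.0 * 10.0 ** (evm_db / 20.0)
print(f"EVM = {evm_db:.2f} dB, about {evm_pct:.1f} %")   # -26.35 dB, roughly 4.8 %
```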
