
  While some are timeless, others evolve

This post will be brief, partly a consequence of the wildfires that have affected the Santa Rosa headquarters of Keysight and modified all our schedules, at least a little. Without getting too metaphysical, the fires and their aftermath are a powerful reminder that things are always changing.

This is certainly true in technology and test equipment. The need for better measurements never abates, and I’d like to say a thing or two about our cooperative efforts to keep pushing forward in applications (yours) and measurements (ours).

I’ve been reminded of the changing nature of measurement fundamentals in the context of my previous post on the innovations of the vector signal analyzer and Keysight’s RF test engineering webcast series on measurement fundamentals.

While some things are timeless—such as resolution bandwidth and power addition—others begin as advanced concepts and gradually become mainstream and even fundamental. Examples include ACPR/ACLR and error vector magnitude (EVM). Many of us can remember the first time someone explained channel power integration or vector error quantities to us, and yet eventually these measurement concepts are taken for granted in the context of more complex figures of merit. How about cumulative ACLR for non-contiguous aggregated carriers?
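
To make “channel power integration” concrete, here is a minimal sketch in Python using NumPy and SciPy. It is not any standard’s measurement definition: the sample rate, occupied bandwidth, and adjacent-channel offset are assumptions chosen only for illustration, and band-limited noise stands in for a modulated carrier. Channel power is simply the power spectral density integrated over the channel, and ACPR/ACLR is the adjacent-channel power relative to the main channel.

```python
# Minimal sketch of channel power integration and ACPR, not a standard-compliant
# measurement. Signal parameters and channel offsets below are illustrative only.
import numpy as np
from scipy.signal import welch

fs = 10e6                                   # sample rate, Hz (assumed)
n = 2**18
rng = np.random.default_rng(0)

# Toy "transmitter": complex band-limited noise standing in for a modulated carrier.
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
X = np.fft.fft(x)
f_bins = np.fft.fftfreq(n, 1 / fs)
X[np.abs(f_bins) > 0.5e6] = 0               # crude band-limiting to ~1 MHz occupied BW
x = np.fft.ifft(X)

# Power spectral density estimate (power per Hz), two-sided for the complex signal.
freqs, psd = welch(x, fs=fs, nperseg=4096, return_onesided=False)
freqs, psd = np.fft.fftshift(freqs), np.fft.fftshift(psd)

def band_power(freqs, psd, f_center, bw):
    """Channel power by integrating the PSD over the band: P = sum(PSD) * df."""
    df = freqs[1] - freqs[0]
    mask = np.abs(freqs - f_center) <= bw / 2
    return np.sum(psd[mask]) * df

ch_bw = 1e6                                 # channel bandwidth (assumed)
p_main = band_power(freqs, psd, 0.0, ch_bw)
p_adj = band_power(freqs, psd, 1.5e6, ch_bw)  # adjacent channel at +1.5 MHz (assumed)

acpr_db = 10 * np.log10(p_adj / p_main)     # adjacent-channel power ratio, in dBc
print(f"ACPR: {acpr_db:.1f} dBc")
```

EVM follows a similar ratio-of-quantities pattern: the rms of the error vector (measured minus reference constellation point) divided by the rms of the reference, usually expressed in percent or dB.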

A similar phenomenon is evident in the core signal analyzer architecture described in the previous post. Vector signal analyzers began as a new type of analyzer, with a different architecture to preserve calibrated vector signal information. Eventually, this advanced architecture became common in lab-grade signal analyzers, and vector signal analysis transitioned to be an embedded software application for the analyzer platform.

The old/new duality is evident in the RF test webcast series mentioned above. The next installment deals with fundamentals.

[Image: portion of a web page describing the signal analysis fundamentals webcast, to be presented by Dave Engelder]

Dave Engelder’s webcast will cover fundamentals, informed by his considerable involvement in new-generation analyzer architectures.

It’s a little ironic that Dave, the presenter for the fundamentals webcast, has spent a great deal of his time in recent years on product definition and planning for the newest generation of signal analyzers and their advanced capabilities.

Some fundamentals are timeless and unchanging, and others eventually give way to superior techniques and technologies. I suspect that dealing with an evolving mix of the timeless and the temporary is our fate as long as we work in RF.

  A new RF tool leveraged two evolving technologies

These days, vector signal analyzers (VSAs) are in broad use, especially in wireless, aerospace, and defense applications. They’re essential for designing the complex modulated and time-varying signals that have become ubiquitous.

However, VSAs haven’t been around nearly as long as spectrum or network analyzers, and I can remember the process—including an informal contest—that yielded the name “vector signal analyzer.” This month marks the 25th anniversary of the first VSA, and I’d like to take a brief look back.

Lots of technical forces were coming together as the 1980s came to a close, both in terms of signals and the equipment to test them. Mobile phones were entering a period of what would become explosive growth, and the transition from analog (1G) to digital modulation (2G) was the way to handle the increase in traffic.

In test equipment, signal processing and digitizers were improving rapidly, and some low-frequency signal analyzers had switched from analog filters to digital ones in their intermediate frequency (IF) stages. Technology and demand were both in place.

These forces converged in the single test and measurement division that had a deep background in both swept and FFT-based products. HP’s Lake Stevens Instrument Division had already produced the first low-frequency swept network and spectrum analyzers (e.g., up to 200 MHz) with digital IF sections. That put the division in a unique position to combine the classic superheterodyne architecture with high performance ADCs and DSP.

Resolution bandwidth filters could be all digital, with better speed, accuracy, and selectivity. Virtually any resolution bandwidth could be produced, from sub-hertz to several megahertz. Perhaps most significant, the entire signal chain could preserve signal phase and therefore vector content.
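
As a rough illustration of that flexibility, here is a short Python sketch. It is my own simplification, not a description of any particular analyzer: it uses the familiar FFT-analyzer relationship that resolution bandwidth is set by the window shape and the transform length, takes the Hann window’s equivalent noise bandwidth of 1.5 bins as the RBW, and assumes an arbitrary sample rate.

```python
# Minimal sketch: in an FFT-based analyzer, resolution bandwidth follows from the
# window and the transform length, so nearly any RBW can be realized by choosing N.
# The Hann ENBW of 1.5 bins is a standard figure; treating ENBW as the RBW and the
# sample rate below are simplifying assumptions.

fs = 10e6                        # sample rate, Hz (assumed)
enbw_bins = 1.5                  # Hann window equivalent noise bandwidth, in FFT bins

def transform_length_for_rbw(target_rbw_hz):
    """Transform length N giving roughly the requested RBW at this sample rate."""
    return int(round(enbw_bins * fs / target_rbw_hz))

for rbw in (3e6, 1e3, 0.5):      # several megahertz down to sub-hertz
    n = transform_length_for_rbw(rbw)
    realized = enbw_bins * fs / n
    print(f"RBW {rbw:>9.1f} Hz -> N = {n:>8d} samples "
          f"({n / fs:.3g} s of data), realized RBW {realized:.3g} Hz")
```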

Processing a signal’s complete vector information was important for obvious and less-obvious reasons. Vector processing allows for accurate, selective analog demodulation, fully separating amplitude from phase or frequency modulation. It also provides complete pulse analysis, and the potential for digital demodulation.
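
A small example shows why preserved phase matters. This is only a sketch of the underlying idea, not the VSA’s actual processing chain, and the sample rate, modulation rates, and deviation are made-up values: once the complex envelope (I/Q) is available, amplitude, phase, and frequency demodulation all come from the same samples.

```python
# Minimal sketch: with the complex envelope (I/Q) preserved, amplitude, phase, and
# frequency demodulation all come from the same data. Signal parameters are made up.
import numpy as np

fs = 1e6                                     # sample rate, Hz (assumed)
t = np.arange(0, 10e-3, 1 / fs)              # 10 ms of samples

# Test signal with simultaneous AM and FM, expressed as a baseband complex envelope.
am = 1.0 + 0.3 * np.cos(2 * np.pi * 1e3 * t)      # 1 kHz AM, 30% depth
inst_freq = 5e3 * np.sin(2 * np.pi * 2e3 * t)     # 2 kHz FM tone, 5 kHz peak deviation
phase = 2 * np.pi * np.cumsum(inst_freq) / fs     # integrate frequency to phase
iq = am * np.exp(1j * phase)                      # I + jQ samples

# Because phase is preserved, the modulations separate cleanly:
am_demod = np.abs(iq)                             # AM: envelope
pm_demod = np.unwrap(np.angle(iq))                # PM: instantaneous phase (rad)
fm_demod = np.diff(pm_demod) * fs / (2 * np.pi)   # FM: d(phase)/dt, in Hz

print(f"Recovered AM depth: ~{(am_demod.max() - am_demod.min()) / 2:.2f}")
print(f"Recovered FM peak deviation: ~{np.abs(fm_demod).max():.0f} Hz")
```

Digital demodulation builds on the same preserved vector information, comparing the measured complex samples against an ideal reference to form error quantities such as EVM.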

A key decision in this area, driven by a need for accurate pulse analysis, was to perform continuous time-domain vector calibration across the analyzer signal chain. This improvement on frequency-domain calibration would later prove essential to precise digital demodulation of all kinds in the VSA.

Over a span of several years, all of this evolving and improving technology was subjected to extensive discussions and trials with potential customers. Their feedback was crucial to the definition and implementation of the first VSA and, in many ways, they taught us what a VSA should really be. Many changes and refinements were made along the way, and in October 1992 we introduced the first RF vector signal analyzer, the HP 89440A.

[Image: catalog photo of the 89440A RF vector signal analyzer, part of a family including the 89410A and, eventually, the 89600 vector signal analyzer]

This image from the HP test and measurement catalog shows the first RF vector signal analyzer, the 89441A. The bottom section contains the RF receiver and companion source. The two-channel top section was also available as the (baseband) 89410A VSA.

Vector Signal Analyzer—I suppose the name seems obvious in retrospect, but it wasn’t so clear at the time. We were aware that it was a new type of analyzer, one we expected would be an enduring category, and one we wanted to get right. I can’t recall the other candidate names, but remember voting for VSA. After all, it was a signal analyzer—not just a spectrum analyzer—that provided vector results.

Wireless and aerospace/defense test engineers quickly grasped the possibilities. Shortly after introduction, we took the analyzer to its first trade show: engineers were lining up for demos. The signal views and insights provided by the frequency+time+modulation combination were compelling, and we were able to show waterfall and spectrogram displays, along with complete signal capture and playback.

Within the year we added digital demodulation, and the wireless revolution picked up steam. VSAs helped enable the new transmission schemes, from CDMA to high-order QAM, multi-carrier signals, OFDM, and MIMO. Software enhancements allowed the VSA to track the emerging technologies and standards, giving engineers a reliable test solution early in the design process.

Though the name and measurements would continue, the VSA as a separate analyzer type gradually yielded to newer “signal analyzers” with digital vector processing. These analyzers started with swept scalar spectrum analysis, and VSA capability became an option for the base hardware.

Now available both as standalone software and embedded in signal analyzers, Keysight’s 89600 VSA software continues the tradition of supporting the leading edge of wireless technology. The latest example: a new VSA software release supports pre-5G modulation analysis, and will evolve along with the standard.

It’s been a busy quarter century for all of us, and I expect VSAs will be just as useful for the next one.

  Even great minds fail sometimes, but we can revise our mental models

Though this blog is created by a company that makes hardware and software, the core of our work—and yours—is problem solving and optimization. In the past, I’ve said that success in this work demands creativity, judgement, and intuition. I’ve also written (several times) about my fascination with failures of intuition and ways we might understand and correct such failures.

One way to look at intuition in engineering is to see it as the result of mental models of a phenomenon or situation. Failures, then, can be corrected by finding the errors in these mental models.

We learn from those who go before us, and a great example of a failure (and its correction) by a famous engineer/scientist prompted me to correct one of my own mental models. It was an error many of us share, and though I realized it was wrong a long time ago, it’s only now that I can clearly explain the defect.

The trailblazer is the famous rocket pioneer Robert Goddard. He was also a pioneer in our own field: more than a century ago he invented and patented one of the first practical vacuum tubes for amplifying signals. If he hadn’t focused most of his energy on rockets, Goddard would probably be as famous in electronics as his contemporary Lee de Forest (but that’s a story for another day).

Goddard knew that stability would be a problem for his rockets, and he had no directional control system for them. To compensate, he used a cumbersome design that placed the combustion chamber and exhaust of his first liquid-fuel rocket at the top, with the heavy fuel and oxidizer tanks at the bottom. Having a rocket engine pointed at its own fuel is clearly inviting problems, but Goddard thought it was worth it.

He was mistaken, falling for what would later be described as the pendulum rocket fallacy. He was a brilliant engineer and experimenter, however, and corrected this erroneous mental model in rockets built just a couple of months later.

My own error in this area involved helicopters and their stability. Decades ago, I was surprised to learn that they are inherently very unstable. A friend—an engineer who flew model ’copters—gave a simplified explanation: The main mass of the helicopter may hang from the rotor disk, but when that disk tilts, its thrust vector tilts in the same direction. The lateral component of that vector causes more tilt and, unfortunately, also reduces the lift component. The process quickly runs away, and the helicopter seems to want to dive to the ground.

It’s similar to an inverted pendulum, and just the opposite of what my intuition would have predicted. It explains why pilots of non-stabilized helicopters describe the experience as balancing on the top of a pole: they must sense small accelerations and correct before motions build up.

While the explanation corrected and improved my mental model, my intuition was woefully unable to handle the claim that the aircraft below was relatively stable and easy to fly.

[Image: Hiller VZ-1 Pawnee flying platform in flight. The platform is similar to a helicopter, but the rotors take the form of a ducted fan at the bottom of the craft, and stability is generally better than a helicopter’s.]

The Hiller “Pawnee” flying platform of the 1950s used ducted fans beneath the engine and payload/pilot. Despite its appearance, it is easier to fly than a helicopter. (public domain image via Wikipedia)

This aircraft certainly does look like an inverted pendulum, though that perception is actually another fallacious mental model. 

One explanation comes from the powerful engineering analysis technique of imagining small increments of time and motion. If the flying platform tilts, the thrust vector of the fans tilts in the same direction, just as with the helicopter. However, in this instance the thrust is applied below the center of mass rather than above. The tendency to cause a tilt is applied in the opposite direction and is therefore not self-reinforcing.

I don’t believe the flying platform arrangement is inherently stable, but it is much less unstable than a helicopter. I once flew a flying platform simulator in a museum, and it was indeed straightforward to control.

So, let us acknowledge the power of our engineering intuition and salute our mental models as immensely useful guides. But let’s also remain vigilant in remembering that even the best engineers sometimes get things wrong, and it’s essential to be willing to make corrections.