Originally posted Feb 19, 2016
Borrowing stuff from phones and tablets that we helped make possible
Gary Glitter asked it first, but people my age probably remember Joan Jett’s version: “Do you wanna touch?” In the last few years, test and measurement companies have decided the answer is yes.
In combination with wireless networks, touchscreens have proven themselves in the realm of smartphones and tablets. I'll admit that I was initially skeptical about touchscreens on instruments, however, based on my experience with laptops. For most regular laptop use, I couldn't see the benefit of suspending my heavy arm in front of me to make imprecise selections of text or numbers, while my fingers partially blocked my view.
I guess it’s a matter of choosing the right application, and I think Apple should get a lot of credit for recognizing that, then creating and refining the touch UI. They’ve done this so thoroughly that it’s hard for me to see which gestures and other touch elements are inspired innovation and which are simply taking advantage of innate human wiring.
It seems clear that touchscreens for certain UI tasks are a win, so what about RF instruments?
The hardkey/softkey interface in analyzers has proven to be remarkably durable, but it’s about 35 years old. That’s ancient compared to everything happening behind the front panel. Sure, it was a great step up from analog switches, knobs and buttons as instruments evolved to digital control of measurement parameters in the 1970s. The hardkey/softkey interface leveraged the display paradigm of terminals and other computer displays, as analyzers gained computing power and signals—and corresponding measurements—became much more complex.
Instruments have continued to borrow UI schemes from the world of computers. Looking back about 15 years, I can still remember the first PC-based vector signal analyzer and the novelty of changing center frequency, span and display scaling by clicking and dragging a window with the mouse. On the other hand, clickable hot spots and adjusting parameters with the scroll wheel felt completely natural from the start. Just point to what you want to change and tweak it.
Touchscreens have been used on some RF signal analyzers in recent years, though the early ones were often the less-sensitive—and very slightly blurry—resistive types, and all the ones I'm familiar with were single-touch. Borrowing liberally from tablets and phones, Keysight has recently introduced a new line of X-Series signal analyzers with multi-touch screens that are nearly 70% larger than their touch-less predecessors, measured diagonally, giving nearly three times the screen area.
The new X-Series signal analyzers include much larger displays and a multi-touch UI. On-screen buttons and hot spots are sized for fingertips, providing direct access to measurement settings and bringing measurements within two touches. Two-finger gestures such as pinching or spreading can change display scaling or parameters such as frequency center and span.
Touchscreens are cumbersome unless the rest of the UI does its part, with controls sized for a fingertip. In the X-Series, the larger display was the first step. Some tasks are still easier to accomplish with hardkeys and a knob, and perhaps that will never change.
User menus have previously been available in signal analyzers, though they weren't used very often, perhaps because they weren't easy to set up. Today, the ease of creating and accessing them with touch in the X-Series analyzers may change that. It's a feature worth exploring if you work with the same group of settings over and over again.
I’ve written a lot before about making better measurements. A multi-touch UI, if done right, should be a way to instead make measurements better. Of course, whether they’re better for you is a matter of preference and the needs of your application. If you get the chance, give these new analyzers a try.