
I hate trade shows. But this element of my role is both a curse and a blessing. Whether you call it a symposium or a circus, a convention or a carnival, events such as Globecom, EDI CON, European Microwave Week (EuMW), and Mobile World Congress mean long flights, jetlag, sore feet, and the requisite barrage of wireless hype. But they are also an important and engaging part of staying in touch with the communications industry and the fascinating personalities therein. In the past few weeks I have added two of these events to my diary: Microwave Journal’s EDI CON in Boston, and the Next Generation Mobile Networks (NGMN) Industry Conference and Exhibition in Frankfurt. With a focus on 5G wireless, here are a few observations and comments.

 

At this point, we really do know what 5G is.

Most 5G presentations still start with “nobody knows what 5G will really be,” followed by the ubiquitous “the vision for 5G” summary. Consecutive speakers (me too) cannot resist the urge to show and describe that “5G Vision Slide.” There is beauty in consistency. There is also boredom. I am sure there are people in the world who have not seen such presentations, but by now that group is confined to dairy farmers and rat-poison chemists.

 

We are gaining clarity on new “vertical businesses” enabled through 5G applications.

My reference to gaming in the now-famous Pokémon GO blog post was reinforced during the NGMN event. High data rates, ubiquitous coverage, and low latency will enable opportunities for gaming and other emerging entertainment industries (and likely some that are not quite so innocuous).

 

The arguments for the automotive industry to fully embrace wireless communications also seem more and more compelling. While I still believe that a relatively conservative and highly regulated industry will take its time steering in this direction, the first clear lanes ahead will be navigation aids and mobile entertainment.

 

Open source is changing the game.

The NGMN event featured compelling sessions that examined open-source software: one in the context of new business models, another a rather heated exchange around different approaches to intellectual property. The business models evolving as open source infiltrates network virtualization will drive significant change in the industry. Giving away the code your software gurus struggle to generate while guzzling gallons of Mountain Dew and Rockstar may once have seemed anathema. In some circles, it now appears to be a requirement.

 

The industry is dead serious about mmWave.

EDI CON and NGMN featured plenty of discussions and exhibits regarding 5G mmWave in mobile communications. “Gee Roger, what about the other 25 5G shows so far this year that also had plenty of mmWave?” OK, I admit this is nothing new, but the innovation featured at EDI CON (and immediately thereafter at EuMW) was underscored by the MNO-focused NGMN event, in which AT&T, SK Telecom, KT, and, of course, Verizon highlighted their mmWave trials and plans.

 

My earliest posts stated that I do not see mobile multiple-access mmWave becoming commercial before 2022 or so. While I still believe this to be the case, these MNOs, the single most important entities in determining whether a new air-interface technology will be commercialized, are “all in.” And when they succeed, others will rapidly follow.

 

Wrapping up and looking ahead

One final comment: panel discussions are most useful when panelists disagree. Now, I am not advocating the circus of the recent U.S. Presidential debates, but folks (especially you moderators), we do not learn much when everyone smiles and nods. More interesting to me have been a recent academic-versus-commercial showdown on massive MIMO and a dustup over the respective merits of open-source and royalty-based business models.

 

Here’s hoping there will be more such lively discussion at IWPC’s pair of meetings in November—featuring automotive wireless and (ahem!) mmWave in 5G—and then I’m off to Globecom. I look forward to providing some serious updates regarding our favorite 5G themes, and likely some flippant remarks about still more “5G Vision” and “5G KPI” slides as well as a few more “Kumbaya” panel discussions.

 

Industry gatherings: Love them? Dread them? Why?

The ability to accurately measure and quantify a digital design is essential to actually knowing what’s going on. A fellow named William Thomson, better known as Lord Kelvin, captured this concept in one of my favorite quotes:

 

When you can measure what you are speaking about, and express it in numbers, you know something about it.

 

This was simple back in the good old days. To measure a digital waveform, we would just connect an oscilloscope to the right node and take a look at the waveform. Oh, and we’d be sure the scope had enough bandwidth and the probe wasn’t loading the circuit or introducing distortion. We rarely, if ever, compared the results to a simulation. Mostly, we just checked to make sure the waveform looked “about right.”

 

Changing tactics in design and test

Today, the world’s insatiable demand for bandwidth continues to drive the need for ever-faster high-speed digital interfaces. As designers try to push more bits through the channel, they’re pushing the limits of what’s possible using the latest equalization and signaling techniques—decision feedback equalization (DFE), continuous-time linear equalization (CTLE), feed-forward equalization (FFE), PAM-4 (four-level logic), and more.
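To make those techniques concrete, here is a minimal, illustrative Python sketch (my own, not drawn from any standard or instrument) of two of them: Gray-coded PAM-4 symbol mapping and a 3-tap FFE. The tap weights are placeholders chosen for illustration, not a tuned channel design.

```python
# Illustrative sketch: Gray-coded PAM-4 mapping and a 3-tap FFE.
# Tap weights are placeholders, not a real design.
import numpy as np

def pam4_encode(bits: np.ndarray) -> np.ndarray:
    """Map bit pairs to the four PAM-4 levels {-3, -1, +1, +3} (Gray-coded)."""
    pairs = bits.reshape(-1, 2)
    levels = np.array([-3, -1, 1, 3])
    idx = pairs[:, 0] * 2 + (pairs[:, 0] ^ pairs[:, 1])  # 00,01,11,10 -> 0..3
    return levels[idx]

def ffe(symbols: np.ndarray, taps=(-0.1, 1.0, -0.2)) -> np.ndarray:
    """Pre-emphasize with a simple FIR: (pre-cursor, main, post-cursor) taps."""
    return np.convolve(symbols, taps, mode="same")

bits = np.random.default_rng(0).integers(0, 2, size=32)
tx = ffe(pam4_encode(bits))  # pre-equalized symbol stream
```

At two bits per symbol, this is why 56 Gb/s PAM-4 runs at a 28 GBd symbol rate, half the rate an NRZ link would need.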

 

When characterizing the results, test equipment must often emulate those same techniques. For example, when physical transmitters and receivers are not yet available, an instrument has to mimic their respective behaviors at the input or output of the device under test (DUT). Even when the transmitter or receiver is available, it’s likely to be embedded on a chip. That makes it difficult to probe and measure—and, once again, the instrument must emulate either or both devices.

 

Addressing the problem: a real-world example

Creating accurate, realistic models is an iterative process: to make the models increasingly accurate, the latest measured results must be fed back into the simulation.

 

Although this process has many challenges, possible solutions are spelled out in a recent DesignCon paper on measuring PAM-4 signals at 56 Gb/s: PAM-4 Simulation to Measurement Validation with Commercially Available Software and Hardware. The DUT was a 3 m Quad Small Form-factor Pluggable Plus (QSFP+) cable, driven by an arbitrary waveform generator (AWG) and measured using a high-bandwidth sampling oscilloscope (Figure 1).

 

Figure 1. Measurement of the DUT resides within a larger process that also includes simulation.

 

The channel configuration was first simulated in software using IBIS-AMI models for the transmitter and receiver. In this case, the transmitter was not available, so the designer used an AWG to replicate in hardware the same transmitter waveform the simulator used. The simulator-provided waveform also included the FFE correction needed to open the eye at the receiver for clock and data recovery (CDR) and data acquisition. This is a useful general tactic: during early-stage development, an AWG can emulate an absent transmitter using an industry-standard model.
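As a rough sketch of what replicating the transmitter in hardware involves at the waveform level, the code below resamples a pre-equalized symbol stream onto an AWG time grid. The 28 GBd symbol rate follows from 56 Gb/s PAM-4; the 92 GSa/s AWG rate and the zero-order-hold interpolation are assumptions for illustration, not the paper’s actual setup.

```python
# Assumed-parameter sketch: build an AWG sample record from PAM-4 symbols.
import numpy as np

SYMBOL_RATE = 28e9   # 28 GBd -> 56 Gb/s at 2 bits/symbol
AWG_RATE = 92e9      # assumed AWG sample rate, for illustration only

def to_awg_samples(symbols: np.ndarray) -> np.ndarray:
    """Zero-order-hold resampling of symbols onto the AWG time grid."""
    n_samples = int(len(symbols) * AWG_RATE / SYMBOL_RATE)
    t = np.arange(n_samples) / AWG_RATE          # AWG sample instants
    sym_idx = np.minimum((t * SYMBOL_RATE).astype(int), len(symbols) - 1)
    return symbols[sym_idx].astype(float)

record = to_awg_samples(np.array([3, 1, -1, -3, 1, 3], dtype=float))
```

A real flow would replace the zero-order hold with proper pulse shaping and honor the instrument’s record-length and amplitude constraints.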

 

Similarly, to accurately measure the received signal, the oscilloscope executed a model of the not-yet-available receiver that included CDR, CTLE, and DFE. As above, the team used the same receiver model for design simulation.
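To give a feel for what such a receiver model computes, here is a deliberately simplified sketch of the DFE stage alone: a one-tap decision feedback loop ahead of a PAM-4 slicer. The tap weight is a placeholder, and the CDR and CTLE stages of a real IBIS-AMI receiver model are omitted.

```python
# Simplified one-tap DFE sketch; tap weight b1 is a placeholder.
import numpy as np

PAM4_LEVELS = np.array([-3.0, -1.0, 1.0, 3.0])

def slicer(x: float) -> float:
    """Decide the nearest PAM-4 level."""
    return PAM4_LEVELS[np.argmin(np.abs(PAM4_LEVELS - x))]

def dfe(samples: np.ndarray, b1: float = 0.25) -> np.ndarray:
    """Subtract the weighted previous decision from each sample, then slice."""
    decisions = np.zeros_like(samples, dtype=float)
    prev = 0.0
    for i, x in enumerate(samples):
        prev = slicer(x - b1 * prev)  # cancel post-cursor ISI from last symbol
        decisions[i] = prev
    return decisions

print(dfe(np.array([2.9, 1.4, -0.6, -3.1, 0.8])))
```

The feedback structure is the point: each decision helps cancel the intersymbol interference it would otherwise impose on the next sample.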

 

Creating a new ecosystem—in your lab

The IBIS-AMI models were developed and standardized by the electronic design automation (EDA) industry, but they have also made their way into the measurement world. As described in the PAM-4 paper, connecting the physical and digital worlds creates a measurement/simulation ecosystem. As this ecosystem comes into alignment, simulated and measured results become increasingly well-correlated (Figure 2).

 

Figure 2. A tighter connection between simulation and measurement ensures closer correlation of results.
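One simple way to quantify that correlation, offered here as an illustration rather than the paper’s method, is to align the two equal-length waveform records by cross-correlation and report a Pearson coefficient:

```python
# Illustrative sim-vs-measurement correlation metric (assumes equal-length,
# roughly periodic records so the circular shift from np.roll is harmless).
import numpy as np

def correlate_results(simulated: np.ndarray, measured: np.ndarray) -> float:
    """Align measured to simulated, then return the Pearson correlation."""
    full = np.correlate(measured, simulated, mode="full")
    lag = int(np.argmax(full)) - (len(simulated) - 1)  # delay of measured
    aligned = np.roll(measured, -lag)
    return float(np.corrcoef(simulated, aligned)[0, 1])
```

A coefficient approaching 1.0 is one sign the ecosystem is coming into alignment; in practice, eye-diagram and BER comparisons carry more weight than any single number.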

 

Mastering both realms together results in fewer design cycles and better predictability of design quality. In the PAM-4 example, appropriate application of the models makes it possible to get a useful picture of the waveform at the output of the DUT and, from that, to gain better insight into how the receiver will decode it.

 

The age-old alternative to this beneficial ecosystem is the time-consuming “cut and try” approach, which may never yield a reliable design. Worse, engineers are left to iterate their designs based on limited knowledge of system performance.

 

Going beyond “measure then know”

In reality, most teams include some engineers who are highly proficient with simulation tools and others who are deep into measurement tools. For the ecosystem to work, engineers must be able to apply tools from both worlds in a coherent manner. As teams learn, they feed new information back into the models and make them more accurate. Portions of those same, improved models can then be used to perform useful measurements.

 

This measurement/simulation ecosystem becomes a “must have” if you are doing leading-edge digital design. Within this symbiotic ecosystem, Kelvin’s idea of “measure then know” expands to become “model, measure, and know.” And that’s when breakthroughs become more predictable.