
Accelerate Your Digital Design with “Model, Measure & Know”

Blog post by Bob Witte on Nov 2, 2016

The ability to accurately measure and quantify a digital design is essential to actually knowing what’s going on. A fellow named William Thomson, better known as Lord Kelvin, captured this concept in one of my favorite quotes:

 

When you can measure what you are speaking about, and express it in numbers, you know something about it.

 

This was simple back in the good old days. To measure a digital waveform, we would just connect an oscilloscope to the right node and take a look at the waveform. Oh, and we’d be sure the scope had enough bandwidth and the probe wasn’t loading the circuit or introducing distortion. We rarely, if ever, compared the results to a simulation. Mostly, we just checked to make sure the waveform looked “about right.”

 

Changing tactics in design and test

Today, the world’s insatiable demand for bandwidth continues to drive the need for ever-faster high-speed digital interfaces. As designers try to push more bits through the channel, they’re pushing the limits of what’s possible using the latest equalization and signaling techniques—decision feedback equalization (DFE), continuous-time linear equalization (CTLE), feed-forward equalization (FFE), PAM-4 (four-level logic), and more.
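As a concrete illustration of the PAM-4 piece, here is a minimal sketch (in Python, purely illustrative rather than taken from any standard or from the paper discussed below) that maps pairs of bits onto four Gray-coded signal levels. That mapping is what lets PAM-4 carry two bits per symbol instead of the one bit per symbol of conventional two-level (NRZ) signaling:

```python
# Illustrative sketch: pack a bit stream into Gray-coded PAM-4 symbols.
# The level values and the Gray mapping are common conventions, not a spec.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def bits_to_pam4(bits):
    """Map two bits per symbol onto one of four amplitude levels."""
    if len(bits) % 2:
        raise ValueError("PAM-4 needs an even number of bits")
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

print(bits_to_pam4([0, 0, 0, 1, 1, 1, 1, 0]))  # -> [-3, -1, 1, 3]
```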

 

When characterizing the results, test equipment must often emulate those same techniques. For example, when physical transmitters and receivers are not yet available, an instrument has to mimic their respective behaviors at the input or output of the device under test (DUT). Even when the transmitter or receiver is available, it’s likely to be embedded on a chip. That makes it difficult to probe and measure—and, once again, the instrument must emulate either or both devices.

 

Addressing the problem: a real-world example

Creating accurate, realistic models is an iterative process: to make the models increasingly accurate, the latest measured results must be fed back into the simulation system.

 

Although this process has many challenges, possible solutions are spelled out in a recent DesignCon paper on measuring PAM-4 signals at 56 Gb/s: "PAM-4 Simulation to Measurement Validation with Commercially Available Software and Hardware." The DUT was a 3 m Quad Small Form-factor Pluggable Plus (QSFP+) cable, driven by an arbitrary waveform generator (AWG) and measured using a high-bandwidth sampling oscilloscope (Figure 1).

 

Figure 1. Measurement of the DUT resides within a larger process that also includes simulation.

 

The channel configuration was first simulated in software using IBIS-AMI models for the transmitter and receiver. In this case, the transmitter was not yet available, so the designer used an AWG to replicate in hardware the same transmitter waveform the simulator used. That simulator-provided waveform also included the FFE correction needed to open the eye at the receiver for clock and data recovery (CDR) and data acquisition. [Aside: During early-stage development, you can use an AWG to emulate the absent transmitter using an industry-standard model.]
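To give a feel for what that FFE correction is doing (the paper's actual tap values and waveform-generation flow are not reproduced here), feed-forward equalization can be sketched as an FIR filter applied to the symbol stream before it becomes the transmit waveform. The three tap values below are hypothetical:

```python
import numpy as np

def apply_ffe(symbols, taps):
    """Feed-forward equalization sketch: convolve the symbol stream with FIR taps."""
    return np.convolve(symbols, taps, mode="same")

# Hypothetical 3-tap (pre-cursor, main-cursor, post-cursor) FFE on PAM-4 symbols.
symbols = np.array([-3, -1, 1, 3, 3, 1, -1, -3], dtype=float)
taps = np.array([-0.1, 0.8, -0.1])
tx_waveform = apply_ffe(symbols, taps)
print(tx_waveform)
```

In a real flow, the pre-distorted samples would then be loaded into the AWG; the point here is simply that the transmitter's equalization is arithmetic on the symbol stream, so the simulator and the instrument can apply the same correction.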

 

Similarly, to accurately measure the received signal, the oscilloscope executed a model of the not-yet-available receiver that included CDR, CTLE, and DFE. As above, the team used the same receiver model in the design simulation.
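For the DFE portion of that receiver model, a minimal sketch is shown below, with made-up tap values and without the CDR and CTLE stages. Each incoming sample has the intersymbol interference predicted from previous decisions subtracted before being sliced to the nearest PAM-4 level:

```python
def dfe_pam4(samples, feedback_taps):
    """Decision-feedback equalizer sketch for PAM-4 (illustrative taps and levels only)."""
    levels = (-3, -1, 1, 3)
    decisions = []
    for x in samples:
        # Subtract the ISI estimated from the most recent decisions.
        isi = sum(t * d for t, d in zip(feedback_taps, reversed(decisions)))
        y = x - isi
        # Slice the corrected sample to the nearest PAM-4 level.
        decisions.append(min(levels, key=lambda lvl: abs(lvl - y)))
    return decisions

print(dfe_pam4([-2.7, -0.6, 1.4, 3.1], feedback_taps=[0.2, 0.05]))  # -> [-3, -1, 1, 3]
```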

 

Creating a new ecosystem—in your lab

IBIS-AMI models were developed and standardized by the electronic design automation (EDA) industry, but they have also made their way into the measurement world. As described in the PAM-4 paper, connecting the physical and digital worlds creates a measurement/simulation ecosystem. As this ecosystem comes into alignment, simulated and measured results become increasingly well-correlated (Figure 2).

 

Figure 2. A tighter connection between simulation and measurement ensures closer correlation of results.

 

Mastering both realms together results in fewer design cycles and better predictability of design quality. In the PAM-4 example, appropriate application of the models makes it possible to get a useful picture of the waveform at the output of the DUT and, from that, to gain better insight into how the receiver will decode it.

 

The age-old alternative to this beneficial ecosystem is the time-consuming “cut and try” approach that may never yield a reliable design. Worse than that, engineers are left to iterate their designs based on limited knowledge of system performance.

 

Going beyond “measure then know”

In reality, most teams include some engineers who are highly proficient with simulation tools and others who are deep into measurement tools. For the ecosystem to work, engineers must be able to apply tools from both worlds in a coherent manner. As teams learn, they feed new information back into the models and make them more accurate. Portions of those same, improved models can then be used to perform useful measurements.

 

This measurement/simulation ecosystem becomes a "must have" if you are doing leading-edge digital design. Within this symbiotic ecosystem, Kelvin's idea of "measure then know" expands to become "model, measure, and know." And that's when breakthroughs become more predictable.
