Todd Cutler

High-frequency, high-speed design revolution: Why today’s design and test flow may soon be obsolete

Blog post by Todd Cutler, Oct 19, 2016

A tectonic shift is underway in how electronic products are being designed and tested. The change is being driven by customer requirements for product performance and the communications technologies needed to meet those requirements. To understand how dramatic this change is, it helps to have a little perspective. Here’s how I explained it in my keynote address at EDICON 2016.

 

Early in my career, in the 1980s, I was a field engineer in Florida. My best customer was Motorola, and their business in pagers and SMR technology was exploding. RF design was just moving from MS-DOS to Unix. Unprecedented processor speed, memory, and display systems supported the move from netlists to schematics, from linear S-parameter simulation to nonlinear design. I was lugging 50-pound monitors and 30-pound CPUs through the Florida heat to work onsite with Motorola engineers, and man, was it exciting. We were using new tools to design voltage-controlled oscillators (VCOs) that worked the first time. It was a huge breakthrough in productivity. We were on the cutting edge of a booming industry, building breakthrough products faster and at lower cost than ever before.

 

The design flow back then was linear: Measure components and create models; then use the models to design and simulate a circuit; then test the prototype. Each stage was separate and distinct. A transistor or inductor model was developed once and used in multiple types of designs. Simulators combined the models and predicted response. Verification was simple: did the VCO sweep over the right range, output the right power, and have good phase noise?

 

What amazes me is that today, three decades later, design flows are fundamentally unchanged.

 

Instead of pagers, we have complex handheld computers in our pockets. Our cell phones have more functionality, parts are packed together more closely, and more bands are supported, leading to complex signal routing. But design flows in 2016 have not fundamentally changed since we were developing pagers.

 

True, today’s components are measured and modeled with better fixturing and probing. Circuits are designed and simulated on vastly faster computers with better algorithms and solvers. EM simulation has become a standard part of the design flow, and thermal tools are emerging to optimize performance based on temperature. In prototype testing, more sophisticated measurements are made on complex OFDM waveforms, faster and more accurately than ever before. But the workflow is the same: independent steps with limited interaction between them.

 

That world is changing in front of our eyes due to four key drivers.

  • Channel complexity. Instead of single channels, or even simple MIMO configurations, the industry is moving to massive phased array systems with hundreds or even thousands of elements. We see it in 5G and modern radar systems. To deal with this complexity, measurements are moving much closer to design because even the best compute farms cannot fully model such systems. Instead, rapid-prototyping systems are starting to be deployed by manufacturers worldwide.
  • Bandwidths. Instead of a few tens of MHz, proposed standards call for GHz bandwidths, creating even more processing challenges. On top of that, carrier frequencies have moved from RF and low microwave to well into mmW, creating new challenges in fixturing and even connector-less measurements.
  • Big data. Not long ago, 40 Gbit/s links failed for lack of market demand. Now we are seeing systems deployed at 100 Gbit/s, with a mad rush to 400 Gbit/s. Recently I was speaking to a researcher who needed to move 10 petabytes of data across the country for Big Data analysis. It’s an incredibly difficult problem to solve (the rough calculation after this list shows why). This requirement for data naturally leads to larger, more complex networking equipment. Boards are more complex, too: it’s not uncommon to see boards that are 24 x 18 inches with dozens of layers. Simulation is rapidly improving, but compute farms don’t have the capacity for those designs.
  • IoT. I recently met with a base station design team that needs to evaluate radio performance with 50,000 devices simultaneously communicating. Since combining 50,000 separate signal generators is impractical, new methods are emerging that use design tools to generate signals from a single generator that appear to be coming from hundreds or possibly thousands of devices (a minimal sketch of the idea follows this list). In this case, the tables have turned. Instead of simulation capacity being limited, it is the measurement capacity that is limited.
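
To put the data-movement problem in perspective, here is a quick back-of-the-envelope calculation. The link rate and protocol efficiency are illustrative assumptions of mine, not figures from the keynote.

```python
# Rough feasibility check (illustrative numbers, not from the keynote):
# how long does it take to move 10 PB over a dedicated 100 Gbit/s link?

data_bits = 10e15 * 8          # 10 petabytes expressed in bits
line_rate = 100e9              # 100 Gbit/s line rate
efficiency = 0.8               # assume ~80% usable throughput after protocol overhead

seconds = data_bits / (line_rate * efficiency)
print(f"{seconds / 86400:.1f} days")   # roughly 11.6 days of continuous transfer
```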

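And here is a minimal sketch of the single-generator idea mentioned under IoT: many independently modulated, frequency-offset narrowband uplinks are summed into one composite baseband waveform that a single arbitrary waveform generator could play. The device count, sample rate, and modulation are assumptions for illustration, not details from any specific product or standard.

```python
import numpy as np

# Illustrative only: build one composite baseband waveform that emulates many
# low-rate IoT uplinks so a single arbitrary waveform generator can play it.

fs = 30.72e6                 # generator sample rate (assumed)
n_devices = 500              # emulated devices (far fewer than 50,000, for brevity)
symbol_rate = 15e3           # narrowband uplink symbol rate per device (assumed)
duration = 10e-3             # 10 ms of waveform
n_samples = int(fs * duration)
t = np.arange(n_samples) / fs

rng = np.random.default_rng(0)
composite = np.zeros(n_samples, dtype=complex)

sps = int(fs / symbol_rate)          # samples per symbol
n_symbols = n_samples // sps + 1

for _ in range(n_devices):
    # Random QPSK symbols for this device, held for one symbol period each
    symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, n_symbols)))
    baseband = np.repeat(symbols, sps)[:n_samples]
    # Park each device at its own frequency offset inside the generator bandwidth
    f_offset = rng.uniform(-0.4 * fs, 0.4 * fs)
    composite += baseband * np.exp(2j * np.pi * f_offset * t)

# Scale to full scale before downloading the file to the instrument
composite /= np.max(np.abs(composite))
```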
 

These drivers require an inextricable connection between design and test. The old ways simply don’t work. Only by combining the best of design simulation and test can amazing new products and communications systems be created. Take massive MIMO systems as an example. Instead of designing the entire system in simulation, a better workflow is:

  1. Architect the overall system at a behavioral level.
  2. Design a single channel, perhaps including coupling to neighboring channels.
  3. Prototype the complete system.
  4. Measure it.

 

Using the measured results, the system-level design can be refined and improved. Prototyping will occur much earlier in the design cycle. Instead of measurements being reserved for final validation, they become an integral part of the design process. This is more than theory. Advanced 5G research labs around the world are using this exact approach.
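
As a rough illustration of that loop, here is a behavioral-level sketch of a phased array modeled as a set of per-channel complex gains: first with ideal values, then with measured per-channel data plugged back in to refine the system-level prediction. The element count, spacing, and error statistics are assumptions on my part, and the "measured" values are simulated stand-ins.

```python
import numpy as np

# Behavioral-level sketch (a simplification, not any vendor's actual flow):
# model a linear phased array as per-channel complex weights, first ideal,
# then refined with measured channel data.

n_elems = 64                      # array size (assumed)
spacing = 0.5                     # element spacing in wavelengths (assumed)
steer_deg = 20.0                  # desired beam direction

angles = np.radians(np.linspace(-90, 90, 721))
n = np.arange(n_elems)

# Step 1: ideal behavioral model -- every channel has unit gain and perfect phase
ideal_weights = np.exp(-2j * np.pi * spacing * n * np.sin(np.radians(steer_deg)))

def array_factor_db(weights):
    # Sum per-element contributions over all observation angles, normalized
    phase = np.exp(2j * np.pi * spacing * np.outer(np.sin(angles), n))
    return 20 * np.log10(np.abs(phase @ weights) / len(weights) + 1e-12)

ideal_pattern = array_factor_db(ideal_weights)

# Step 2: after prototyping, replace the ideal channels with measured complex
# gains (amplitude ripple and phase error per channel). These are simulated
# stand-ins; in practice they would come from the measured single channels.
rng = np.random.default_rng(1)
measured_gain = 10 ** (rng.normal(0, 0.5, n_elems) / 20)       # ~0.5 dB rms ripple
measured_phase = np.radians(rng.normal(0, 5, n_elems))          # ~5 deg rms error
refined_weights = ideal_weights * measured_gain * np.exp(1j * measured_phase)

refined_pattern = array_factor_db(refined_weights)
# Comparing ideal_pattern with refined_pattern shows how measured channel data
# shifts sidelobe levels and pointing accuracy at the system level.
```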

 

Consider another example: designing a high-speed network card. These cards can have dozens of serial and parallel buses to transport data. Each channel is individually simulated, and compliance tests verify performance against the specs. Then the overall card is prototyped. The board is probed and measured, and the exact same compliance test software used in simulation verifies performance against spec. Just as with the MIMO system, the overall high-speed system is then refined in simulation. Again, measurements are done earlier in the workflow and become an integral part of the design process.
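
As a sketch of how one compliance routine can serve both worlds, here is a hypothetical pass/fail check applied identically to a simulated waveform and to a measured capture. The eye-height routine, threshold, and file name are illustrative placeholders, not an actual compliance suite.

```python
import numpy as np

# Hypothetical compliance check (illustrative threshold, not a real standard):
# the same pass/fail routine runs on a simulated waveform and on a measured one,
# so the two flows share identical verification code.

def eye_height(waveform, samples_per_ui, ui_count):
    """Fold an NRZ waveform onto one unit interval and return the inner eye height."""
    folded = waveform[: samples_per_ui * ui_count].reshape(ui_count, samples_per_ui)
    center = folded[:, samples_per_ui // 2]          # sample at the eye center
    ones, zeros = center[center > 0], center[center <= 0]
    return ones.min() - zeros.max()

def check_compliance(waveform, samples_per_ui, ui_count, min_eye=0.4):
    height = eye_height(waveform, samples_per_ui, ui_count)
    return {"eye_height": height, "pass": height >= min_eye}

# Simulated channel output (stand-in: clean NRZ plus a little noise) ...
rng = np.random.default_rng(2)
bits = rng.integers(0, 2, 4000) * 2 - 1
sim_wave = np.repeat(bits, 32) + rng.normal(0, 0.05, 4000 * 32)

# ... and a measured waveform, e.g. loaded from an oscilloscope capture
# meas_wave = np.loadtxt("captured_channel_07.csv")   # placeholder file name

print(check_compliance(sim_wave, samples_per_ui=32, ui_count=4000))
```

The point is that the routine neither knows nor cares whether its input came from a channel simulator or from an oscilloscope probe on the prototype board.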

 

I’ve been in this business a long time, and the changes we’re seeing now in communications systems will challenge our industry as never before. The way our industry designs and verifies new systems is changing quickly and for the better. It takes a new level of integration, a new way of thinking, and a realization that we’re entering a whole new era. I’m thrilled to be part of it.

 

Todd Cutler is Vice President of Design & Test Software for Keysight Technologies. Read his bio or view his keynote address at EDICON 2016.
