
Stop Wasting Time and Money by Struggling with Data Analytics While Designing T&M Experiments

Blog Post created by KeysightOscilloscopes Employee on Jul 25, 2017

Written by Ailee Grumbine and Brad Doerr

 

The design-to-manufacturing (D2M) process typically involves sequential stages running from initial design through to manufacturing. Each stage requires data collection that is specified in an initial design of experiments (DOE) and aimed at providing confidence that the design can meet critical requirements. Effective data analytics tools help engineers extract the insights called for by the DOE at each stage of the D2M process. By using modern data analytics tools, teams can greatly accelerate time-to-market (TTM) while also increasing confidence in key technical decisions.


 

The first stages of the D2M process are design and simulation. The designer runs simulations to ensure that the design will meet the design specification. Simulation provides key statistics and produces waveforms that can be fed into compliance test applications. Validating the simulation results is a critical task before committing to expensive ASIC and PCB fabrication.

The next stage is design validation using test equipment such as oscilloscopes and other measurement devices. Validation engineers make measurements on multiple samples per the DOE created during the design stage. The DOE requires validation across a wide range of operating conditions, such as temperature and software configurations. The engineering team then analyzes the data with tools such as databases, PIVOT tables, JMP, R, and/or home-grown tools, working with instrument data in CSV, XML, or other formats. The challenge is that most engineering teams must manage both this data and the tools themselves, which distracts from making measurements and promptly analyzing the findings.

Next, the engineering team performs compliance testing. Automated compliance test software saves a lot of time: it automates the measurements and produces a test report with statistical analysis that lets the engineers determine the margins. This data is also very useful for deciding whether a second design cut is needed.

Once the design is validated, it can be released to manufacturing. The team identifies the production processes and measurements needed to ensure the design meets the manufacturing goals derived from the original DOE. The manufacturing team also pursues efficiency and/or yield improvements. The data provides the basis for effective manufacturing management and optimization.
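
As a concrete illustration of the ad-hoc analysis step described above, the short Python sketch below loads an instrument CSV export with pandas and builds a pivot-table summary. The file name and column names (validation_results.csv, dut_id, temperature_c, jitter_ps) are hypothetical placeholders, not part of any Keysight tool.

import pandas as pd

# Load per-measurement rows exported from the instruments in CSV format.
df = pd.read_csv("validation_results.csv")

# Pivot-table view: mean and worst-case jitter per DUT across temperature corners.
summary = pd.pivot_table(
    df,
    values="jitter_ps",
    index="dut_id",
    columns="temperature_c",
    aggfunc=["mean", "max"],
)
print(summary)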

 

When a capable data analytics platform integrates the DOE at the start of the process, the engineering team is able to achieve both efficiency and confident decisions. The DOE is created in the early stages of design and is aimed at providing the data that can answer key questions about the design. It defines the tests that need to be run in simulation and on the physical DUTs, identifies the test conditions, and specifies the number of tests that need to be run to achieve statistical confidence in the results. It is critical to choose a data analytics platform that can adapt alongside the DOE as it evolves. Nobody likes to delay a program while the team “re-architects the database schema”.
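
To make the DOE idea concrete, here is a minimal Python sketch that enumerates a full-factorial test matrix; the factor names, levels, and repetition count are illustrative assumptions only, not values from any real program.

from itertools import product

voltages = [0.8, 0.9, 1.0]      # input voltage corners (V) -- assumed
temperatures = [-10, 25, 70]    # ambient temperature corners (deg C) -- assumed
firmware = ["v1.2", "v1.3"]     # software configurations -- assumed
runs_per_cell = 30              # repetitions per condition for statistical confidence

# Enumerate every combination of conditions, repeated runs_per_cell times.
test_matrix = [
    {"voltage": v, "temp_c": t, "fw": f, "run": r}
    for v, t, f in product(voltages, temperatures, firmware)
    for r in range(runs_per_cell)
]
print(len(test_matrix), "planned measurements")  # 3 * 3 * 2 * 30 = 540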

 

There are many visualization tools on the market today that help engineers analyze their test data. However, they are usually designed for a single user who has the time to acquire deep application expertise. These tools don’t fit well in the test and measurement D2M world, especially as engineering teams become global and distributed. A visualization tool for D2M teams must provide data access to the entire team, along with well-known visualization capabilities such as histogram, sweep, box-and-whisker, and scatter plots.

 

Sweep or vector plots allow users to view two-dimensional “sweep data”. D2M and T&M applications rely heavily on sweep data such as time-domain waveforms, frequency-domain magnitude plots, and eye diagrams. The right analytics tool enables the team to overlay, for example, multiple eye diagrams captured under different test conditions. The overlay feature lets the user identify the test conditions that cause the eye to close or lose margin, allowing the designer to optimize the design for best performance. Another example of a sweep/vector plot is a constellation diagram. Figure 1 shows an example of a 5G QAM4 constellation diagram. Three sets of constellation data are overlaid, representing three different input voltages: 1V, 0.9V, and 0.8V. The plot reveals that the constellation at an input voltage of 1V has the cleanest transmitted symbols, while the constellation at 0.8V appears to have the lowest received signal quality, with potential phase noise issues.


Figure 1. Overlay of 5G QAM4 constellation data at three different input voltages (1V, 0.9V, and 0.8V)
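
The overlay idea behind Figure 1 can be approximated in a few lines of Python. The sketch below plots three synthetic QAM4 (QPSK) constellations on one set of axes, with an assumed noise level that grows as the input voltage drops; it is illustrative only and uses no real 5G measurement data.

import numpy as np
import matplotlib.pyplot as plt

# Ideal QAM4 (QPSK) symbol positions, normalized to unit average power.
ideal = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
rng = np.random.default_rng(0)

for vin, noise in [("1V", 0.03), ("0.9V", 0.06), ("0.8V", 0.12)]:
    symbols = rng.choice(ideal, 500)
    # Assumed: noise grows as the input voltage drops (illustration only).
    rx = symbols + noise * (rng.standard_normal(500) + 1j * rng.standard_normal(500))
    plt.scatter(rx.real, rx.imag, s=4, label="Vin = " + vin)

plt.xlabel("In-phase")
plt.ylabel("Quadrature")
plt.legend()
plt.show()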

Another visualization method in the test and measurement world is the box-and-whisker plot. Figure 2 shows an example of a box-and-whisker plot of a jitter measurement with multi-level split capability, where the user can split on more than one property for analysis purposes. The plot on the left is split by three usernames: Sakata, Fernandez, and Chang. The plot on the right is split by both username and input voltage. The plot indicates that most of Chang’s measurement values are higher than the upper limit, especially at an input voltage of 0.8V.


Figure 2. Box-and-Whisker plot of a jitter measurement with multi-level split capability.
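
A multi-level split like the one in Figure 2 can be reproduced with pandas’ built-in grouped box plot. In this sketch the input file and column names (jitter_results.csv, username, voltage, jitter_ps) are assumptions for illustration, not the format of any particular instrument export.

import pandas as pd
import matplotlib.pyplot as plt

# Columns assumed: username, voltage, jitter_ps (hypothetical export).
df = pd.read_csv("jitter_results.csv")

# First-level split: one box per user (left-hand plot in Figure 2).
df.boxplot(column="jitter_ps", by="username")

# Second-level split: one box per (user, voltage) pair (right-hand plot).
df.boxplot(column="jitter_ps", by=["username", "voltage"], rot=45)
plt.show()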

In summary, successful D2M programs require a clear DOE and necessarily generate a great amount of data. With upfront planning and the right analytics platform, engineering teams can optimize both effectiveness and time to market. The same data can then be leveraged for the manufacturing ramp and for manufacturing optimization.

 

Visit Keysight’s new Data Analytics Software here!
