All Places > Keysight Blogs > Insights Unlocked > Blog > 2017 > November

It’s safe to say the application of data analytics to test & measurement (T&M) is lagging behind what’s happening in the financial, insurance and retail industries. However, as an optimist, I view this situation as a greenfield opportunity for you and for us. The ultimate benefits are making better decisions in less time, delivering the right product, and getting to market faster.


This is also a good time to pause and define what you and your organization aspire to in the use of data analytics (DA). Your goal may be as bold and transformational as overhauling your product lifecycle (albeit gradually). Or it may be as basic and personal as achieving tighter control over your projects—and, in all honesty, this is what I aspire to.


Addressing the fear

Another confession: While I have a deep appreciation for the benefits of DA, I dislike the term. The reason: “data analytics” can feel so overwhelming that it produces pangs of fear among many of my colleagues and some of our customers.


To narrow the scope, we can reframe it as “data analytics for test and measurement” (DA for T&M). And we can think small: you can use DA for T&M as a way to validate your design intent. Here, the goal is to gather the right data in the right amount to say, with confidence, “Yes, this product delivers on our design goals and we can build it reliably.”


Determining your team’s maturity level

Before diving in, it’s useful to step back and assess your team’s maturity level with DA. In this context, I see four levels: reflect, predict, prepare, and influence.


  • Reflect is a look backward at what has happened in the past, aiming to describe and understand the root causes of recurring problems and, if you’ve been collecting data, investigate any troublesome trends.
  • Predict builds on reflect by creating a working theory and then outlining scenarios for what is likely to happen internally and externally.
  • Prepare is the first step toward an active attitude: given a prediction of what is likely to happen, the organization takes steps to be ready for those situations.
  • Influence is a deeper engagement with the active attitude: if predict and prepare are pointing toward unfavorable scenarios, the organization takes steps to nudge the variables toward a more beneficial outcome. This is when we start aspiring to greater levels of control.


Taking your first steps

From my conversations with customers, most are like us: somewhere between “Where do we start?” and “How do we move beyond reflect?”


It isn’t about moving mountains, at least not at first. Start with the end in mind by defining the question you’re trying to answer. Then, carefully design the “experiments” you want to conduct and make a plan based on the type and amount of data you need to measure.


Next, be sure you have the tools to make the necessary measurements (actual values). Ideally, you will also be able to capture design simulations (predicted values) prior to building anything. Even if you can address only the validation phase—between design and manufacturing—this is still a worthwhile activity.
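To make "predicted vs. actual" concrete, here is a minimal sketch of comparing simulated values against measured ones. The parameter names, numbers, and the 5% tolerance are hypothetical placeholders, not a recommendation:

```python
# Compare design-simulation predictions against bench measurements.
# All values and the tolerance below are illustrative assumptions.

predicted = {"gain_db": 20.0, "noise_figure_db": 3.2, "p1db_dbm": 18.5}
measured  = {"gain_db": 19.6, "noise_figure_db": 3.4, "p1db_dbm": 18.1}

TOLERANCE = 0.05  # allow 5% relative deviation from the simulated value

def within_tolerance(pred: float, meas: float, tol: float = TOLERANCE) -> bool:
    """True if the measured value is within tol (relative) of the prediction."""
    return abs(meas - pred) <= tol * abs(pred)

for param, pred in predicted.items():
    meas = measured[param]
    status = "OK" if within_tolerance(pred, meas) else "INVESTIGATE"
    print(f"{param}: predicted={pred}, measured={meas} -> {status}")
```

Even a crude screen like this turns "does the product deliver on our design goals?" into a question the data can answer parameter by parameter.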


As a practical tactic, apply some up-front organization: Where should everyone store their data? What format should they use? Who will manage it? What common tool will they use for analysis?
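One way to answer those questions before any data is collected is to agree on a shared record schema and a predictable file-naming convention. A minimal sketch in Python; the field names and directory layout here are assumptions for illustration, not a standard:

```python
# A shared measurement record and storage convention, agreed up front.
# Field names and the path layout are hypothetical examples.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class Measurement:
    project: str    # project or product code
    unit_id: str    # serial number of the prototype under test
    station: str    # test station that produced the value
    parameter: str  # what was measured, e.g. "gain_db"
    value: float
    timestamp: str  # ISO 8601, UTC

def record_path(m: Measurement) -> str:
    """Convention: <project>/<station>/<unit_id>.jsonl, one file per unit."""
    return f"{m.project}/{m.station}/{m.unit_id}.jsonl"

m = Measurement("amp-x", "SN0042", "station-3", "gain_db", 19.6,
                datetime.now(timezone.utc).isoformat())
print(record_path(m))          # amp-x/station-3/SN0042.jsonl
print(json.dumps(asdict(m)))   # one JSON line, appendable to that file
```

The specific format matters less than the fact that everyone on the team writes the same fields to the same place.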


Then get started by making measurements and collecting data from a batch of prototypes or pilot-run units, tested across a sufficiently broad range of properties and operating conditions (yes, this could yield several thousand data points). As the data comes in, sift through the initial raw results, apply "sanity checks" and, if everything looks good, start analyzing. This works best if your team has a common, shared tool that gives everyone access, enabling them to view the data through their individual "lenses" and flag anything that seems out of kilter. When your team monitors progress together, you can adapt to insights and optimize both your efforts and your results.
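The "sanity check" step can be as simple as screening raw readings against a plausible range before any deeper analysis. A minimal sketch, with hypothetical limits and sample data:

```python
# Screen raw measurements before analysis: drop obviously invalid
# readings (NaNs) and flag values outside an expected range.
# The range and the sample data are illustrative assumptions.

import math

raw_gain_db = [19.6, 19.8, float("nan"), 20.1, -3.0, 19.9, 55.2]

EXPECTED_RANGE = (15.0, 25.0)  # plausible gain window for this design

def sanity_check(values, lo, hi):
    """Split readings into (clean, flagged); NaNs are always flagged."""
    clean, flagged = [], []
    for v in values:
        if math.isnan(v) or not (lo <= v <= hi):
            flagged.append(v)
        else:
            clean.append(v)
    return clean, flagged

clean, flagged = sanity_check(raw_gain_db, *EXPECTED_RANGE)
print(f"clean: {clean}")
print(f"flagged for review: {flagged}")
```

Flagged points are not necessarily wrong; they are simply the ones worth a human look before they skew the analysis.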


For project managers, I suggest using the early results to make a few less-than-critical decisions. Then, take stock: How much is it helping? Are we spending less time while making better decisions? What can we tweak to improve our DA process and results?


That’s how you start climbing the maturity curve.


Pushing through the fear

For me, brief moments of anxiety still occur when my manager asks, “What’s the basis of your decision?” No one has complete control of their project—but, thanks to DA, my angst is subsiding because I can confidently respond with a credible answer.


When you push through the fear, there are meaningful rewards on the other side. These include making better decisions in less time, delivering the right product, getting to market faster... and sleeping better (most of the time).


My next post will drill down to the next layer of the story, offering tips that will help you extract insights from your “data mess.” In the meantime, let’s discuss: Have you already started down this path? If so, what is or isn’t working? If you haven’t, what’s holding you back?

Data analytics is an emerging technology that is getting a lot of attention in the press these days. As with many exciting technologies, there is a mix of real opportunity surrounded by a lot of hype. While it is sometimes difficult to separate the two, I am among those who believe data analytics can make your business run better. As with any technology development, though, a positive outcome starts with a clear definition of the question you are trying to answer.


We can start with this one: Which tangible business results can come from data analytics? For most technology-intensive companies, one key driver is getting the right new product to market quickly and efficiently. The benefits are faster time-to-market, reduced risk and lower costs. In addition, topline results will improve when data analytics is used to optimize product plans and customer engagement.


Deloitte posted an article that suggests many companies are finding value in data analytics but, because these are the early days, there’s more insight yet to come. One early conclusion: the key benefit of analytics is “better decision-making based on data” (Figure 1).

Figure 1. Summary of findings from Deloitte’s Analytics Advantage Survey (pg 4, The Analytics Advantage: We’re just getting started)


Drowning in data, grasping for scalability

Companies that create electronic products are part of the overall trend toward data analytics. In a recent report, Frost & Sullivan sees growing demand for big data analytics in the test and measurement market. Keysight is part of this ecosystem, and our design and test solutions generate critical data from measurements of product performance.


We see many of our customers swimming in this data, and some are drowning in it. There are so many data sources that it is easy to accumulate a sprawl of disconnected files that are poorly organized and difficult to manage.


This is typical, and it is why most large data analytics projects currently involve way more investment in data collection than in actual analysis. It is estimated that 80% of the effort goes into data collection and “data wrangling.” To me, “data wrangling” is the perfect phrase because it conjures up images of a cowboy tossing a rope around a spreadsheet in hopes of subduing it.


Many electronics firms have created homegrown solutions, ranging from simple collections of Excel files to complex software systems coded in C. Spreadsheet wrangling can work well for small, localized datasets, but it won't scale up because data is isolated among individual users or test stations, perhaps spread across multiple sites. Revision control may be weak, making it difficult to ensure that you have the most recent data. Worse, it drains productivity as key technical staff spend time fiddling with all those spreadsheets, and over time that translates into lost velocity toward finishing and shipping the product.


One alternative is reaching out to the IT department to create a more robust system. The resulting solution will be more scalable and supportable, but it also has internal costs. For one, changes fall to the IT team, robbing resources from other priorities. This is workable as long as all ongoing IT projects are fully supported and staffed.


Taking baby steps toward better data for better results

The actual analytics required can often be very basic. Sure, we’d like to turn on some machine-learning application that derives brilliant insight from our manufacturing test line and then feeds it back into the next design revision.


More likely, we are just trying to look at product performance in a consistent way so we can tell if the design is performing correctly. This is especially true in the prototype phase, when there are fewer devices to evaluate and the actual specification is still in flux. Later, in the manufacturing phase, we usually have plenty of data but it may still be siloed at the various production line stations or stored at another site, perhaps at a contract manufacturer.
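One small step toward that consistent view is merging siloed, per-station results into a single record per unit. A minimal sketch; the station names, serial numbers, and parameters are illustrative:

```python
# Consolidate per-station test results into one record per unit,
# keyed by serial number. The sample data is hypothetical.

from collections import defaultdict

station_a = {"SN0042": {"gain_db": 19.6}, "SN0043": {"gain_db": 20.2}}
station_b = {"SN0042": {"noise_figure_db": 3.4}}  # SN0043 not yet tested here

def consolidate(*stations):
    """Merge station result dicts into {unit_id: {parameter: value}}."""
    merged = defaultdict(dict)
    for results in stations:
        for unit_id, params in results.items():
            merged[unit_id].update(params)
    return dict(merged)

combined = consolidate(station_a, station_b)
print(combined["SN0042"])  # both stations' parameters in one record
```

Gaps become visible too: a unit missing a parameter simply has no entry for it, which is itself useful information about where the data is siloed.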


Getting better at clarifying the target problem

As you apply the ideas discussed above, you will get better at defining the business problem you want to solve using in-hand design and test data. It may be improved time-to-market, lower R&D cost, better production yield, or something more specific to your operation. The next crucial step is creating or acquiring scalable tools that enable you to get your data under control.


My questions for you: Do you see this challenge in your business? What sources of data feel a bit disorganized or maybe completely out of control? Which tools have been most useful? We will be exploring these ideas in future blog posts, so stay tuned for additional insights.