
The emergence of 5G mobile communications is set to revolutionize everything we know about the design, testing, and operation of cellular systems. The industry goal to deploy 5G in the 2020 timeframe demands mmWave over-the-air (OTA) requirements and test solutions in little more than half the time taken to develop the basic 4G MIMO OTA test methods we have today.


If you remember anything from this blog post, know this:

"At mmWave frequencies, we are going to have to do all of our testing radiated, and not just some of it like we do today for LTE, and that's a BIG deal."



First, a bit of background on the move from cabled to radiated testing, and then I’ll discuss the three main areas of testing that we're going to have to deal with: RF test, demodulation test, and radio resource management (RRM).


Millimeter-wave devices with massive antenna arrays cannot be tested using cables because there will be no possibility to add connectors for every antenna element. The dynamic (active) nature of antenna arrays means it isn’t possible to extrapolate end-to-end performance from measurements of individual antenna elements. So yes, for testing 5G, it really is time to throw away the cables…whether we want to or not!

Keep calm because we are going over the air


Correctly Modelling the mmWave Channel is the Key to Designing a 5G System That Actually Works



A new radio design starts with the reality of the deployment environment, in this case a mmWave one. How this behaves isn’t a committee decision; it’s the laws of physics, which are not up for debate! Next, we model the radio channel, and once we have a model, we can design a new radio specification to fit the model. Then, we design products to meet the new radio specifications, and finally we test those products against our starting assumptions in the model. If we have got it right—in other words, if the model sufficiently overlaps with reality—then products that pass the tests should work when they are deployed in the real environment. That's the theory. While we know how to run this process at low frequencies, mmWave is a big step up: the difference in propagation conditions is enormous, and our understanding of them is still growing.


Now let’s look at the scope of radio requirements that we're going to have to validate—that is, what we have to measure, and critically, the environment or channel in which we measure them.


The Scope of 5G mmWave OTA Testing


For RF, it’s about what is already familiar—power, signal quality, sensitivity—and those are all measured in an ideal line-of-sight channel. For demodulation, throughput tests will be done in non-ideal (faded) conditions, as was the case for LTE MIMO OTA. There we had 2D spatial channels, but for mmWave the requirement will be 3D spatial channels because the 2D assumptions made at low frequencies are no longer accurate enough. In addition, we need to include spatial interference, since the omnidirectional interference assumed for LTE is no longer realistic at mmWave due to narrow beamwidths. Radio resource management (RRM) requirements cover signal acquisition and channel-state information (CSI) reporting, signal tracking, handover, etc. That environment is even more complicated because now we’ll have a dynamic multi-signal 3D environment, unlike the static geometry used for the majority of demodulation tests.


Balancing Out 5G mmWave Opportunities and Challenges



The benefits of 5G and mmWave have been well publicized. There's a lot of spectrum that will allow higher network capacity and data rates, and we can exploit the spatial domain and get better efficiencies. However, testing all of this has to be done over the air and that presents a number of challenges that we have to solve if we're going to have satisfied 5G customers.

  • We know that we're going to have to use active antennas with narrow beams in user devices and base stations, and those are much harder to deal with than fixed antennas with wide beams.
  • We know that spatial tests are slower than cabled tests, so you can expect long test times.
  • We've got the whole issue of head, hand, and body blocking on user devices—it isn't being treated as a priority for Release 15 within 3GPP but will nevertheless impact customer experience.
  • We know that OTA testing requires large chambers and is expensive.
  • We know OTA accuracy is not as good as cabled testing—we're going to have to get used to that, particularly at mmWave frequencies where provisional uncertainties are above 6 dB.
  • Channel models for demodulation and RRM tests haven't been agreed upon yet, which is holding up agreement on the baseline test methods themselves.
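To make the uncertainty bullet above concrete: OTA uncertainty budgets are commonly built by root-sum-squaring (RSS) many individual contributions, each expressed in dB. Here is a minimal sketch; the contribution values are invented for illustration, not a real 3GPP budget.

```python
import math

# Illustrative only: combine OTA uncertainty contributions (each in dB)
# by root-sum-square, the usual way a budget total is formed.
def rss_db(contributions_db):
    """Root-sum-square combination of uncertainty contributions in dB."""
    return math.sqrt(sum(c ** 2 for c in contributions_db))

# Invented contributions (e.g. quiet-zone ripple, positioning,
# calibration, instrumentation) -- NOT a real 3GPP budget.
budget_db = [4.0, 3.0, 2.5, 2.0, 1.5]
print(round(rss_db(budget_db), 2))  # -> 6.12
```

Even when no single contribution looks alarming, the combined budget can exceed 6 dB, which is why provisional mmWave uncertainties are so much worse than what we are used to from cabled testing.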



With 5G mobile communications, there's a paradigm shift going on because of mmWave. We used to work below 6 GHz and the question we asked was, "How good is my signal?" That question led to the development of non-spatial conducted requirements. The question now for mmWave is, "Where is my signal?" That's going to lead to the development of 3D spatial requirements which can only be validated using OTA testing. This is a fundamental shift in the industry that will impact the entire design and test flow.


It’s going to be a tall order testing 5G mmWave devices. In spite of the unknowns, Keysight is committed to getting our customers on the fastest path to 5G. Stay tuned as Keysight continues to roll out 5G testing methodologies and system solutions. Explore the 5G resources currently available.


This article is an adaptation of Moray's original post published in the Next Generation Wireless Communications Blog, where you can connect with our industry and solution experts as they share their experiences, opinions and measurement tips on a number of cellular and wireless design and test topics that matter to you.

Tubbs fire aftermath, Santa Rosa

When the Tubbs fire, dubbed the most destructive and costly in California history, swept through Santa Rosa, we at Keysight found ourselves at ground zero of the mandatory evacuation zone, rushing to assess the impact on our headquarters and our 1,500 employees and their families. It tested our crisis management and leadership skills beyond what we could have prepared for, exposed the true nature of our values, and changed us in indelible ways. It also left us with a monumental challenge: how, and how fast, to rebuild. This is what we learned.


Preparedness is essential but only takes you so far.

A vetted and practiced crisis playbook proved indispensable; so did local and global crisis response teams pre-assigned to critical roles and ready to spring to action. But every crisis is unique and this one was massive. Families were evacuated in the middle of the night. Tens of thousands were displaced, nearly five thousand homes destroyed. We were left to balance the established process with the unexpected and dynamic nature of the fire, the blinding speed at which it unfolded, and the myriad related crises it created.


A strong leadership shadow drives action.

CEO Ron Nersesian at the helm of the Santa Rosa fire crisis

The fire broke out at 10 p.m. and, fanned by 50 mile-per-hour winds, reached Santa Rosa by 1:30 a.m. With CEO Ron Nersesian out of the country, the rest of the executive team had to deploy the crisis response in the middle of the night, making on-the-spot decisions for the first 13 hours while Nersesian caught the first flight back. The team set up a command center away from the fires, directed immediate action, and decided on employee aid and compensation. Our ability to take the helm during the crisis was enabled by the strong leadership shadow Nersesian had cast in his 3-year tenure as CEO.



It really does take a village.

#KeysightStrong banner put up to unite and encourage the community during the Santa Rosa crisis

We expected the rest of our 145 sites around the world to focus on business continuity. The crisis, however, proved just too big for the executive and crisis team to handle by themselves. With unreliable cell and internet coverage, employees outside Santa Rosa organized phone trees and deployed multiple forms of communications to reach impacted colleagues.  A software team in Atlanta had an SMS text solution working within hours as well as a public website for matching requests with aid. The social media team in Colorado used its channels to help employees keep track of one another. Another team set up a charitable fund. Critical to the effort was our ability to use these extended teams across the company to solve problems that could be addressed remotely.


Business continuity is not business-as-usual.

Keysight customers are innovators who win or lose in the market based on being first and best and whose timelines don’t leave room for equipment delays. We had to put mitigation actions in place, from special customer outreach, to loaner equipment, to addressing predatory actions from competitors, keeping our customer-facing teams around the world on alert.


No task too small or far-fetched.

By day 3, the crisis team had hardly slept. There was no time to eat. The CMO and CFO shopped for supplies for the command center, including a pillow and two dog beds for a makeshift bed for the crisis commander – who hadn’t slept in 36 hours. Another leader offered to come to the crisis response lead’s home to “sit there and get food and water.” An R&D director researched and set up an external call center within 2 hours to take calls from the mounting number of employees needing assistance. Whatever was needed, whatever it took… we set aside roles and titles to do the right thing.


Rebuilding in the aftermath takes more than we think

...and longer than we expect.

Rebuilding the Keysight community after the Tubbs fire

Employees who had minutes to evacuate left their homes with only their families in tow and the clothes on their backs. Rebuilding from a large-scale crisis takes time well beyond when the last embers of the fires are extinguished, and requires much more than re-opening the company’s doors.


Address physical and emotional needs. The makeshift relief center sponsored by the company addressed basics like phone chargers, underwear and bottled water. But it also became a place people could come to connect, help, get support, and receive counseling – from professionals as well as colleagues who were themselves crisis survivors. We learned to start with the most basic, then build from there as we understood other needs.


Acknowledge the heart. We had to remember these were people’s homes, families, and friends affected. CEO Nersesian’s messages to employees emphasized people first, response and resources second. We learned we couldn’t take a fact-based, checklist-driven approach when people’s lives were intertwined with the crisis.


Offer respite. When personal life is in flux, work can become something to hold on to, a source of stability. Setting up temporary work spaces while the site was being cleaned up turned out to be a source of healing, as did photos of the beloved campus when power was first restored, and of its hundreds of centuries-old trees that had survived.


Don’t rush. The reality is that we are still navigating the crisis, and may be for some time. While our culture is intact, our community must rebuild. As we move into this next phase, we’re figuring out day by day what that rebuilding looks like. It’s new homes for employees, new lives as children enroll in different schools, community gatherings to get stronger together. Now, as the fires come under control, we’re filtering decisions based on what our people, and the business, are ready for. Re-starting critical operations is a source of stability; delaying optional ones gives us breathing room to find our new normal.


We are not the same company or the same people we were before the fires. But we ARE stronger, more resilient, and more resourceful as a result of having lived through them. #KeysightStrong


There are two types of business assets: tangible and intangible. Your largest set of tangible assets is a major line item in your financials: plant, property and equipment (PPE), which includes test equipment. These days, the abundant data coming from that test equipment is among your largest intangible assets.


Such data delivers tremendous value because it can tell you what’s actually happening inside your operations—if you choose to listen to it. A few example use cases will illustrate what your data can tell you when data analytics (DA) converts it into actionable insights around key performance indicators (KPIs) such as yield, quality, throughput, utilization, and cost.


Exploring four specific examples

Many of our customers are in the early stages of applying data analytics for test and measurement (DA for T&M) to their operational data. As Bob Witte pointed out in his post, understanding your existing data helps improve overall knowledge of your business—and this enables you to define the key questions operational data can help you answer.


Among the Keysight customers who are actively climbing the maturity curve, many are applying DA for T&M in manufacturing. A majority of these projects align with one of four KPIs: warranty returns, mean time between failures (MTBF), test throughput, or quality and yield.


Reduce warranty returns

Let’s suppose one of your product lines is passing a finely tuned battery of tests in manufacturing; however, many are failing in the field and coming back as warranty returns. I suggest that you use DA for T&M to analyze data from every test in each step of your production process. Any outliers or problematic trends in the data will be visible, and you can correlate measurement results to every device by serial number. This will enable you to capture “walking wounded” devices—those that marginally pass or have been poorly reworked—before they can disappoint your customers.
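As a sketch of how that screening might work (the record fields and guard band below are illustrative assumptions, not a real Keysight data model), one simple approach flags units whose passing margin is suspiciously thin:

```python
# Hypothetical sketch of catching "walking wounded" units: devices that
# passed a test, but only just. The field names ('serial', 'value',
# 'upper_limit') and the 10% guard band are illustrative assumptions.
def flag_walking_wounded(results, guard_band=0.1):
    """results: dicts with 'serial', 'value', 'upper_limit'. Flags units
    whose margin to the limit is under guard_band (as a fraction)."""
    flagged = []
    for r in results:
        margin = (r["upper_limit"] - r["value"]) / r["upper_limit"]
        if 0 <= margin < guard_band:  # passed, but only marginally
            flagged.append(r["serial"])
    return flagged

units = [  # invented example data
    {"serial": "SN001", "value": 0.95, "upper_limit": 1.0},  # marginal pass
    {"serial": "SN002", "value": 0.60, "upper_limit": 1.0},  # healthy pass
]
print(flag_walking_wounded(units))  # -> ['SN001']
```

Because results are keyed by serial number, flagged units can be pulled for rework review before they ship and become warranty returns.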


Improve MTBF

As a regular practice, your staff may perform routine maintenance on crucial test equipment according to a fixed schedule. The upside: greater peace of mind. The downside: hours of instrument downtime, scheduled or not, and lost time for the people working on equipment that doesn’t actually need attention.


I would suggest a more efficient approach: Applying DA for T&M lets you shift from routine maintenance to preventive maintenance based on statistical, data-driven predictions of emerging issues or pending failures. This extends the mean time between failures and reduces downtime. It also leads to greater asset efficiency and utilization.
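A minimal sketch of the idea, using a made-up daily drift log and threshold: fit a linear trend to an instrument's readings and estimate when it will cross its allowed limit, so maintenance can be scheduled just before that point instead of on a fixed calendar.

```python
# Hypothetical preventive-maintenance sketch: fit a linear trend to a
# drifting instrument reading and estimate how many days remain before
# it crosses its allowed threshold. Readings and threshold are made up.
def days_until_threshold(readings, threshold):
    """readings: one measurement per day, oldest first. Least-squares
    linear fit, extrapolated to the day the fit crosses threshold."""
    n = len(readings)
    xs = list(range(n))
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in zip(xs, readings)) \
            / sum((x - x_mean) ** 2 for x in xs)
    if slope <= 0:
        return None  # no upward drift: nothing to schedule
    intercept = y_mean - slope * x_mean
    crossing_day = (threshold - intercept) / slope
    return max(0.0, crossing_day - (n - 1))  # days from the last reading

drift = [1.0, 1.1, 1.2, 1.3, 1.4]  # steady upward drift
print(days_until_threshold(drift, threshold=2.0))  # about 6 days left
```

A real system would use more robust statistics, but even this simple extrapolation turns a fixed maintenance calendar into a data-driven one.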


Accelerate test throughput

Looking across multiple lines that are manufacturing the same product, you may see significant variability in test times. Using DA for T&M, you can isolate those variations down to the exact test or measurement. This reveals actionable information about differences in test programs, and you can recommend changes that will optimize and accelerate specific tests or procedures. Taking this idea even further, one of our most advanced customers is improving throughput by applying basic machine-learning techniques to real-time data and making on-the-fly adjustments to test programs.
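One hedged sketch of isolating that variability (the log format and test names are invented): group per-test durations by production line and rank tests by the spread between the fastest and slowest lines' averages.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sketch: isolate test-time variability by comparing each
# test's average duration across production lines. Log format is assumed.
def slowest_gaps(records):
    """records: (line_id, test_name, seconds) tuples. Returns tests
    sorted by the gap between the slowest and fastest line's mean time."""
    times = defaultdict(lambda: defaultdict(list))
    for line_id, test, secs in records:
        times[test][line_id].append(secs)
    gaps = {}
    for test, by_line in times.items():
        means = [mean(v) for v in by_line.values()]
        gaps[test] = max(means) - min(means)
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

log = [  # invented example data
    ("line_A", "rx_sensitivity", 12.0), ("line_B", "rx_sensitivity", 19.5),
    ("line_A", "power_cal", 5.0),       ("line_B", "power_cal", 5.2),
]
print(slowest_gaps(log))  # rx_sensitivity shows the largest spread
```

The tests at the top of the ranking are where differences in test programs are most likely hiding, and where optimization effort pays off first.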


Improve quality and yield

Outsourced manufacturing adds complexity to many of your processes. DA for T&M opens the door to real-time process monitoring and control. For example, it can provide alerts based on variations in measurement data from specific components. This may reveal issues such as dual sourcing of components or accidental (or unauthorized) changes to test limits.
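A minimal illustration of such an alert, using invented baseline data: derive 3-sigma control limits from known-good measurements and flag incoming values that fall outside them.

```python
from statistics import mean, stdev

# Hypothetical real-time alert sketch: flag incoming measurements that
# fall outside k-sigma control limits derived from a known-good baseline.
def out_of_control(baseline, new_values, k=3.0):
    """Return the new values outside mean +/- k*sigma of the baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    lo, hi = mu - k * sigma, mu + k * sigma
    return [v for v in new_values if not (lo <= v <= hi)]

# Invented baseline from a trusted production run.
baseline = [10.0, 10.1, 9.9, 10.05, 9.95, 10.0, 10.02, 9.98]
print(out_of_control(baseline, [10.03, 12.5, 9.97]))  # -> [12.5]
```

A sudden cluster of such flags on one component can be the first visible sign of a second source or an unauthorized test-limit change at a contract manufacturer.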


Taking the next step

Operating without DA for T&M is like driving in an unfamiliar city without a map app: you’ll eventually reach your destination, but you could have gotten there faster and with less frustration. Automated tools, dashboards and reports can guide you along the entire product lifecycle—and this applies to virtually every function, department and team within your operation.


Let’s discuss: What tools are you using? Which KPIs are you tracking? What sorts of improvements have you been able to achieve?

It’s safe to say the application of data analytics to test & measurement (T&M) is lagging behind what’s happening in the financial, insurance and retail industries. However, as an optimist, I view this situation as a greenfield opportunity for you and for us. The ultimate benefits are making better decisions in less time, delivering the right product, and getting to market faster.


This is also a good time to pause and define what you and your organization aspire to in the use of data analytics (DA). Your goal may be as bold and transformational as overhauling your product lifecycle (albeit gradually). Or it may be as basic and personal as achieving tighter control over your projects—and, in all honesty, this is what I aspire to.


Addressing the fear

Another confession: While I have a deep appreciation for the benefits of DA, I dislike the term. The reason: “data analytics” can feel so overwhelming that it produces pangs of fear among many of my colleagues and some of our customers.


To narrow the scope, we can reframe it as “data analytics for test and measurement” (DA for T&M). And we can think small: you can use DA for T&M as a way to validate your design intent. Here, the goal is to gather the right data in the right amount to say, with confidence, “Yes, this product delivers on our design goals and we can build it reliably.”


Determining your team’s maturity level

Before diving in, it’s useful to step back and assess your team’s maturity level with DA. In this context, I see four levels: reflect, predict, prepare, and influence.


  • Reflect is a look backward at what has happened in the past, aiming to describe and understand the root causes of recurring problems and, if you’ve been collecting data, investigate any troublesome trends.
  • Predict builds on reflect by creating a working theory and then outlining scenarios for what is likely to happen internally and externally.
  • Prepare is the first step toward an active attitude: given a prediction of what is likely to happen, the organization takes steps to be ready for those situations.
  • Influence is a deeper engagement with the active attitude: if predict and prepare are pointing toward unfavorable scenarios, the organization takes steps to nudge the variables toward a more beneficial outcome. This is when we start aspiring to greater levels of control.


Taking your first steps

From my conversations with customers, most are like us: somewhere between “Where do we start?” and “How do we move beyond reflect?”


It isn’t about moving mountains, at least not at first. Start with the end in mind by defining the question you’re trying to answer. Then, carefully design the “experiments” you want to conduct and make a plan based on the type and amount of data you need to measure.


Next, be sure you have the tools to make the necessary measurements (actual values). Ideally, you will also be able to capture design simulations (predicted values) prior to building anything. Even if you can address only the validation phase—between design and manufacturing—this is still a worthwhile activity.


As a practical tactic, apply some up-front organization: Where should everyone store their data? What format should they use? Who will manage it? What common tool will they use for analysis?


Then get started by making measurements and collecting data from a batch of prototypes or pilot-run units, tested across a sufficiently broad range of properties and operating conditions (yes, this could yield several thousand data points). As you (and your team) are collecting data, sift through the initial raw data, apply “sanity checks” and, if everything looks good, start analyzing. This works best if your team has a common, shared tool that gives everyone access, enabling them to view data through their individual “lenses” and flag anything that seems out of kilter. By enabling your team to monitor progress together, you can adapt to insights and optimize your efforts and your results.
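A tiny sketch of the "sanity check" pass mentioned above (the record format and valid range are assumptions): split freshly collected rows into clean and suspect sets before anyone starts analyzing.

```python
# Hypothetical "sanity check" pass over freshly collected data: split rows
# into clean and suspect sets before analysis. Record format is assumed.
def sanity_check(rows, valid_range=(-50.0, 50.0)):
    """rows: (unit_id, measurement) pairs; None or out-of-range is suspect."""
    clean, suspect = [], []
    for unit_id, value in rows:
        if value is None or not (valid_range[0] <= value <= valid_range[1]):
            suspect.append((unit_id, value))
        else:
            clean.append((unit_id, value))
    return clean, suspect

rows = [("P1", 12.3), ("P2", None), ("P3", 999.0)]  # invented pilot data
clean, suspect = sanity_check(rows)
print(len(clean), len(suspect))  # -> 1 2
```

Running this kind of filter before the shared analysis tool sees the data keeps one bad test station from polluting everyone's view.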


For project managers, I suggest using the early results to make a few less-than-critical decisions. Then, take stock: How much is it helping? Are we spending less time while making better decisions? What can we tweak to improve our DA process and results?


That’s how you start climbing the maturity curve.


Pushing through the fear

For me, brief moments of anxiety still occur when my manager asks, “What’s the basis of your decision?” No one has complete control of their project—but, thanks to DA, my angst is subsiding because I can confidently respond with a credible answer.


When you push through the fear, there are meaningful rewards on the other side. These include making better decisions in less time, delivering the right product, getting to market faster... and sleeping better (most of the time).


My next post will drill down to the next layer of the story, offering tips that will help you extract insights from your “data mess.” In the meantime, let’s discuss: Have you already started down this path? If so, what is or isn’t working? If you haven’t, what’s holding you back?

Data analytics is an emerging technology that is getting a lot of attention in the press these days. As with many exciting technologies, there is a mix of real opportunity surrounded by a lot of hype. While it is sometimes difficult to separate the two, I am among those who believe data analytics can make your business run better. As with any technology development, though, a positive outcome starts with a clear definition of the question you are trying to answer.


We can start with this one: Which tangible business results can come from data analytics? For most technology-intensive companies, one key driver is getting the right new product to market quickly and efficiently. The benefits are faster time-to-market, reduced risk and lower costs. In addition, topline results will improve when data analytics is used to optimize product plans and customer engagement.


Deloitte posted an article that suggests many companies are finding value in data analytics but, because these are the early days, there’s more insight yet to come. One early conclusion: the key benefit of analytics is “better decision-making based on data” (Figure 1).

Figure 1. Summary of findings from Deloitte’s Analytics Advantage Survey (pg 4, The Analytics Advantage: We’re just getting started)


Drowning in data, grasping for scalability

Companies that create electronic products are part of the overall trend toward data analytics. In a recent report, Frost & Sullivan sees growing demand for big data analytics applied in the test and measurement market. Keysight is part of this ecosystem, and our design and test solutions generate critical data from measurements of product performance.


We see many of our customers swimming in this data, and some are drowning in it. There are so many data sources that it is easy to accumulate a bunch of disjoint files that are poorly organized and difficult to manage.


This is typical, and it is why most large data analytics projects currently involve way more investment in data collection than in actual analysis. It is estimated that 80% of the effort goes into data collection and “data wrangling.” To me, “data wrangling” is the perfect phrase because it conjures up images of a cowboy tossing a rope around a spreadsheet in hopes of subduing it.


Many electronics firms have created homegrown solutions, ranging from simple collections of Excel files to complex software systems coded in C. Spreadsheet wrangling can work well for small, localized datasets—but it won’t scale up because data is isolated among individual users or test stations, perhaps spread across multiple sites. Revision control may be weak, and it can be difficult to ensure that you have the most recent data. What’s worse, it usually turns into lost productivity as key technical staff spend time fiddling with all those spreadsheets. Over time, that lost productivity slows progress toward finishing and shipping the product.


One alternative is reaching out to the IT department to create a more robust system. The resulting solution will be more scalable and supportable, but it also has internal costs. For one, changes fall to the IT team, robbing resources from other priorities. This is workable as long as all ongoing IT projects are fully supported and staffed.


Taking baby steps toward better data for better results

The actual analytics required can often be very basic. Sure, we’d like to turn on some machine-learning application that derives brilliant insight from our manufacturing test line and then feeds it back into the next design revision.


More likely, we are just trying to look at product performance in a consistent way so we can tell if the design is performing correctly. This is especially true in the prototype phase, when there are fewer devices to evaluate and the actual specification is still in flux. Later, in the manufacturing phase, we usually have plenty of data but it may still be siloed at the various production line stations or stored at another site, perhaps at a contract manufacturer.


Getting better at clarifying the target problem

As you apply the ideas discussed above, you will get better at defining the business problem you want to solve using in-hand design and test data. It may be improved time-to-market, lower R&D cost, better production yield, or something more specific to your operation. The next crucial step is creating or acquiring scalable tools that enable you to get your data under control.


My questions for you: Do you see this challenge in your business? What sources of data feel a bit disorganized or maybe completely out of control? Which tools have been most useful? We will be exploring these ideas in future blog posts, so stay tuned for additional insights.

There was no celebration: in September, I attended my thirty-first public technical symposium in my role as Keysight’s 5G program manager. From Tampa to Tel Aviv to Taipei, whether organized by IEEE, NTIA, GSMA, or IMT-2020, such events have taught me a few things about myself and many more about how our 5G technology community manages social gatherings.


Please allow me a quick analysis. I sort these events into three buckets: “technically rich,” “overtly commercial,” and “government promotional.” Here are my snapshots of each type.


Technically rich: By turns exhausting or invigorating

I like technical, but these are simply not enjoyable when they drift into academic opacity. Still, many provide opportunities for rich dialogue with others in the industry. Three events come to mind: IEEE MTT/IMS; the IWPC meetings; and the recent International Symposium on Advanced Radio Technologies (ISART) convocation in Colorado. I exit these with a rejuvenated curiosity and a refreshed perspective about the amazing technical brains powering the communications business.


ISART was an especially impressive mix of policy makers, mobile communications experts, and satellite industry representatives. I learned a great deal about millimeter-wave (the primary topic) and gained insight into how some institutions work. I also picked up a few tidbits on spectrum policy:


  • News to me, part 1: The ITU is part of the United Nations, and ITU spectrum decisions are international treaties. Among UN organizations, the ITU is unique in allowing the participation of commercial entities.
  • News to me, part 2: The FCC, which gets a lot of press, and the NTIA, which gets very little, are sister organizations. The former manages (among other things) spectrum for commercial use; the latter manages spectrum for federal use.
  • The real story: I had assumed that the spectrum conflict between mobile and satellite was strictly technical, centered on the risk of interference. Not so: the most recent Upper Microwave Flexible Use Service (UMFUS) report and order (R&O) from July 2016 is the source of discord because the FCC wants to reduce the risk of interference by placing tight restrictions on the location of large ground-based satellite gateways (i.e., terrestrial links to satellite constellations). Jennifer Manner of EchoStar suggested these rules are not even practical.


Overtly commercial: Have you read our press release?

GSMA’s Mobile World Congress in Barcelona is the most prominent example. While I have had excellent discussions with some of our key customers at this event, I sheepishly admit that MWC has additional appeal because it is a great excuse to explore and enjoy one of my favorite cities. On the other side of the Atlantic, the GSMA teamed up with CTIA for the first time to create Mobile World Congress Americas 2017. Although I would love to write about the event, I was not there; rather, I was in Taiwan, attending the third kind of gathering...


Government promo shows: Not purely self-promotion

The 4th Taipei 5G Summit was a two-day event organized by a group within the Taiwan Ministry of Economic Affairs. It was coupled with the 21st World Conference on Information Technology. I had the honor of speaking at this event and focused my talk on getting the audience to think about a sampling of measurement and validation challenges in 5G New Radio (NR).


Taiwan is an interesting case for 5G communications in that its indigenous mobile operators will be very cautious about investing in 5G—a reluctance driven particularly by the failure of Taiwan’s WiMAX business model. However, Taiwan-based multinational technology giants like Hon Hai, TSMC, WNC, Quanta, and Pegatron will all take full advantage of the global investments in 5G technologies. Based on what I saw from both National Taiwan University and National Chiao Tung University, it is clear that academia is also fully engaged in a very impressive manner.


Among the many highlights from the event was a presentation by Tareq Amin of Reliance Jio. Mr. Amin deftly detailed how Jio completely changed its technology investment paradigm to implement a financially stable LTE network in India, a country with an ARPU of about $2 (vs. about $60 in the USA). Ordinarily, I resent sitting through presentations that are thinly veiled sales pitches (the Taiwan Summit had a few of those). Mr. Amin did indeed talk about Jio’s success: it achieved #1 LTE penetration in the world in seven months, processing some 7 petabytes per month. However, his real message was about innovation that follows from a drastic change in perspective when confronted with unprecedented boundary conditions. It was the most inspiring talk I have heard perhaps all year.


What have been your experiences?

Here I expose myself to comments from those of you who have had to listen to my talks. What inspires you at these events? Will 5G be successful in your environment? What will it take?

When Google employee James Damore sent out an internal memo questioning his company’s diversity policies, he likely didn’t expect that it would go viral. It did. In its wake, a firestorm of discussion emerged in Silicon Valley and beyond—one that still rages today. Whether you agree with James’ assessment of gender bias and diversity in the workplace or not, one thing we can all agree on is that it brought renewed focus to a timely topic, one that should be just as important to men as it is to women.

It’s a topic I’ve come to know well over the span of my 35-year career in the tech industry, and one that I recently had a chance to reflect on during an interview I gave to the Society of Women Engineers (SWE). My personal journey has taken me from a man unaware of his own unconscious bias toward women to one who now serves as their full-on diversity partner. It’s a journey I’m hoping more men will take.


Becoming an advocate for women

For me, that journey began back in the 90’s. I was a young manager at the time, fairly new to management in fact. I had been trying to promote women in my organization and because of my efforts, received an invitation to the HP Technical Women’s Conference in Silicon Valley. I was one of only two men in attendance.

During the event, I’d sit in a room with what seemed like 5,000 women and listen to speakers talk about unconscious bias. I watched in amazement as every woman in the room agreed that it was a problem. I had never thought about it before—how my unconscious bias toward women could actually be contributing to the problem. That idea, and many of the other issues the speakers raised, weren’t even on my radar. It was a wake-up call.

That realization is just as relevant today as it was back in the 90’s. Consider a 2012 study (Moss-Racusin et al., 2012) designed to examine how applicants were evaluated for a lab manager position. Faculty were presented identical applications for the position, the only difference being that candidates were given obvious male or female names. Despite having the same qualifications, both male and female faculty routinely rated female student candidates lower than their male counterparts. The female applicants were seen as less hirable and offered lower salaries. The study revealed what many of us already know—there’s a clear unconscious bias against women in the workplace.

What I learned about unconscious bias from that conference I attended all those years ago stayed with me throughout my career. It helped transform me into an active and full-on diversity partner with women. What’s more, I make it my goal to ensure other managers and emerging leaders can attend similar conferences in the hopes that they too will have the same experience and transformation. 


Turning awareness into action

The experience alone is not enough. Being aware of unconscious bias is certainly the first step in helping to overcome it, but for that awareness to make any real difference it has to lead to tangible action. In other words, men have to become allies with and advocates for women, whether in engineering or any other profession.

What can you do to be an ally for women in the workplace?


1.  Uncover your hidden biases about gender diversity in the workplace.

You don’t have to attend a conference like I did to find out if you have an unconscious bias toward women. All you need to do is take a quick and easy online Implicit Association Test (IAT), such as the Gender-Career or Gender-Science IAT.


2. Watch a video on techniques for ending sexism in the workplace.

In just 10 minutes, you can watch a video on five easy things you can do today to promote diversity in the workplace. Watch “5 Ways Men Can Help End Sexism”.


3. Start your own personal diversity action plan.

A diversity action plan is a written plan of the actions you personally intend to take to promote gender equity and diversity in the workplace. My plan involves a number of different things. As a manager, I am vigilant in assigning new projects or tasks based on skill, but also with a mindset toward providing growth opportunities for women engineers. I look for ways to help them grow and gain higher visibility.


I also actively go through my data on employee pay equity once a year. I examine job category, employee gender, where they work, how much they make, etc., to make sure that all employees are paid equitably and that we have a good female-male split. If adjustments need to be made, we determine how best to do that. And I try to make sure that all engineers, women and men, are set up for success with good work/life balance.

One of the great tools we have to promote work/life balance for our employees at Keysight is the Hidden Valley Elementary K-3 school, built on Keysight’s land in Santa Rosa, California. Parents at the Santa Rosa site have the flexibility to join their kids for lunch, participate in school activities, or even help out in the classroom during the day. It’s been key in ensuring our engineers feel that they can have both a family and career.


Another critical part of my personal action plan is serving as a member of the Keysight - Society of Women Engineers Enterprise Program (KSWEEP). KSWEEP is an organization within Keysight designed to support the efforts of the Society of Women Engineers (SWE). We do this by sponsoring employees to attend SWE conferences and providing communities, networking, professional development, and outreach opportunities throughout the year. We also sponsor local activities and recently helped SWE expand its global footprint into Penang, Malaysia.


And because I feel strongly that it’s the responsibility of male senior and executive leaders to help the next generation of male engineers learn about gender equality and diversity issues, I’ve made it my goal to actively encourage those leaders to take up that role and become part of the solution. One way I’ve done that was to bring one of my top male engineering managers to last year’s SWE conference.


Those actions are all part of my personal action plan. Your plan might look quite different. You could choose to:

  • Talk with your female colleagues and really listen to what they share about their gender-related experiences in the workplace. Seek out opportunities to gain that access.
  • Ensure female colleagues have equal time to speak at meetings and be sure to share information equally with female and male colleagues. Minimize mansplaining.
  • Make sure female colleagues get included and invited to informal work gatherings.
  • Demonstrate your commitment to gender equity to your colleagues. Take corrective action when you notice gender inequities and bias. Your silence otherwise makes it acceptable.
  • Nominate women for awards, honors and work positions, when and where applicable.
  • Start a work committee designed to encourage men, too, to act as allies for gender equity in the workplace. If one already exists, volunteer to help.
  • Recognize that women often have disproportionate responsibilities for child and elderly parent care. Turn that awareness into action by supporting work/life balance and doing things like planning meetings with consideration for your team’s personal/family schedules.

Find your voice

Whatever you choose to include in your personal action plan, the point is to do something. Don’t be part of the problem; be part of the solution. It’s a journey we can all take together, and one that promises to positively transform not only the workplace, but society in general.


You can start on your own personal journey by sharing this blog post with your colleagues. Use it to open up a discussion on gender bias and diversity. As I did all those years ago, you may just find yourself really seeing the issue for the first time, and deciding to become an active diversity partner for women in your workplace.

Some businesses view instrument calibration as a best practice, others view it as a compliance issue driven by industry standards, contractual agreements, or certification requirements. In all cases, there’s a decision to be made when it comes to instrument calibration. Does it make more sense to do it yourself or outsource it? I’ve discussed the pros and cons with manufacturing executives around the world, and while there’s a case to be made for both sides, I see six strong arguments for outsourcing equipment calibration services.   


1.  Faster turnaround

While an internal calibration lab can sometimes move faster than an outsource team, I find that outsourcing consistently gets faster results, especially when you do a true apples-to-apples comparison. For example, one of our large aerospace/defense customers had an in-house calibration lab that had a two-week service-level agreement (SLA) with their internal customers for all calibration services. The internal calibration lab appeared to meet their SLA consistently. But on deeper investigation, it was discovered that only the simpler tools, such as torque wrenches, were being calibrated within the two-week window. More sophisticated instruments that measured RF and microwave systems were actually taking four to six weeks for lab calibration. By outsourcing, the customer was able to reduce their SLA to just one week for all instrument calibration services, even for the most complex instruments.


2. Lower OPEX

In the example above, when a complex instrument required four to six weeks for calibration, the internal calibration lab was forced to use spares and rentals to meet the two-week window specified in their SLA. Spares and rentals cost them millions of dollars per year. By outsourcing, they not only reduced their turnaround time by 50 percent, but also reduced their annual operating expenses by about 20 percent by completely eliminating the cost of rentals, storage space for spares, and the man-hours required to procure, manage, and maintain the excess inventory. I’ve seen the same scenario play out for other customers as well. In my experience, an OPEX savings of 10 to 25 percent is common when customers shift from in-house to outsourced instrument calibration services and compare all costs, not just specific calibration costs.
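The arithmetic behind that kind of OPEX comparison is simple to sketch. Here is a back-of-envelope model in Python; every figure in it (calibration volume, unit prices, spares/rentals spend, admin hours, labor rate) is a hypothetical illustration I chose for the example, not data from the customer case above.

```python
# Back-of-envelope comparison of in-house vs. outsourced calibration OPEX.
# All numbers below are hypothetical illustrations, not customer data.

def annual_cost(cals_per_year, cost_per_cal, spares_and_rentals,
                admin_hours, hourly_rate):
    """Total yearly calibration spend: per-unit work plus overhead."""
    return (cals_per_year * cost_per_cal
            + spares_and_rentals
            + admin_hours * hourly_rate)

# In-house: cheaper per calibration, but a spares/rentals pool and the
# hours spent managing it pile up while instruments wait in the queue.
in_house = annual_cost(2000, 450, 400_000, 2000, 75)

# Outsourced: higher unit price, but no spares pool and far less admin.
outsourced = annual_cost(2000, 520, 0, 1000, 75)

savings = 1 - outsourced / in_house
print(f"OPEX savings from outsourcing: {savings:.0%}")
```

With these assumed inputs the model lands at roughly 23 percent savings, inside the 10-to-25-percent range noted above; the point of the sketch is that the overhead terms, not the per-calibration price, dominate the comparison.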


3. Lower CAPEX

Some companies excel at in-house calibration, but for most, it’s a departure from their core business. Dedicated outsource vendors, on the other hand, provide calibration services as a core competency, and make a substantial capital investment to deliver those services. Whereas budget limitations might force an internal calibration lab to use older, slower hardware at the edge of its lifecycle, a reputable calibration vendor will have the latest hardware and software and dedicated teams that know how to use it. This calibration-as-a-service business model removes CAPEX impacts as well as lifecycle concerns for manufacturers, and provides access to state-of-the-art calibration capabilities at all times.


4. Future-proofed test

Competitive manufacturers should be free to chase new opportunities without worrying about lab calibration capabilities or having to spend capital to upgrade a calibration lab as the business evolves. For example, many wireless manufacturers are now making the leap from 4G to 5G wireless, which means a lot of next-generation test equipment is being introduced into design and manufacturing environments. Older calibration hardware and software isn’t designed for the higher frequency ranges of 5G, but in an outsourcing model, that’s not the manufacturer’s problem. The calibration vendor is on the hook to stay ahead of 5G test requirements. It’s a CAPEX benefit for manufacturers since they don’t have to purchase and learn new calibration equipment. More importantly, it’s a competitive advantage since they can invest their energy and resources in being first to market rather than trying to be calibration experts.


5. Improved audit compliance

Since most manufacturers now have global supply chains, it’s common for test equipment to be calibrated in one location but used in another. This sometimes makes it difficult to find records and documentation for ISO audits and other record-keeping functions. A reputable outsource partner will maintain records in a centralized database as part of the SLA between the outsource calibration lab and the customer. The records are available 24/7 regardless of where the instrument calibration services are performed, so compliance can always be proven and audits are less of a headache.


6. Scalable resourcing

With natural attrition of an aging workforce, it can be hard to keep qualified calibration engineers on staff. And in many companies, engineers who retire are not replaced. Even if a new job requisition is opened, it’s nearly impossible to replace the specialized skills needed in a calibration lab. By comparison, outsource vendors have a pool of dedicated resources who are trained on the latest calibration equipment. Manufacturers have access to a scalable resource pool without having to staff up or down based on business conditions.


I’ve had many conversations with calibration lab managers and manufacturing executives over the years, and I know that outsourcing is not for everyone. If your product line is stable and not evolving, if your SLAs are truly being met, or if CAPEX is not an issue, then an internal calibration lab is probably fine. But if emerging technologies or new business opportunities are creating a need for capital investments to upgrade your internal calibration lab, you should do a financial audit to determine the true cost of outsourcing versus keeping it in house. Make sure you take into account not only hardware, floor space, and SLAs but also the ongoing expertise required to write automated calibration procedures to get the turnaround you need. If you do decide to outsource, it pays to do your homework on the quality of the calibrations you’ll be receiving. Not all calibration partners are created equal. Choose one that knows its stuff, and good luck.


Eric Taylor is Vice President & General Manager of Managed Services Division – West. Read his bio at

5G has been picking up speed faster than a downhill racer—and may seem similarly on the hairy edge of control. From Mobile World Congress (MWC) in Barcelona to the latest 3GPP meetings, the buzz is growing louder around topics ranging from fixed wireless to “what’s our next business model?” With apologies for going dark since late March, the following Top Five Hot Topics update is aimed at keeping you abreast of the latest.


1: Fixed-wireless systems are the first step to millimeter-wave 5G

Even with NEMs and device makers anxious to be part of the 2018 winter showcase in South Korea, the first highly visible commercialization will be Verizon’s fixed-wireless system. For two years, VZW has been stalwart in its intent to commercialize in 2017 using its “pre-5G” 5G-TF specification. However, this spec is unique, with some fundamental differences from 3GPP’s New Radio (NR), and orphaned technology is a very real risk. On the plus side, VZW is gaining a first-mover advantage by learning how to enable commercial access without introducing the complexity of mobility.


In response, AT&T has also stated it will deploy a 5G mmWave fixed service, one that will be based on NR. This is why AT&T pushed very hard for the acceleration of the non-standalone (NSA) version of NR.


2: 3GPP accelerating NSA enables NR to happen before R15

“Before R15” means late 2017. With NSA, the 5G radio access network (RAN) will be controlled by the LTE core and hence LTE can be used to set up calls, manage initial access, and even provide fallback capability if the 5G RAN link fails.


The combined effect of Verizon’s decisions and Korea Telecom’s push to commercialize 5G in 2019 drove the industry to ultimately agree to the acceleration. The resulting commercial play will occur somewhat sooner but, in some ways, will be more risky than originally planned.


3: Greenfield spectrum between 2.8 and 5.0 GHz outside the US

5G wireless technology is not just millimeter-wave. Outside the US there is open spectrum between 2.8 and 5.0 GHz that is getting serious attention from major operators in China, Japan, South Korea and several European countries.


Because this is ideal territory for 5G Massive MIMO, much of this will occur in TDD spectrum. It is not clear how, when, or whether operators will reallocate 3G and 4G spectrum to NR—but the territory between 400 MHz and 3 GHz may undergo a change in the middle or late 2020s based on the relative success of NR in terms of its spectral and energy efficiency.


4: Rapid virtualization of networks will drive flexibility

In April 2016, AT&T announced that its evolved packet core (EPC) was all virtual. Demos at MWC suggested that virtualizing the RAN could reach all the way down to the upper MAC layer. This means network equipment becomes computers, wires (or fiber), and remote radio heads—likely with enough on-board ASIC power to manage the lower MAC, Layer 1, and PHY.


This can enable software-defined networks (SDN), self-organizing networks (SONs) and, further, network slicing. This last item had many compelling demos at MWC and enables operators to create, perhaps in real-time, network slices manifest as software entities. These have different KPIs, depending on what each slice will do for its associated client and business model. For example, a high-reliability slice could take network resources from other slices in order to meet service level agreement (SLA) criteria. While this will prompt serious looks at new business models, it appears to be a case of technology moving faster than policy because one could suggest this is a step towards violation of net neutrality (in those places where it still exists...).


5: Business cases for 5G wireless investment are still under scrutiny

There is still much skepticism about where the money will come from. Debates continue to rage about why any operator would significantly invest simply to develop better capability for a client base with declining average revenue per user (ARPU). Yet, new concepts abound: 

  • Third party pays: This is a big one that could expand mobile advertising (e.g., Google). One can envision combining this with entertainment schemes that could function only on a fifth-generation network (think “PokemonGo! 2020”).
  • Automotive/connected car: There are numerous ideas here, and many are associated with third party pays—especially those related to infotainment and entertainment. Most of the hype is related to the facilitation of real-time navigation and automated driving; however, I believe this will happen more slowly than ad- or entertainment-related applications due to policy, technology, and sociological issues.
  • Industrial or municipal: IoT can save billions for municipalities and commercial entities that need to collect data remotely on a real-time basis. This could enable machine learning to manage things even as mundane as parking and garbage management, or open the door to serious data analytics. 

Those are my Top Five Hot Topics. What other trends or developments are you seeing?


In my first post, I described how counterfeits can be found anywhere and that Keysight can no longer ignore this fact. Our customers know this too, which is why they pressure their suppliers to put systems in place to avoid the introduction of any suspect material into their supply chains. Keysight solved this customer challenge by working with some of our most demanding customers in the aerospace/defense industry. By working together in a transparent manner, we ended up developing a government-grade counterfeit avoidance and response system that meets the US Department of Defense’s exacting DFARS (Defense Federal Acquisition Regulation Supplement) requirements.


Here are seven practices that came out of that collaboration:

  1. Ensure traceability. It’s imperative to be able to trace a product back as far as possible to the raw material. Lack of traceability is a huge risk indicator. Electronic components typically use date or lot codes, and if available, certificates of conformance.

  2. Buy from authorized channels. This may seem like a no-brainer, but some deals appear too good to pass up. And if you find yourself in a line-down situation, you may find yourself considering the grey market. Buying directly from the original component manufacturer or from one of their franchised distributors is the best upfront mechanism to vet suspect material. Once you move outside of these spaces, you enter the open market, which is fraught with risk and bad actors looking to make a quick buck. The best thing you can do is “know your supplier.”

  3. Put a response system in place. What data must be collected, and who needs to be informed if a suspected counterfeit problem pops up? This response system must address internal concerns (failure in a Keysight instrument; evidence of rogue procurement from a bad actor) as well as industry alerts (a GIDEP alert notifying industry that counterfeits have been found). Ownership of the issue from beginning to end is critical.

  4. Qualify a few reputable independent distributors. In the electronics world, independent distributors play a unique role in helping to locate and test for legitimate material. Strong independents differentiate themselves from pure brokers through membership in electronics organizations such as ERAI. They are typically set up with labs to perform robust visual and often electrical inspections. Force your purchasing arm to use only these qualified independents when facing a shortage. Keysight actually allowed one of our most demanding Aerospace Defense customers to audit our approved independents.

  5. Design for supply chain. This should go without saying, but do not design a product with components that are nearing the end of their lifecycle. This leads to large lifetime buys and exposure to the grey market. Along similar lines, consider buffering inventory for strategic sole-sourced material.

  6. Flow requirements down through your supply chain. You can add multiple layers of security by flowing counterfeit avoidance and response requirements down through your supply chain, whether to a contract manufacturer, semiconductor manufacturer, vineyard, or stockyard. Don’t assume anything. Trust but verify.

  7. Consider a tabletop exercise and be willing to go through a customer audit. Keysight performed an extensive internal walkthrough process and then volunteered to go through a three-day customer audit of our counterfeit avoidance and response system. This was followed by visits to two of our independent distributors. This process demonstrated to our customer that we were serious about keeping the supply chain clean and serious about caring for our customers.


Although these practices were developed for the demanding Aerospace Defense industry, they can apply to a range of industries, from food to wine to pharmaceuticals to clothing and many more. Most industries deal with counterfeits in one way or another, and every competitive company on the planet takes its reputation and its relationship with customers seriously. With apologies to Groucho, that’s something you can’t afford to fake.

Our small town recently held an electronics recycling day to help families get rid of unwanted printers, laptops, TVs, and other devices. Just bring them to the local school and the devices get recycled for free. Great idea, right? Clean out your garage and promote sustainability at the same time!


“Reduce and reuse” is the prevailing mantra today, whether for used electronics or grocery bags. However, there’s a dark side to these electronics recycling events, with unintended consequences that affect manufacturers and consumers around the globe. Today’s news is full of stories about used and counterfeit components finding their way into supposedly new, state-of-the-art products. In fact, counterfeit goods are everywhere these days, accounting for nearly half a trillion dollars in sales each year, or nearly 2.5 percent of all global imports. Electronic components are the single biggest piece of the counterfeit pie, generating billions of dollars a year for ethically challenged parts suppliers. It’s a pervasive problem with serious implications for original equipment manufacturers and their customers. Here are the top four impacts of counterfeits.


End-user injuries

If you buy a Rolex watch on a street corner for $25, nobody gets hurt. But if you get an incredible deal on a hoverboard, don’t give it to your kid and definitely don’t try bringing it on an airplane. Last year, over 16,000 counterfeit hoverboards were seized by customs officials in Chicago after a well-documented series of fires injured people using the two-wheeled skateboards. In other news, a report by the U.S. Senate Armed Services Committee was profiled in Forbes a few years ago, citing the presence of counterfeit electronic components in military and commercial aircraft. As the author of the article states, how comfortable would you be on your next commercial flight if you suspected that a key onboard system contained counterfeit parts? Thankfully airplanes have multiple layers of redundancy, but that doesn’t change the fact that real people and real lives are affected by counterfeit components in our increasingly electronics-centric world.


Critical mission failure

No matter what you make as a manufacturer, mission failure is always bad. However, in the tech-laden aerospace/defense industry, mission failure can be catastrophic. Military and space programs rely heavily on the authenticity and reliability of their electronics. Several years ago, a Bloomberg study reported that military jets bound for Afghanistan were outfitted with counterfeit memory chips, putting missions and lives at risk in a hot war zone. A separate report by WND concluded that high-altitude thermal missile sights delivered to the U.S. Army were compromised by counterfeit electronics components, and that the problem is widespread across U.S. defense systems. It’s one of the reasons the Department of Defense (DoD) implemented the Defense Federal Acquisition Regulation Supplement (DFARS) in March 2014 to detect and avoid counterfeit electronic parts. The DFARS provides specifications for robust counterfeit avoidance and response systems, helping Keysight and U.S. government prime contractors meet the most rigorous standards of counterfeit detection, traceability, and response on the products we deliver. The DFARS continues to evolve, and the latest update by the DoD, in August 2016, introduced even stricter requirements for Keysight and prime contractors. Keysight has invested significant time, energy, and resources to demonstrate compliance with this regulation, including a full-blown aerospace/defense customer assessment in late 2015.


Decreased product reliability

A U.S. Senate report on counterfeit electronic components in defense supply chains contains an eye-opening description of the Shantou, China, counterfeiting district, where used electronic parts are extracted from old devices, washed in a river, dried on a riverbank, and sorted on a sidewalk. According to the report, the parts are then collected in “plastic bins filled with expensive brand-name components harvested from scrap printed circuit boards ready for processing.” Compare that to clean-room operations at reputable semiconductor facilities, where the air is one thousand times cleaner than the air in a hospital operating room. It’s the only way to ensure that components and boards are entirely free of dust and contaminants so they function exactly as intended, day after day.


Increased OPEX

Releasing defective products into the marketplace can be costly in a variety of ways. At the very least, handling a failed product increases operating expense by having to deal with Return Merchandise Authorization (RMA) paperwork and processes. Depending on the product, you may incur the cost of diagnosing the problem and modifying manufacturing processes. In extreme cases, shipments might be delayed while you take your design back to the drawing board to determine the root cause of a defect. If the defect is ultimately traced back to a counterfeit component, you’ll have incurred those costs, and taken hits to your profit and reputation, for no reason. This is why Keysight has implemented and maintains a stringent counterfeit avoidance and response system, which we stress-test internally and with our subcontractors in an effort to continue to provide best-in-class instruments.


Our goal at Keysight is to ensure our customers receive authentic, genuine Keysight instruments. It’s not a trivial task in a global company where supply chains stretch to every corner of the world. But we have processes in place to block counterfeit devices from getting into our ecosystem, and to ensure the authenticity of every electronic component we put in our test systems. Illegitimate materials and fraud are real challenges across the industry, and as the counterfeiters get better, we will continue to work even harder to ensure that we stay one step ahead of the bad guys. In my next post, I’ll share some of the ways we’re doing that. Our best practices for addressing counterfeits have made a difference for us and might be useful for your organization as well.

Bill Hewlett used to say that measurement is the key to changing behavior. For example, if a driver wants to consume less gasoline in a car, measuring how driving habits affect gas mileage is the way to go. If a homeowner wants to use less electricity, measuring consumption over the course of a typical 24-hour day would reveal where usage is highest and thus where savings can be made. The “aha” moment—whether for consumers or product engineers—almost always comes from measurement. That’s what ends up driving a big change or breakthrough. Measurement is what brings insight, improvement, and innovation. It’s especially true in electronics because human beings do not have the senses to “measure” variations in electrical phenomena. Instrumentation is the only way to do it.


I’ve seen proof of the connection between measurement and innovation in the high tech world many times over the years. And I know from experience that the bigger the innovation, the stronger the connection. Here are three scenarios where I think it’s vital for manufacturers to “look beyond the catalog” when working with a measurement vendor. In each of these cases, manufacturers found that collaborating with a measurement supplier as a true partner was the key to bringing innovations to market.


Scenario 1: The technology you’re bringing to market goes far beyond existing technologies.

Incremental advances in technology can usually leverage existing test instruments. But big, disruptive advances in technology often require a similar leap forward in test and measurement equipment and processes. One good example is terabit transmission in long-haul optical links. This technology is in high demand and evolving quickly due to the explosive increase in internet traffic. The technological innovation required to achieve terabit transmission speeds was made possible only through the cooperation between measurement companies and manufacturers: Optical modulation technologies could be developed, characterized, and finally produced only by creating a new category of measurement instruments—today’s optical modulation analysis tools. 


Scenario 2: Your innovation crosses domains from one market to another.

There are many examples of innovations that were initially developed for one market or industry and then found their way into other markets. Tech-laden innovations from the electronics manufacturing world can make the jump, but quite often, the cross-over requires input from a measurement vendor to address deployment challenges and measurement problems. For example:  

  • Connected car technology. Most cars today have a radio unit that connects the car through the wireless network to the internet, making these cars essentially fast-moving cell phones. Many car makers have experience designing mechanical and electro-mechanical systems, and can benefit from the deep insight a measurement company brings to the game with rich experience in testing wireless systems. Expertise across both domains is table stakes when developing today’s vehicles.
  • Phased-array antenna technology. Widely deployed in aerospace/defense applications, this technology is now crossing over to the automotive world with breakthroughs in adaptive cruise control, blind-spot detection, and collision avoidance/mitigation systems. The challenge in automotive applications is related to consumer expectations—the technology needs to be small, affordable, and utterly reliable to be viable in consumer applications. While phased-array technology might be new to automotive engineers, it’s well known to measurement vendors who have experience in aerospace/defense. Automotive manufacturers can tap that expertise to more quickly integrate the technology.


A Keysight customer recently made the case for collaboration between manufacturers and test vendors when crossing domains. “With 5G…we’re tapping into an area of the radio spectrum that has been a big unknown for the mobile industry,” said Woojune Kim, vice president of Next-Generation Strategy, Samsung Electronics, in a recent press release. “Being able to work closely with Keysight and leveraging their expertise with network simulation, RF, and millimeter wave technologies is an advantage for our product validation efforts.”


Scenario 3: You need to transform a manufacturing process to meet business goals.

One of our customers, a large Asian manufacturer of lithium-ion battery cells, had a business challenge, a process challenge, and ultimately a measurement challenge for Keysight. The company’s manufacturing process typically yielded a number of battery cells that got flagged for retesting in post-production test. Retesting is expensive: The cells are charged, measurements are taken over a period of a few weeks, and finally the level of discharge is calculated. The expense comes not only in lost time but also in warehousing: cells need to be stored somewhere while they self-discharge, adding inventory expense to the manufacturing process. The customer had a measurement request for Keysight: Could we find a way to reduce their retest cycle from three or four weeks to one hour? Behind it were the twin business goals of reducing inventory and reducing production-related expenses. Our answer was to invent a new product—a self-discharge analyzer—that uses advanced voltage and current matching techniques to measure a battery’s discharge rate in less than an hour. This solution eliminated both the self-discharge wait time and the inventory storage space for retesting batteries. It’s one of my favorite examples of how collaboration between a measurement vendor and a manufacturer can trigger significant innovations—at the business level, the process level, and the product level.
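The contrast between the two approaches—waiting weeks for the open-circuit voltage to decay versus directly supplying and measuring the leakage current—can be sketched with a toy model. To be clear, this is an illustration of the general idea only; the leakage-resistance cell model and every number in it are my own assumptions, not Keysight’s actual method:

```python
# Illustrative sketch only -- not Keysight's actual technique. The cell is
# modeled as an ideal voltage source with a leakage resistance R_sd across
# its terminals; all component values below are made-up assumptions.

def self_discharge_current(ocv_volts: float, r_sd_ohms: float) -> float:
    """Leakage current of a cell modeled as an OCV in parallel with R_sd.

    The fast-measurement idea: clamp the cell at exactly its open-circuit
    voltage with a matched external source, so the source (not the cell)
    supplies the leakage -- the measured source current IS the
    self-discharge rate, available in under an hour instead of weeks."""
    return ocv_volts / r_sd_ohms

def weeks_until_voltage_sag(ocv_volts, r_sd_ohms, sag_volts, dv_per_ah):
    """Rough duration of the traditional retest: time for the open-circuit
    voltage to sag by sag_volts at the cell's leakage rate.
    dv_per_ah: how far the OCV drops per amp-hour of charge lost."""
    i_leak = self_discharge_current(ocv_volts, r_sd_ohms)  # amps
    charge_lost_ah = sag_volts / dv_per_ah                 # Ah to lose
    hours = charge_lost_ah / i_leak
    return hours / (24 * 7)

# A 3.9 V cell with an assumed 100 kilo-ohm leakage path:
i_sd = self_discharge_current(3.9, 100_000)                # 39 microamps
weeks = weeks_until_voltage_sag(3.9, 100_000, 0.005, 0.1)  # ~7.6 weeks
```

The point of the sketch is the asymmetry: at tens of microamps of leakage, a detectable voltage sag takes weeks to accumulate, while the leakage current itself can be read directly as soon as the clamping source settles.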


I’ve seen each of these three scenarios play out over my career. Today I'm more convinced than ever that collaboration is the catalyst for all great innovations in electronics manufacturing. What about you? Where do you think manufacturers and measurement teams are having the biggest impact? Comment on this post and share your story. I’m willing to bet you know additional scenarios where manufacturers and their measurement partners are changing the world.

What does the collaborative power of Lennon & McCartney have to do with 5G communications? Perhaps not much—but I believe their days together yielded far more genius-level work than did their respective solo efforts.


From my early days at Hewlett-Packard and across five generations of wireless evolution, I have witnessed the highest levels of business success when we collaborated closely with industry leaders early in the development of the next generation of, well, anything.


The history of science and engineering is full of famous teams: Louis and Marie Pasteur, Pierre and Marie Curie, the Wright Brothers, and the aforementioned Bill Hewlett and Dave Packard. Even seemingly self-made individuals who changed the course of our collective river—Newton, Leonardo, or Pascal—collaborated with others to exert their own special influence on humanity.


Why collaborate? Why put yourself into a situation that forces you to find common ground, adopt another’s pace, and perhaps share the glory? First of all, nobody has all of the answers. Wilbur needed Orville, and Bill’s technical insight needed Dave’s business acumen.


More important is the multiplicative benefit of teamwork: call it “synergy” or “gestalt.” With leadership and talent, the whole is always greater than the sum of its parts. In addition, by engaging in early collaborations, you build expertise and insight into a market while creating prototype solutions and thereby earning long-term credibility with top players. 


That is what makes a difference in the very complex world of 5G. This new generation will see myriad changes: digital and RF semiconductors; antenna technology; fiber-optic communications; eNB (or gNB?) design; UE/CPE design; networking and applications software; cloud and virtualization; overall system design and interference management; and even a rapidly growing body of work to augment and virtualize our view of reality.


Nearly 2,400 years ago, Aristotle wrote a single volume that described humankind’s complete understanding of physics. Today one person has no chance of even reading all the books on these topics, much less gaining a full understanding of our collective knowledge. Even if we focus only on the applications of physics to mobile communications, we have breadth beyond what even a small group of people can embrace.


And so we work together! I have been involved in enough of these interactions to develop this simple recipe for success in technical collaboration: 

  1. Get involved early with the industry leaders.
  2. Ensure both parties have something unique to contribute and something important to gain. This means both parties also take on significant risk.
  3. Live up to your end of the bargain and constantly manage and reconcile expectations—nothing sours a relationship like missed commitments and other bad surprises.
  4. Put the right complementary skillsets on the program, combining brilliantly technical thinkers, rigidly pragmatic project managers, and, of course, out-of-the-box creative types.
  5. Manage a healthy balance of strategic versus commercial intent. Not all collaborations result in a commercial windfall, but good collaborations always provide significant benefit to both sides.


At a corporate level, also think about your overall strategy for collaboration. Our approach has been to consider the following: 

  • Stay global. 5G wireless is happening everywhere. There are differences in approach depending on geography, and the leaders may not be in your backyard.
  • Manage a variety of partners. For 5G this means a mix of university and government research; industry and national consortia; and other commercial entities.
  • Embrace a variety of technologies. 5G means everything from millimeter-wave and semiconductors to antennas and massive MIMO all the way to validation of mobile applications. This exposes you to a broader ecosystem and enables a process of developing higher-value solutions.


As a hobbyist musician, I always perform better when surrounded by the best. Thus, I ask you to remember the emphasis on market leaders. Whether we work with startups, 70-year-old industry icons, universities, or government researchers, our focus is on those leading the way to 5G communications. So let’s work together. Whether it is John with Paul or Bill with Dave, harmony is always richer than solo.

Keysight recently launched a new video featuring Rosana, a young girl who loves to solve problems and became an engineer. I’ve lost count of how many times I’ve watched it, and how many times watching it pulled at my heartstrings. A very strange response to a spectrum analyzer video. See for yourself.


The video’s inherent shout-out to women and underrepresented engineers feels so needed at a time when women still make up just 12 percent of engineers working in technology—and the trend has been flat. This is a business issue and a human issue. It also represents a significant economic opportunity. In fact, the recent Decoding Diversity report published by the Dalberg Global Advisory Group noted that “improving ethnic and gender diversity in the U.S. technology workforce represents a massive economic opportunity, one that could create $470-$570Bn in new value for the tech industry, and could add 1.2-1.6% to national GDP.”


The video got me reflecting on when I first decided to become an engineer. As a young girl, despite attending a STEM-focused middle and high school, engineering wasn’t even on my radar. No one in my family had been an engineer; indeed, no one in my family knew what an engineer actually did. Besides, my sights were set on becoming a professional ballet dancer. Fortunately, when I was told at age 16 that I would never be able to dance again due to chronic injuries in my legs, I had a female physics teacher who planted a seed about engineering. My path to healing piqued my curiosity about signaling pathways in the body, and I sought to understand electrical currents. I ended up heading halfway across the country to study electrical engineering at the University of Illinois at Urbana-Champaign. Little did I know how much I would come to enjoy the learning and the challenges posed, and be grateful for the possibilities that awaited me after graduation.


Some girls (and boys) know at a young age that they wish to become engineers, and the challenge is keeping them interested and confident that they can succeed. Others have not yet even considered the possibility, or have perhaps dismissed the suggestion. Having the positive encouragement and role model of a female physics teacher in high school made all the difference for me, showing me the possibility and giving me the needed confidence.


The importance of role models is something I’ve been deeply contemplating as I’ve become mother to two young children and am troubled by the dearth of children’s literature to read and inspire them with smart, positive female characters. Where are their role models? What kind of gender roles am I teaching my son and my daughter about what girls and boys can be, and how they can contribute to society?


This subject of role models is getting a lot of attention this year, especially following the box office success of Hidden Figures, based on the book of the same title. The story shares the untold contributions of three brilliant mathematicians who were critical to the NASA Space Race and were also African-American women. The movie has sparked a movement, bringing mainstream attention to the dramatic underrepresentation of women and racial/ethnic minorities in STEM fields.


Who are the role models you wish to see? Who do you wish you had as role models?

I welled up with tears when I first viewed the Keysight video. We need more of these stories. Rosana will certainly be a role model for my two children and hopefully for other young girls. It is up to us, women and men in engineering, to help one another and the younger generation explore the possibilities. To be positive role models and present what is possible, encouraging more young girls and boys to consider and pursue engineering, so that they know, regardless of gender, they can become whatever they want.

You are sitting down to your meeting and the precious hours you spent crafting your message to this important audience have resulted in a presentation resting safely on your laptop computer. You are dressed for success and your audience hungers for the rich fruit of your perspective.


The meeting starts on time... but then 15 miserable minutes pass while you and three other impromptu IT “experts” struggle to find the right port, adaptor, and monitor setting to connect your laptop to the projector. If you have never experienced this problem, allow me to suggest that you probably do not exist.


Verging on revolution

Are we incompetent operators of IT tools? No. We simply lack a single standard, and I have little hope for convergence. On the other hand, in 5G we are on the verge of something completely revolutionary—a single and globally deployed standard for mobile communications.


Since the earliest days of radio, smart people have formed standards organizations to ensure that Marconi’s magic could be applied in a manner enabling us to communicate from afar. A quick perusal of the internet will yield fascinating tales surrounding the standardization of Morse code, radio channels, distress channels (and even “SOS” itself), and spectrum management.


Many were developed during predecessors of today’s ITU meetings—and those meetings produced documents that read remarkably like those created today. From 2G forward, we had global standards for cellular communications, but we did not have the potential of a single standard until we reached the fourth generation—and that convergence was forced to cower while the WiMAX/LTE duality threatened the peace of the mobile world for a few tense years.


Patiently progressing toward 5G communications

Members of the 3GPP have been working in concert now for more than a year (and ITU for longer than that) to define a fifth-generation standard—the most ambitious development in mobile communications since the advent of analog cellular. Gaining alignment around the globe and across segments of our industry requires difficult technical work hashed out in long meetings, frustrating discussions, email rants, and legal battles. All of this is amongst a demographic of mostly engineers and mathematicians, and our little technical club is not known for its smooth social skills.


I do not mean to belittle standards work. Those of you who are not associated with such bodies may be surprised at their scope and breadth. 3GPP has three technical specification groups, each responsible for several technical working groups that develop the details of the specifications. This means around 1,500 people in 20 committees meeting up to eight times annually generating massive amounts of documentation distilled from tens of thousands of technical submissions.


Allow me to provide some perspective: a colleague who attends 3GPP RAN4 recently sent a copy of “3GPP TR 38.803 v2.0.0.” This is a 200-page, 11 MB feasibility study on the radio frequency and coexistence aspects of the new 5G wireless air interface. This work-in-progress document represents just one part of one part of one part of one part of the gestating standard: during this sub-group’s last meeting no fewer than 37 documents were submitted for consideration for just the topic of radio testability (one of my favorite topics).


The recent 3GPP decision to accelerate the standard comes after a yearlong argument. I will discuss the ramifications in a later post but suffice to say this was driven by a discussion of the tradeoffs relating to enabling new business models, standards “fragmentation,” and the risk of a standard that falls too far short of the 5G communications vision we see so beautifully portrayed in every company’s 5G presentation.


Wrestling with conflicting urges

I continue to be overwhelmed by the technical and commercial demands of creating and deploying these standards. As a consumer, I look forward to the wireless standards being as unwavering as the color of traffic lights and certainly more consistent than interfacing with various display projectors. As a supplier of simulation, design, test, and measurement solutions, I must admit that the past 20 years have created wonderful business opportunities spawned by the fragmentation of standards.


It is thus difficult for me to find a neutral space here. But while it will take a few years, I believe it will happen… You tell me—will we truly end up with a single global standard?