Insights Unlocked

Data analytics is an emerging technology that is getting a lot of attention in the press these days. As with many exciting technologies, real opportunity is surrounded by a lot of hype. While it is sometimes difficult to separate the two, I am among those who believe data analytics can make your business run better. As with any technology development, though, a positive outcome starts with a clear definition of the question you are trying to answer.

 

We can start with this one: What tangible business results can come from data analytics? For most technology-intensive companies, one key driver is getting the right new product to market quickly and efficiently. The benefits are faster time-to-market, reduced risk, and lower costs. In addition, top-line results will improve when data analytics is used to optimize product plans and customer engagement.

 

Deloitte posted an article that suggests many companies are finding value in data analytics but, because these are the early days, there’s more insight yet to come. One early conclusion: the key benefit of analytics is “better decision-making based on data” (Figure 1).

Figure 1. Summary of findings from Deloitte’s Analytics Advantage Survey (p. 4, The Analytics Advantage: We’re just getting started)

 

Drowning in data, grasping for scalability

Companies that create electronic products are part of the overall trend toward data analytics. In a recent report, Frost & Sullivan sees growing demand for big data analytics applied in the test and measurement market. Keysight is part of this ecosystem, and our design and test solutions generate critical data from measurements of product performance.

 

We see many of our customers swimming in this data, and some are drowning in it. There are so many data sources that it is easy to accumulate a bunch of disjoint files that are poorly organized and difficult to manage.

 

This is typical, and it is why most large data analytics projects currently involve way more investment in data collection than in actual analysis. It is estimated that 80% of the effort goes into data collection and “data wrangling.” To me, “data wrangling” is the perfect phrase because it conjures up images of a cowboy tossing a rope around a spreadsheet in hopes of subduing it.

 

Many electronics firms have created homegrown solutions, ranging from simple collections of Excel files to complex software systems coded in C. Spreadsheet wrangling can work well for small, localized datasets—but it won’t scale up because data is isolated among individual users or test stations, perhaps spread across multiple sites. Revision control may be weak, and it can be difficult to ensure that you have the most recent data. Worse, it turns into lost productivity as key technical staff spend time fiddling with all those spreadsheets. Over time, this translates into lost velocity toward finishing and shipping the product.
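To make the scaling problem concrete, here is a minimal consolidation sketch in Python with pandas. The file layout and column names (serial_number, test_name, value, timestamp) are hypothetical assumptions for illustration only; adjust them to whatever your test stations actually produce.

# Minimal sketch: consolidate per-station measurement exports into one table.
# Assumes hypothetical CSV files with serial_number, test_name, value, and
# timestamp columns sitting in one results directory.
from pathlib import Path
import pandas as pd

def consolidate(results_dir: str) -> pd.DataFrame:
    frames = []
    for csv_path in sorted(Path(results_dir).glob("*.csv")):
        df = pd.read_csv(csv_path)
        # Record provenance so each result stays traceable to its source file.
        df["source_file"] = csv_path.name
        frames.append(df)
    combined = pd.concat(frames, ignore_index=True)
    # Keep only the most recent measurement per unit and test; this is exactly
    # where weak revision control bites when files are copied around.
    combined["timestamp"] = pd.to_datetime(combined["timestamp"])
    latest = (combined.sort_values("timestamp")
                      .drop_duplicates(["serial_number", "test_name"], keep="last"))
    return latest

if __name__ == "__main__":
    data = consolidate("./test_results")
    print(data.groupby("test_name")["value"].describe())

Scripts like this work until two stations disagree about column names or someone edits a file by hand, which is exactly the scalability wall described above.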

 

One alternative is reaching out to the IT department to create a more robust system. The resulting solution will be more scalable and supportable, but it also has internal costs. For one, changes fall to the IT team, robbing resources from other priorities. This is workable as long as all ongoing IT projects are fully supported and staffed.

 

Taking baby steps toward better data for better results

The actual analytics required can often be very basic. Sure, we’d like to turn on some machine-learning application that derives brilliant insight from our manufacturing test line and then feeds it back into the next design revision.

 

More likely, we are just trying to look at product performance in a consistent way so we can tell if the design is performing correctly. This is especially true in the prototype phase, when there are fewer devices to evaluate and the actual specification is still in flux. Later, in the manufacturing phase, we usually have plenty of data but it may still be siloed at the various production line stations or stored at another site, perhaps at a contract manufacturer.

 

Getting better at clarifying the target problem

As you apply the ideas discussed above, you will get better at defining the business problem you want to solve using in-hand design and test data. It may be improved time-to-market, lower R&D cost, better production yield, or something more specific to your operation. The next crucial step is creating or acquiring scalable tools that enable you to get your data under control.

 

My questions for you: Do you see this challenge in your business? What sources of data feel a bit disorganized or maybe completely out of control? Which tools have been most useful? We will be exploring these ideas in future blog posts, so stay tuned for additional insights.

There was no celebration: in September, I attended my thirty-first public technical symposium in my role as Keysight’s 5G program manager. From Tampa to Tel Aviv to Taipei, whether organized by the IEEE, NTIA, GSMA, or IMT-2020, such events have taught me a few things about myself and many more about how our 5G technology community manages social gatherings.

 

Please allow me a quick analysis. I sort these events into three buckets: “technically rich,” “overtly commercial,” and “government promotional.” Here are my snapshots of each type.

 

Technically rich: By turns exhausting and invigorating

I like technical, but these are simply not enjoyable when they drift into academic opacity. Still, many provide opportunities for rich dialogue with others in the industry. Three events come to mind: IEEE MTT/IMS; the IWPC meetings; and the recent International Symposium on Advanced Radio Technologies (ISART) convocation in Colorado. I exit these with a rejuvenated curiosity and a refreshed perspective about the amazing technical brains powering the communications business.

 

ISART was an especially impressive mix of policy makers, mobile communications experts, and satellite industry representatives. I learned a great deal about millimeter-wave (the primary topic) and gained insight into how some institutions work. I also picked up a few tidbits on spectrum policy:

 

  • News to me, part 1: The ITU is part of the United Nations, and ITU spectrum decisions are international treaties. Among UN organizations, the ITU is unique in allowing the participation of commercial entities.
  • News to me, part 2: The FCC, which gets a lot of press, and the NTIA, which gets very little, are sister organizations. The former manages (among other things) spectrum for commercial use; the latter manages spectrum for federal use.
  • The real story: I had assumed that the spectrum conflict between mobile and satellite was strictly technical, centered on the risk of interference. Not so: the most recent Upper Microwave Flexible Use Service (UMFUS) report and order (R&O) from July 2016 is the source of discord because the FCC wants to reduce the risk of interference by placing tight restrictions on the location of large ground-based satellite gateways (i.e., terrestrial links to satellite constellations). Jennifer Manner of EchoStar suggested these rules are not even practical.

 

Overtly commercial: Have you read our press release?

GSMA’s Mobile World Congress in Barcelona is the most prominent example. While I have had excellent discussions with some of our key customers at this event, I sheepishly admit that MWC has additional appeal because it is a great excuse to explore and enjoy one of my favorite cities. On the other side of the Atlantic, the GSMA teamed up with CTIA for the first time to create Mobile World Congress Americas 2017. Although I would love to write about the event, I was not there; rather, I was in Taiwan, attending the third kind of gathering...

 

Government promo shows: Not purely self-promotion

The 4th Taipei 5G Summit was a two-day event organized by a group within the Taiwan Ministry of Economic Affairs. It was coupled with the 21st World Conference on Information Technology. I had the honor of speaking at this event and focused my talk on getting the audience to think about a sampling of measurement and validation challenges in 5G New Radio (NR).

 

Taiwan is an interesting case for 5G communications in that its indigenous mobile operators will be very cautious about investing in 5G—a reluctance driven particularly by the failure of Taiwan’s WiMAX business model. However, Taiwan-based multinational technology giants like Hon Hai, TSMC, WNC, Quanta, and Pegatron will all take full advantage of the global investments in 5G technologies. Based on what I saw from both National Taiwan University and National Chiao Tung University, it is clear that academia is also fully engaged in a very impressive manner.

 

Among the many highlights from the event was a presentation by Tareq Amin of Reliance Jio. Mr. Amin deftly detailed how Jio completely changed its technology investment paradigm to implement a financially stable LTE network in India, a country with an ARPU of about $2 (vs. about $60 in the USA). Ordinarily, I resent sitting through presentations that are thinly veiled sales pitches (the Taiwan Summit had a few of those). Mr. Amin did indeed talk about Jio’s success: it achieved #1 LTE penetration in the world in seven months, processing some 7 petabytes per month. However, his real message was about innovation that follows from a drastic change in perspective when confronted with unprecedented boundary conditions. It was the most inspiring talk I have heard perhaps all year.

 

What have been your experiences?

Here I expose myself to comments from those of you who have had to listen to my talks. What inspires you at these events? Will 5G be successful in your environment? What will it take?

When Google employee James Damore sent out an internal memo questioning his company’s diversity policies, he likely didn’t expect that it would go viral. It did. In its wake, a firestorm of discussion emerged in Silicon Valley and beyond—one that still rages today. Whether you agree with James’ assessment of gender bias and diversity in the workplace or not, one thing we can all agree on is that it brought renewed focus to a timely topic, one that should be just as important to men as it is to women.

It’s a topic I’ve come to know well over the span of my 35-year career in the tech industry, and one that I recently had a chance to reflect on during an interview I gave to the Society of Women Engineers (SWE). Over those years, my personal journey has taken me from a man unaware of his own unconscious bias toward women to one who now serves as their full-on diversity partner. It’s a journey I’m hoping more men will take.

 

Becoming an advocate for women

For me, that journey began back in the ’90s. I was a young manager at the time, fairly new to management in fact. I had been trying to promote women in my organization and, because of those efforts, received an invitation to the HP Technical Women’s Conference in Silicon Valley. I was one of only two men in attendance.

During the event, I sat in a room with what seemed like 5,000 women and listened to speakers talk about unconscious bias. I watched in amazement as the women in the room universally agreed that it was a problem. I had never thought about it before—how my own unconscious bias toward women could actually be contributing to the problem. That idea and many of the other issues the speakers raised weren’t even on my radar. It was a wake-up call.

That realization is just as relevant today as it was back in the ’90s. Consider a 2012 study (Moss-Racusin et al., 2012) designed to examine how applicants were evaluated for a lab manager position. Faculty were presented identical applications for the position, the only difference being that candidates were given obviously male or female names. Even though the qualifications were identical, both male and female faculty routinely rated the female candidates lower than their male counterparts. The female applicants were seen as less hirable and were offered lower salaries. The study revealed what many of us already know—there’s a clear unconscious bias against women in the workplace.

What I learned about unconscious bias from that conference I attended all those years ago stayed with me throughout my career. It helped transform me into an active and full-on diversity partner with women. What’s more, I make it my goal to ensure other managers and emerging leaders can attend similar conferences in the hopes that they too will have the same experience and transformation. 

 

Turning awareness into action

The experience alone is not enough. Being aware of unconscious bias is certainly the first step in helping to overcome it, but for that awareness to make any real difference it has to lead to tangible action. In other words, men have to become allies with and advocates for women, whether in engineering or any other profession.

What can you do to be an ally for women in the workplace?

 

1.  Uncover your hidden biases about gender diversity in the workplace.

You don’t have to attend a conference like I did to find out if you have an unconscious bias toward women. All you need to do is take a quick and easy online Implicit Association Test (IAT), such as the Gender-Career or Gender-Science IAT.

 

2. Watch a video on techniques for ending sexism in the workplace.

In just 10 minutes, you can learn about five easy things you can do today to promote diversity in the workplace. Watch “5 Ways Men Can Help End Sexism”.

 

3. Start your own personal diversity action plan.

A diversity action plan is a written plan of the actions you personally intend to take to promote gender equity and diversity in the workplace. My plan involves a number of different things. As a manager, I am vigilant in assigning new projects or tasks based on skill, but also with a mindset toward providing growth opportunities for women engineers. I look for ways to help them grow and gain higher visibility.

 

I also actively go through my data on employee pay equity once a year. I examine job category, employee gender, work location, and pay to make sure that all employees are paid equitably and that we have a good female-male split. If adjustments need to be made, we determine how best to do that. And I try to make sure that all engineers, women and men, are set up for success with good work/life balance.

One of the great tools we have to promote work/life balance for our employees at Keysight is the Hidden Valley Elementary K-3 school, built on Keysight’s land in Santa Rosa, California. Parents at the Santa Rosa site have the flexibility to join their kids for lunch, participate in school activities, or even help out in the classroom during the day. It’s been key in ensuring our engineers feel that they can have both a family and career.

 

Another critical part of my personal action plan is serving as a member of the Keysight - Society of Women Engineers Enterprise Program (KSWEEP). KSWEEP is an organization within Keysight designed to support the efforts of the Society of Women Engineers (SWE). We do this by sponsoring employees to attend SWE conferences and providing communities, networking, professional development and outreach opportunities throughout the year. We also sponsor local activities and recently helped SWE expand its global footprint into Penang, Malaysia.

 

And, because I feel strongly that it’s the responsibility of male senior and executive leaders to help the next generation of male engineers learn about gender equality and diversity issues, I’ve made it my goal to find ways to actively encourage those leaders to take up that role and become part of the solution. One solution I found was to bring one of my top male engineering managers to last year’s SWE conference.

 

Those actions are all part of my personal action plan. Your plan might look quite different. You could choose to:

  • Talk with your female colleagues and really listen to what they share about their gender-related experiences in the workplace. Seek out opportunities for those conversations.
  • Ensure female colleagues have equal time to speak at meetings and be sure to share information equally with female and male colleagues. Minimize mansplaining.
  • Make sure female colleagues get included and invited to informal work gatherings.
  • Demonstrate your commitment to gender equity to your colleagues. Take corrective action when you notice gender inequities and bias. Your silence otherwise makes it acceptable.
  • Nominate women for awards, honors and work positions, when and where applicable.
  • Start a work committee designed to encourage men, too, to act as allies for gender equity in the workplace. If one already exists, volunteer to help.
  • Recognize that women often have disproportionate responsibilities for child and elderly parent care. Turn that awareness into action by supporting work/life balance and doing things like planning meetings with consideration for your team’s personal and family schedules.

Find your voice

Whatever you choose to include in your personal action plan, the point is to do something. Don’t be part of the problem; be part of the solution. It’s a journey we can all take together, and one that promises to positively transform not only the workplace, but society in general.

 

You can start on your own personal journey by sharing this blog post with your colleagues. Use it to open up a discussion on gender bias and diversity. As I did all those years ago, you may just find yourself really seeing the issue for the first time, and deciding to become an active diversity partner for women in your workplace.

Some businesses view instrument calibration as a best practice; others view it as a compliance issue driven by industry standards, contractual agreements, or certification requirements. In all cases, there’s a decision to be made when it comes to instrument calibration: does it make more sense to do it yourself or to outsource it? I’ve discussed the pros and cons with manufacturing executives around the world, and while there’s a case to be made for both sides, I see six strong arguments for outsourcing equipment calibration services.

 

1. Faster turnaround

While an internal calibration lab can sometimes move faster than an outsource team, I find that outsourcing consistently gets faster results, especially when you do a true apples-to-apples comparison. For example, one of our large aerospace/defense customers had an in-house calibration lab that had a two-week service-level agreement (SLA) with their internal customers for all calibration services. The internal calibration lab appeared to meet their SLA consistently. But on deeper investigation, it was discovered that only the simpler tools, such as torque wrenches, were being calibrated within the two-week window. More sophisticated instruments that measured RF and microwave systems were actually taking four to six weeks for lab calibration. By outsourcing, the customer was able to reduce their SLA to just one week for all instrument calibration services, even for the most complex instruments.

 

2. Lower OPEX

 In the example above, when a complex instrument required four to six weeks for calibration, the internal calibration lab was forced to use spares and rentals to meet the two-week window specified in their SLA. Spares and rentals cost them millions of dollars per year. By outsourcing, they not only reduced their turn-around time by 50 percent, but also reduced their annual operating expenses by about 20 percent by completely eliminating the cost of rentals, storage space for spares, and the man-hours required to procure, manage, and maintain the excess inventory. I’ve seen the same scenario play out for other customers as well. In my experience, an OPEX savings of 10 to 25 percent is common when customers shift from in-house to outsourced instrument calibration services and compare all costs, not just specific calibration costs.

 

3. Lower CAPEX

 Some companies excel at in-house calibration, but for most, it’s a departure from their core business. Dedicated outsource vendors, on the other hand, provide calibration services as a core competency, and make a substantial capital investment to deliver those services. Whereas budget limitations might force an internal calibration lab to use older, slower hardware at the edge of its lifecycle, a reputable calibration vendor will have the latest hardware and software and dedicated teams that know how to use it. This calibration-as-a-service business model removes CAPEX impacts as well as lifecycle concerns for manufacturers, and provides access to state-of-the-art calibration capabilities at all times.

 

4. Future-proofed test

 Competitive manufacturers should be free to chase new opportunities without worrying about lab calibration capabilities or having to spend capital to upgrade a calibration lab as the business evolves. For example, many wireless manufacturers are now making the leap from 4G to 5G wireless, which means a lot of next-generation test equipment is being introduced into design and manufacturing environments. Older calibration hardware and software isn’t designed for the higher frequency ranges of 5G, but in an outsourcing model, that’s not the manufacturer’s problem. The calibration vendor is on the hook to stay ahead of 5G test requirements. It’s a CAPEX benefit for manufacturers since they don’t have to purchase and learn new calibration equipment. More importantly, it’s a competitive advantage since they can invest their energy and resources in being first to market rather than trying to be calibration experts.

 

5. Improved audit compliance

 Since most manufacturers now have global supply chains, it’s common for test equipment to be calibrated in one location but used in another. This sometimes makes it difficult to find records and documentation for ISO audits and other record-keeping functions. A reputable outsource partner will maintain records in a centralized database as part of the SLA between the outsource calibration lab and the customer. The records are available 24/7 regardless of where the instrument calibration services are performed, so compliance can always be proven and audits are less of a headache.

 

6. Scalable resourcing

 With natural attrition of an aging workforce, it can be hard to keep qualified calibration engineers on staff. And in many companies, engineers who retire are not replaced. Even if a new job requisition is opened, it’s nearly impossible to replace the specialized skills needed in a calibration lab. By comparison, outsource vendors have a pool of dedicated resources who are trained on the latest calibration equipment. Manufacturers have access to a scalable resource pool without having to staff up or down based on business conditions.

 

I’ve had many conversations with calibration lab managers and manufacturing executives over the years, and I know that outsourcing is not for everyone. If your product line is stable and not evolving, if your SLAs are truly being met, or if CAPEX is not an issue, then an internal calibration lab is probably fine. But if emerging technologies or new business opportunities are creating a need for capital investments to upgrade your internal calibration lab, you should do a financial audit to determine the true cost of outsourcing versus keeping it in house. Make sure you take into account not only hardware, floor space, and SLAs but also the ongoing expertise required to write automated calibration procedures to get the turnaround you need. If you do decide to outsource, it pays to do your homework on the quality of the calibrations you’ll be receiving. Not all calibration partners are created equal. Choose one that knows its stuff, and good luck.

 

Eric Taylor is Vice President & General Manager of Managed Services Division – West. Read his bio at https://community.keysight.com/people/et3333.

5G has been picking up speed faster than a downhill racer—and may seem similarly on the hairy edge of control. From Mobile World Congress (MWC) in Barcelona to the latest 3GPP meetings, the buzz is growing louder around topics ranging from fixed wireless to “what’s our next business model?” With apologies for going dark since late March, the following Top Five Hot Topics update is aimed at keeping you abreast of the latest.

 

1: Fixed-wireless systems are the first step to millimeter-wave 5G

Even with NEMs and device makers anxious to be part of the 2018 winter showcase in South Korea, the first highly visible commercialization will be Verizon’s fixed-wireless system. For two years, VZW has been stalwart in its intent to commercialize in 2017 using its “pre-5G” 5G-TF specification. However, this spec is unique and has some fundamental differences to 3GPP’s New Radio (NR), and orphaned technology is a very real risk. On the plus side, VZW is gaining a first-mover advantage by learning how to enable commercial access without introducing the complexity of mobility.

 

In response, AT&T has also stated it will deploy a 5G mmWave fixed service, one that will be based on NR. This is why AT&T pushed very hard for the acceleration of the non-standalone (NSA) version of NR.

 

2: 3GPP accelerating NSA enables NR to happen before R15

“Before R15” means late 2017. With NSA, the 5G radio access network (RAN) will be controlled by the LTE core and hence LTE can be used to set up calls, manage initial access, and even provide fallback capability if the 5G RAN link fails.

 

The combined effect of Verizon’s decisions and Korea Telecom’s push to commercialize 5G in 2019 drove the industry to ultimately agree to the acceleration. The resulting commercial play will occur somewhat sooner but, in some ways, will be more risky than originally planned.

 

3: Greenfield spectrum between 2.8 and 5.0 GHz outside the US

5G wireless technology is not just millimeter-wave. Outside the US there is open spectrum between 2.8 and 5.0 GHz that is getting serious attention from major operators in China, Japan, South Korea and several European countries.

 

Because this is ideal territory for 5G Massive MIMO, much of this will occur in TDD spectrum. It is not clear how, when, or whether operators will reallocate 3G and 4G spectrum to NR—but the territory between 400 MHz and 3 GHz may undergo a change in the middle or late 2020s based on the relative success of NR in terms of its spectral and energy efficiency.

 

4: Rapid virtualization of networks will drive flexibility

In April 2016, AT&T announced that its evolved packet core (EPC) was all virtual. Demos at MWC suggested that virtualizing the RAN could reach all the way down to the upper MAC layer. This means network equipment becomes computers, wires (or fiber), and remote radio heads—likely with enough on-board ASIC power to manage the lower MAC and the physical layer (Layer 1/PHY).

 

This can enable software-defined networks (SDN), self-organizing networks (SONs) and, further, network slicing. This last item had many compelling demos at MWC and enables operators to create, perhaps in real-time, network slices manifest as software entities. These have different KPIs, depending on what each slice will do for its associated client and business model. For example, a high-reliability slice could take network resources from other slices in order to meet service level agreement (SLA) criteria. While this will prompt serious looks at new business models, it appears to be a case of technology moving faster than policy because one could suggest this is a step towards violation of net neutrality (in those places where it still exists...).
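To make the slicing idea concrete, here is a toy Python sketch (my own illustration using assumed numbers, not any operator's or vendor's implementation) of how a scheduler might divide a fixed pool of capacity so that a high-reliability slice meets its SLA minimum before lower-priority slices get their share:

# Toy illustration of slice-aware resource allocation with hypothetical slices.
# Higher-priority slices get their SLA minimums first; leftovers are shared.
from dataclasses import dataclass

@dataclass
class Slice:
    name: str
    sla_min_mbps: float   # guaranteed minimum needed to satisfy the slice's SLA
    priority: int         # lower number = served first

def allocate(slices, total_mbps):
    allocation = {s.name: 0.0 for s in slices}
    remaining = total_mbps
    # Pass 1: satisfy SLA minimums in priority order. This is how a
    # high-reliability slice can pull resources away from the others.
    for s in sorted(slices, key=lambda s: s.priority):
        grant = min(s.sla_min_mbps, remaining)
        allocation[s.name] = grant
        remaining -= grant
    # Pass 2: spread whatever is left evenly across all slices (best effort).
    if remaining > 0 and slices:
        share = remaining / len(slices)
        for s in slices:
            allocation[s.name] += share
    return allocation

print(allocate([Slice("critical-IoT", 50, 1),
                Slice("mobile-broadband", 200, 2),
                Slice("best-effort", 0, 3)], 300.0))

In a real network the resources are far richer than a single throughput number, but the priority-first pass captures why one slice can squeeze the others to meet its SLA.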

 

5: Business cases for 5G wireless investment are still under scrutiny

There is still much skepticism about where the money will come from. Debates continue to rage about why any operator would significantly invest simply to develop better capability for a client base with declining average revenue per user (ARPU). Yet, new concepts abound: 

  • Third party pays: This is a big one that could expand mobile advertising (e.g., Google). One can envision combining this with entertainment schemes that could function only on a fifth-generation network (think “PokemonGo! 2020”).
  • Automotive/connected car: There are numerous ideas here, and many are associated with third party pays—especially those related to infotainment and entertainment. Most of the hype is related to the facilitation of real-time navigation and automated driving; however, I believe this will happen more slowly than ad- or entertainment-related applications due to policy, technology, and sociological issues.
  • Industrial or municipal: IoT can save billions for municipalities and commercial entities that need to collect data remotely on a real-time basis. This could enable machine learning to manage things even as mundane as parking and garbage management, or open the door to serious data analytics. 

Those are my Top Five Hot Topics. What other trends or developments are you seeing?

 

In my first post, I described how counterfeits can be found anywhere and that Keysight can no longer ignore this fact. Our customers know this too, which is why they pressure their suppliers to put systems in place to avoid the introduction of any suspect material into their supply chains. Keysight solved this customer challenge by working with some of our most demanding customers—those in the aerospace/defense industry. By working together in a transparent manner, we developed a government-grade counterfeit avoidance and response system that meets the US Department of Defense’s exacting DFARS (Defense Federal Acquisition Regulation Supplement) requirements. Here are seven practices that came out of that effort:

 

  1. Ensure traceability. It’s imperative to be able to trace a product back as far as possible to the raw material. Lack of traceability is a huge risk indicator. Electronic components typically use date or lot codes, and if available, certificates of conformance.

 

  2. Buy from authorized channels. This may seem like a no-brainer, but some deals appear too good to pass up, and in a line-down situation you may be tempted to consider the grey market. Buying direct from the original component manufacturer or from one of their franchised distributors is the best upfront mechanism to vet suspect material. Once you move outside of these spaces, you enter the open market, which is fraught with risk and bad actors looking to make a quick buck. The best thing you can do is “know your supplier.”

 

  3. Put a response system in place. What data must be collected, and who needs to be informed if a suspected counterfeit problem pops up? This response system must address internal concerns (failure in a Keysight instrument; evidence of rogue procurement from a bad actor) as well as industry alerts (a GIDEP alert notifying industry that counterfeits have been found). Ownership of the issue from beginning to end is critical.

 

  4. Qualify a few reputable independent distributors. In the electronics world, independent distributors play a unique role in helping to locate and test for legitimate material. Strong independents differentiate themselves from pure brokers through membership in electronics industry organizations such as ERAI. They are typically set up with labs to perform robust visual and often electrical inspections. Require your purchasing arm to use only these qualified independents when facing a shortage. Keysight actually allowed one of our most demanding aerospace/defense customers to audit our approved independents.

 

  5. Design for supply chain. This should go without saying, but do not design a product with components that are nearing the end of their lifecycle. This leads to large lifetime buys and exposure to the grey market. Along similar lines, consider buffering inventory for strategic sole-sourced material.

 

  6. Flow requirements down through your supply chain. You can add multiple layers of security by flowing counterfeit avoidance and response requirements down through your supply chain, whether to a contract manufacturer, semiconductor manufacturer, vineyard, or stockyard. Don’t assume anything. Trust but verify.

 

  7. Consider a tabletop exercise and be willing to go through a customer audit. Keysight performed an extensive internal walkthrough process and then volunteered to go through a three-day customer audit of our counterfeit avoidance and response system. This was followed by visits to two of our independent distributors. This process demonstrated to our customer that we were serious about keeping the supply chain clean and serious about caring for our customers.

 

Although these practices were developed for the demanding Aerospace Defense industry, they can apply to a range of industries, from food to wine to pharmaceuticals to clothing and many more. Most industries deal with counterfeits in one way or another, and every competitive company on the planet takes its reputation and its relationship with customers seriously. With apologies to Groucho, that’s something you can’t afford to fake.

Our small town recently held an electronics recycling day to help families get rid of unwanted printers, laptops, TVs, and other devices. Just bring them to the local school and the devices get recycled for free. Great idea, right? Clean out your garage and promote sustainability at the same time!

 

“Reduce and re-use” is the prevailing mantra today, whether for used electronics or grocery bags. However, there’s a dark side to these electronics recycling events, with unintended consequences that affect manufacturers and consumers around the globe. Today’s news is full of stories about used and counterfeit components finding their way into supposedly new, state-of-the-art products. In fact, counterfeit goods are everywhere these days, accounting for nearly half a trillion dollars in sales each year, or nearly 2.5 percent of all global imports. Electronic components are the single biggest piece of the counterfeit pie, generating billions of dollars a year for ethically challenged parts suppliers. It’s a pervasive problem with serious implications for original equipment manufacturers and their customers. Here are the top four impacts of counterfeits.

 

End-user injuries

If you buy a Rolex watch on a street corner for $25, nobody gets hurt. But if you get an incredible deal on a hoverboard, don’t give it to your kid and definitely don’t try bringing it on an airplane. Last year, over 16,000 counterfeit hoverboards were seized by customs officials in Chicago after a well-documented series of fires injured people using the two-wheeled skateboards. In other news, a report by the U.S. Senate Armed Services Committee was profiled in Forbes a few years ago, citing the presence of counterfeit electronic components in military and commercial aircraft. As the author of the article states, how comfortable would you be on your next commercial flight if you suspected that a key onboard system might contain counterfeit parts? Thankfully, airplanes have multiple layers of redundancy, but that doesn’t change the fact that real people and real lives are affected by counterfeit components in our increasingly electronics-centric world.

 

Critical mission failure

No matter what you make as a manufacturer, mission failure is always bad. However, in the tech-laden aerospace/defense industry, mission failure can be catastrophic. Military and space programs rely heavily on the authenticity and reliability of their electronics. Several years ago, a Bloomberg study reported that military jets bound for Afghanistan were outfitted with counterfeit memory chips, putting missions and lives at risk in a hot war zone. A separate report by WND concluded that high-altitude thermal missile sights delivered to the U.S. Army were compromised by counterfeit electronic components, and that the problem is widespread across U.S. defense systems. It’s one of the reasons the Department of Defense (DoD) implemented the Defense Federal Acquisition Regulation Supplement (DFARS) in March 2014 to detect and avoid counterfeit electronic parts. The DFARS provides specifications for robust counterfeit avoidance and response systems, helping Keysight and U.S. government prime contractors meet the most rigorous standards of counterfeit detection, traceability, and response on the products we deliver. The DFARS continues to evolve, and the latest update by the DoD, in August 2016, introduced even stricter requirements for Keysight and prime contractors. Keysight has invested significant time, energy, and resources to demonstrate compliance with this regulation, including a full-blown aerospace/defense customer assessment in late 2015.

 

Decreased product reliability

A U.S. Senate report on counterfeit electronic components in defense supply chains contains an eye-opening description of the Shantou, China, counterfeiting district, where used electronic parts are extracted from old devices, washed in a river, dried on a riverbank, and sorted on a sidewalk. According to the report, the parts are then collected in “plastic bins filled with expensive brand-name components harvested from scrap printed circuit boards ready for processing.” Compare that to clean-room operations at reputable semiconductor facilities, where the air is one thousand times cleaner than the air in a hospital operating room. It’s the only way to ensure that components and boards are entirely free of dust and contaminants so they function exactly as intended, day after day.

 

Increased OPEX

Releasing defective products into the marketplace can be costly in a variety of ways. At the very least, handling a failed product increases operating expense through Return Merchandise Authorization (RMA) paperwork and processes. Depending on the product, you may incur the cost of diagnosing the problem and modifying manufacturing processes. In extreme cases, shipments might be delayed while you take your design back to the drawing board to determine the root cause of a defect. If the defect is ultimately traced back to a counterfeit component, you’ll have incurred those costs, and taken hits to your profit and reputation, for no reason. This is why Keysight has implemented and maintains a stringent counterfeit avoidance and response system, which we stress-test internally and with our subcontractors in an effort to continue to provide best-in-class instruments.

 

Our goal at Keysight is to ensure our customers receive authentic, genuine Keysight instruments. It’s not a trivial task in a global company where supply chains stretch to every corner of the world. But we have processes in place to block counterfeit devices from getting into our ecosystem, and to ensure the authenticity of every electronic component we put in our test systems. Illegitimate materials and fraud are real challenges across the industry, and as the counterfeiters get better, we will continue to work even harder to ensure that we stay one step ahead of the bad guys. In my next post, I’ll share some of the ways we’re doing that. Our best practices for addressing counterfeits have made a difference for us and might be useful for your organization as well.

Bill Hewlett used to say that measurement is the key to changing behavior. For example, if a driver wants to consume less gasoline in a car, measuring how driving habits affect gas mileage is the way to go. If a homeowner wants to use less electricity, measuring consumption over the course of a typical 24-hour day would reveal where usage is highest and thus where savings can be made. The “aha” moment—whether for consumers or product engineers—almost always comes from measurement. That’s what ends up driving a big change or breakthrough. Measurement is what brings insight, improvement, and innovation. It’s especially true in electronics because human beings do not have the senses to “measure” variations in electrical phenomena. Instrumentation is the only way to do it.

 

I’ve seen proof of the connection between measurement and innovation in the high tech world many times over the years. And I know from experience that the bigger the innovation, the stronger the connection. Here are three scenarios where I think it’s vital for manufacturers to “look beyond the catalog” when working with a measurement vendor. In each of these cases, manufacturers found that collaborating with a measurement supplier as a true partner was the key to bringing innovations to market.

 

Scenario 1: The technology you’re bringing to market goes far beyond existing technologies.

Incremental advances in technology can usually leverage existing test instruments. But big, disruptive advances in technology often require a similar leap forward in test and measurement equipment and processes. One good example is terabit transmission in long-haul optical links. This technology is in high demand and evolving quickly due to the explosive increase in internet traffic. The technological innovation required to achieve terabit transmission speeds was made possible only through the cooperation between measurement companies and manufacturers: Optical modulation technologies could be developed, characterized, and finally produced only by creating a new category of measurement instruments—today’s optical modulation analysis tools. 

 

Scenario 2: Your innovation crosses domains from one market to another.

There are many examples of innovations that were initially developed for one market or industry and then found their way into other markets. Tech-laden innovations from the electronics manufacturing world can make the jump, but quite often, the cross-over requires input from a measurement vendor to address deployment challenges and measurement problems. For example:  

  • Connected car technology. Most cars today have a radio unit that connects the car through the wireless network to the internet, making these cars essentially fast-moving cell phones. Many car makers have experience designing mechanical and electro-mechanical systems, and can benefit from the deep insight a measurement company brings to the game with rich experience in testing wireless systems. Expertise across both domains is table stakes when developing today’s vehicles.
  • Phased-array antenna technology. Widely deployed in aerospace/defense applications, this technology is now crossing over to the automotive world with breakthroughs in adaptive cruise control, blind-spot detection, and collision avoidance/mitigation systems. The challenge in automotive applications is related to consumer expectations—the technology needs to be small, affordable, and utterly reliable to be viable in consumer applications. While phased-array technology might be new to automotive engineers, it’s well known to measurement vendors who have experience in aerospace/defense. Automotive manufacturers can tap that expertise to more quickly integrate the technology.

 

A Keysight customer recently made the case for collaboration between manufacturers and test vendors when crossing domains. “With 5G…we’re tapping into an area of the radio spectrum that has been a big unknown for the mobile industry,” said Woojune Kim, vice president of Next-Generation Strategy, Samsung Electronics, in a recent press release. “Being able to work closely with Keysight and leveraging their expertise with network simulation, RF, and millimeter wave technologies is an advantage for our product validation efforts.”

 

Scenario 3: You need to transform a manufacturing process to meet business goals.

One of our customers, a large Asian manufacturer of lithium-ion battery cells, had a business challenge, a process challenge, and ultimately a measurement challenge for Keysight. The company’s manufacturing process typically yielded a number of battery cells that got flagged for retesting in post-production test. Retesting is expensive: The cells are charged, measurements are taken over a period of a few weeks, and finally the level of discharge is calculated. The expense comes not only in lost time but also in warehousing: cells need to be stored somewhere while they self-discharge, adding inventory expense to the manufacturing process. The customer had a measurement request for Keysight: Could we find a way to reduce their retest cycle from three or four weeks to one hour? Behind it were the twin business goals of reducing inventory and reducing production-related expenses. Our answer was to invent a new product—a self-discharge analyzer—that uses advanced voltage and current matching techniques to measure a battery’s discharge rate in less than an hour. This solution eliminated both the self-discharge wait time and the inventory storage space for retesting batteries. It’s one of my favorite examples of how collaboration between a measurement vendor and a manufacturer can trigger significant innovations—at the business level, the process level, and the product level.
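To see why the conventional retest flow takes weeks, consider a rough back-of-the-envelope sketch. The numbers below (cell capacity, self-discharge rate, open-circuit-voltage slope) are my own illustrative assumptions, not the customer's data or the analyzer's specifications:

# Back-of-the-envelope sketch with assumed, illustrative numbers showing why
# waiting for open-circuit-voltage (OCV) droop takes weeks while the underlying
# self-discharge current itself is a directly measurable quantity.
CAPACITY_AH = 3.0                  # assumed cell capacity, in amp-hours
SELF_DISCHARGE_PER_MONTH = 0.03    # assumed 3% capacity loss per month
DV_PER_DAH = 0.1                   # assumed OCV slope: ~0.1 V per Ah near mid charge

charge_lost_ah_per_week = CAPACITY_AH * SELF_DISCHARGE_PER_MONTH / 4.3
self_discharge_current_ua = charge_lost_ah_per_week / (7 * 24) * 1e6   # microamps
ocv_droop_mv_per_week = charge_lost_ah_per_week * DV_PER_DAH * 1e3     # millivolts

print(f"Self-discharge current: ~{self_discharge_current_ua:.0f} uA")
print(f"OCV droop: ~{ocv_droop_mv_per_week:.1f} mV per week")
# With only a few millivolts of droop per week, resolving a marginal cell from
# its OCV requires weeks of storage; the microamp-level current is a much more
# direct handle on the same physics.

With only a couple of millivolts of voltage droop per week, screening cells by watching their open-circuit voltage requires weeks of storage, whereas measuring the microamp-level self-discharge current directly can be done far sooner, which is consistent with the voltage- and current-matching approach described above.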

 

I’ve seen each of these three scenarios play out over my career. Today I'm more convinced than ever that collaboration is the catalyst for all great innovations in electronics manufacturing. What about you? Where do you think manufacturers and measurement teams are having the biggest impact? Comment on this post and share your story. I’m willing to bet you know additional scenarios where manufacturers and their measurement partners are changing the world.

What does the collaborative power of Lennon & McCartney have to do with 5G communications? Perhaps not much—but I believe their days together yielded far more genius-level work than did their respective solo efforts.

 

From my early days at Hewlett-Packard and across five generations of wireless evolution, I have witnessed the highest levels of business success when we collaborated closely with industry leaders early in the development of the next generation of, well, anything.

 

The history of science and engineering is full of famous teams: Louis and Marie Pasteur, Pierre and Marie Curie, the Wright Brothers, and the aforementioned Bill Hewlett and Dave Packard. Even seemingly self-made individuals who changed the course of our collective river—Newton, Leonardo, or Pascal—collaborated with others to exert their own special influence on humanity.

 

Why collaborate? Why put yourself into a situation that forces you to find common ground, adopt another’s pace, and perhaps share the glory? First of all, nobody has all of the answers. Wilbur needed Orville, and Bill’s technical insight needed Dave’s business acumen.

 

More important is the multiplicative benefit of teamwork: call it “synergy” or “gestalt.” With leadership and talent, the whole is always greater than the sum of its parts. In addition, by engaging in early collaborations, you build expertise and insight into a market while creating prototype solutions and thereby earning long-term credibility with top players. 

 

That is what makes a difference in the very complex world of 5G. This new generation will see myriad changes: digital and RF semiconductors; antenna technology; fiber-optic communications; eNB (or gNB?) design; UE/CPE design; networking and applications software; cloud and virtualization; overall system design and interference management; and even a rapidly growing body of work to augment and virtualize our view of reality.

 

Nearly 2,400 years ago, Aristotle wrote a single volume that described humankind’s complete understanding of physics. Today, one person has no chance of even reading all the books on these topics, much less gaining a full understanding of our collective knowledge. Even if we focus only on the applications of physics to mobile communications, the breadth is beyond what even a small group of people can embrace.

 

And so we work together! I have been involved in enough of these interactions to develop this simple recipe for success in technical collaboration: 

  1. Get involved early with the industry leaders.
  2. Ensure both parties have something unique to contribute and something important to gain. This means both parties also take on significant risk.
  3. Live up to your end of the bargain and constantly manage and reconcile expectations—nothing sours a relationship like missed commitments and other bad surprises.
  4. Put the right complementary skillsets on the program, combining brilliantly technical thinkers, rigidly pragmatic project managers, and, of course, out-of-the-box creative types.
  5. Manage a healthy balance of strategic versus commercial intent. Not all collaborations result in a commercial windfall, but good collaborations always provide significant benefit to both sides.

 

At a corporate level, also think about your overall strategy for collaboration. Our approach has been to consider the following: 

  • Stay global. 5G wireless is happening everywhere. There are differences in approach depending on geography, and the leaders may not be in your backyard.
  • Manage a variety of partners. For 5G this means a mix of university and government research; industry and national consortia; and other commercial entities.
  • Embrace a variety of technologies. 5G means everything from millimeter-wave and semiconductors to antennas and massive MIMO all the way to validation of mobile applications. This exposes you to a broader ecosystem and enables a process of developing higher-value solutions.

 

As a hobbyist musician, I always perform better when surrounded by the best. Thus, I ask you to remember the emphasis on market leaders. Whether we work with startups, 70-year-old industry icons, universities, or government researchers, our focus is on those leading the way to 5G communications. So let’s work together. Whether it is John with Paul or Bill with Dave, harmony is always richer than solo.

Keysight recently launched a new video featuring Rosana, a young girl who loves to solve problems and became an engineer. I’ve lost count of how many times I’ve watched it, and how many times watching it pulled at my heartstrings. A very strange response to a spectrum analyzer video. See for yourself.

 

The video’s inherent shout-out to women and underrepresented engineers feels sorely needed at a time when women still make up just 12 percent of engineers working in technology, and the trend has been flat. This is a business issue and a human issue, and it represents a significant economic opportunity. In fact, the recent Decoding Diversity report published by the Dalberg Global Advisory Group noted that “improving ethnic and gender diversity in the U.S. technology workforce represents a massive economic opportunity, one that could create $470-$570Bn in new value for the tech industry, and could add 1.2-1.6% to national GDP.”

 

The video got me reflecting about when I first decided to become an engineer. As a young girl, despite attending a STEM-focused middle and high school, I didn’t even have engineering on my radar. No one in my family had been an engineer; in fact, no one in my family knew what an engineer actually did. Besides, my sights were set on becoming a professional ballet dancer. Fortunately, when I was told at age 16 that I would never be able to dance again due to chronic injuries in my legs, I had a female physics teacher who planted a seed about engineering. My path to healing piqued my curiosity about signaling pathways in the body, and I sought to understand electrical currents. I ended up heading halfway across the country to study electrical engineering at the University of Illinois at Urbana-Champaign. Little did I know how much I would come to enjoy the learning and the challenges posed, and be grateful for the possibilities that awaited me after graduation.

 

Some girls (and boys) know at a young age that they wish to become engineers; the challenge is keeping them interested and confident that they can succeed. Others have not yet even considered the possibility, or have perhaps dismissed the suggestion. Having the positive encouragement and role model of a female physics teacher in high school made all the difference for me, showing me what was possible and giving me the confidence I needed.

 

The importance of role models is something I’ve been contemplating deeply since becoming a mother to two young children, troubled by the dearth of children’s literature with smart, positive female characters to read to them and inspire them. Where are their role models? What am I teaching my son and my daughter about what girls and boys can be, and how they can contribute to society?

 

This subject of role models is getting a lot of attention this year, especially following the box office success of Hidden Figures, based on the book of the same title. The story shares the untold contributions of three brilliant mathematicians who were critical to the NASA Space Race and were also African-American women.  The movie has begun a movement bringing mainstream attention to the dramatic underrepresentation of women and racial/ethnic minorities in STEM fields.

 

Who are the role models you wish to see? Who do you wish you had had as role models?

I welled up with tears when I first viewed the Keysight video. We need more of these stories. Rosana will certainly be a role model for my two children and hopefully for other young girls. It is up to us, women and men in engineering, to help one another and the younger generation explore the possibilities. To be positive role models and present what is possible, encouraging more young girls and boys to consider and pursue engineering, so that they know, regardless of gender, they can become whatever they want.

You are sitting down to your meeting and the precious hours you spent crafting your message to this important audience have resulted in a presentation resting safely on your laptop computer. You are dressed for success and your audience hungers for the rich fruit of your perspective.

 

The meeting starts on time... but then 15 miserable minutes pass while you and three other impromptu IT “experts” struggle to find the right port, adaptor, and monitor setting to connect your laptop to the projector. If you have never experienced this problem, allow me to suggest that you probably do not exist.

 

Verging on revolution

Are we incompetent operators of IT tools? No. We simply lack a single standard, and I have little hope for convergence. On the other hand, in 5G we are on the verge of something completely revolutionary—a single and globally deployed standard for mobile communications.

 

Since the earliest days of radio, smart people have formed standards organizations to ensure that Marconi’s magic could be applied in a manner enabling us to communicate from afar. A quick perusal of the internet will yield fascinating tales surrounding the standardization of Morse code, radio channels, distress channels (and even “SOS” itself), and spectrum management.

 

Many were developed during predecessors of today’s ITU meetings—and those meetings produced documents that read remarkably like those created today. From 2G forward, we had global standards for cellular communications, but we did not have the potential of a single standard until we reached the fourth generation—and that convergence was forced to cower while the WiMAX/LTE duality threatened the peace of the mobile world for a few tense years.

 

Patiently progressing toward 5G communications

Members of the 3GPP have been working in concert now for more than a year (and ITU for longer than that) to define a fifth-generation standard—the most ambitious development in mobile communications since the advent of analog cellular. Gaining alignment around the globe and across segments of our industry requires difficult technical work hashed out in long meetings, frustrating discussions, email rants, and legal battles. All of this is amongst a demographic of mostly engineers and mathematicians, and our little technical club is not known for its smooth social skills.

 

I do not mean to belittle standards work. Those of you who are not associated with such bodies may be surprised at their scope and breadth. 3GPP has three technical specification groups, each responsible for several technical working groups that develop the details of the specifications. This means around 1,500 people in 20 committees meeting up to eight times annually generating massive amounts of documentation distilled from tens of thousands of technical submissions.

 

Allow me to provide some perspective: a colleague who attends 3GPP RAN4 recently sent a copy of “3GPP TR 38.803 v2.0.0.” This is a 200-page, 11 MB feasibility study on the radio frequency and coexistence aspects of the new 5G wireless air interface. This work-in-progress document represents just one part of one part of one part of one part of the gestating standard: during this sub-group’s last meeting no fewer than 37 documents were submitted for consideration for just the topic of radio testability (one of my favorite topics).

 

The recent 3GPP decision to accelerate the standard comes after a yearlong argument. I will discuss the ramifications in a later post, but suffice it to say this was driven by a discussion of the tradeoffs relating to enabling new business models, standards “fragmentation,” and the risk of a standard that falls too far short of the 5G communications vision we see so beautifully portrayed in every company’s 5G presentation.

 

Wrestling with conflicting urges

I continue to be overwhelmed by the technical and commercial demands of creating and deploying these standards. As a consumer, I look forward to the wireless standards being as unwavering as the color of traffic lights and certainly more consistent than interfacing with various display projectors. As a supplier of simulation, design, test, and measurement solutions, I must admit that the past 20 years have created wonderful business opportunities spawned by the fragmentation of standards.

 

It is thus difficult for me to find a neutral space here. But while it will take a few years, I believe it will happen… You tell me—will we truly end up with a single global standard?

I hate trade shows. I love Barcelona. I recently found myself once again in this juxtaposition. And thus, I will expound upon “the state of the industry as seen through the eyes of the MWC attendee” and force you to read yet another review of 2017’s Mobile World Congress (MWC).

 

Scale foretells opportunity

According to Ericsson’s Mobility Report, humankind consumes eight exabytes (8×10^18 bytes!) of mobile data every month—so the massive scale of GSMA’s flagship event should not be a surprise. But even we regular attendees of the show are overwhelmed by the 100,000-plus attendee list and the breadth of business represented by the 2,300 exhibitors. And I do mean “breadth.” Even though the industry manufactures between two and three billion mobile phones every year (nearly 100 every second), the show is hardly dominated by those showing off new mobile devices or infrastructure.

 

According to Chetan Sharma, over-the-top (OTT) revenues surpassed access revenues in 2014. This business opportunity in applications and content was immediately obvious in Barcelona, with a heavy focus on connected cars and gaming. Most of the major automotive companies and their technology supply chains were front and center, showing 5G applications relating to connected vehicles and automated or semi-autonomous driving.

 

New and virtualized architectures impress

Virtualized and flexible mobile networks will enable these new applications and new business models. I saw very impressive demonstrations of network slicing enabled by new software architectures combined with scalable, high-speed, general-purpose processing hardware. From my earliest days in 5G, it was obvious that the biggest part of the 5G revolution would be the new and virtualized network architectures. Even without the hype, this was evident at MWC this year.

 

But there is policy-related tension on this topic. A dramatic departure from the previous US administration’s approach to net neutrality became evident during incoming FCC Chairman Ajit Pai’s comments: “The private sector has spent $1.5 trillion since 1996 to deploy broadband infrastructure…We would not have seen such innovation if, during the 1990s, the government had treated broadband like a railroad or a water utility.”

 

It is not clear how this will become manifest in the US or whether other countries will follow suit; however, the implications are that operators and even their supply chains—those who will make the largest capital investments in 5G technology—could end up on better footing if they can avoid being regulated as utilities.

 

Analytic software and machine learning step forward

This emphasis on 5G “wireless” solutions based in software is growing in two other areas. There were significant demonstrations of analytic software and machine learning to be applied to the mountains of data generated by IoT. The scope of this ecosystem, made clear in the first paragraphs above, indicates ripe opportunity to leverage all that information for everything from business optimization to entirely new business models. Coupled with this was significant attention to security issues relating to protection of privacy, prevention of threats to system performance and availability, and preemption of misuse of wireless systems.

 

The (innovation) game is afoot

In a future blog post I will review William Webb’s rather skeptical book, The 5G Myth: When Vision Decoupled From Reality. At the risk of revealing my position, I will close with this: In a panel discussion involving mobile operators and their network suppliers, the CTO of one large operator was asked which one of the network equipment manufacturers would most benefit from 5G technology. He stated that such a company was not on the stage. Rather, it was a startup somewhere on the exhibit floor—one that was innovating and iterating at a pace he had not seen in prior deployment generations.

 

Virtualized network architectures and OTT business models requiring large technology investments: where will the money come from? You may recall my comments about Pokémon GO. Snap, the maker of an application used primarily by teenagers to share photos, is now public and valued at over $25 billion. Did you see that coming five years ago?

 

The scope of business models enabled by a faster, more flexible network and empowered by very sophisticated devices means opportunity for innovators who can leverage the massive scale we see manifest at the world’s biggest show of its kind.

 

Whether you were in the throng or followed from afar, what stood out to you? Where do you think we will be when MWC 2018 rolls around?

 

The electronics industry is buzzing with the Internet of Things (IoT). One engineer described IoT as a vast conspiracy to put network connections on every object in the world—completely independent of whether it makes sense or not.

 

When we start connecting up all of these “things,” new system requirements creep in that may not be obvious. Your “thing” is no longer happily independent: it is part of a larger system. With connectivity comes new responsibility—in product design and on into the product’s useful life as a network citizen.

 

From a developer’s point of view, many IoT devices are “embedded designs.” This means they use microprocessors and software to implement functionality, often with an eye towards small compute footprint, just-enough processing power, lean software design and demanding cost targets. This is familiar territory for many design engineers, perhaps with the added requirement of providing a robust network connection. This is often going to be a wireless connection, be it Wi-Fi or some other common standard.

 

It’s unlikely that you are designing, defining or controlling that entire system. That raises a few key questions, starting with the issue of interoperability: How do you know that your device is going to be compatible with others on the network? Your device may also present a security risk to the network: How much protection can you build in? The presumably wireless connection is both a source and receiver of RF emissions: How do you make sure it behaves properly in both roles?

 

Building on these topics, here are a few things to consider as you try to avoid the ills of this brave new world of everything connected.

 

Ensuring interoperability

Start by understanding the requirements of the larger system as they apply to your product. What other devices, compute resources, servers, etc., must you interoperate with? Because these systems are inherently complex and multivendor, all IoT devices need to “play well with others,” as described in Systems Computing Challenges in the Internet of Things. Creating a robust test strategy will help ensure that you have this covered. My prescription: Think like a systems engineer.

 

Managing potential security risks

Recent hacks have provided a much-needed wake-up call concerning IoT security. Low-cost devices may not have much data to protect, but they can be a giant security hole that tempts the bad guys with access to the rest of the network. Distributed denial-of-service (DDoS) attacks have been launched using simple devices such as netcams, and even lightbulbs have been hacked. (For some interesting thoughts on this topic, see IoT Problems Are about Psychology, Not Technology.)

 

Manufacturers that leave gaping security holes may face legal consequences: in early January, the Federal Trade Commission filed a complaint against D-Link for inadequate security measures in certain of its wireless routers and internet cameras.

 

You can start by plugging known holes in operating systems and other platforms. And, of course, don’t hardcode default passwords. Think through better ways to test for security problems: no, it isn’t easy—but it’s starting to feel mandatory. My prescription: Think like an IT engineer... or a hacker.
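To make the default-password point concrete, here is a minimal sketch of one common alternative: generating a unique, random credential for each unit at provisioning time instead of baking a shared default into the firmware. The function name, provisioning flow, and parameters here are hypothetical illustrations, not a prescription from any particular product or standard.

```python
# Hypothetical provisioning-time sketch: give every unit its own credential
# instead of shipping a shared, hardcoded default password.
import hashlib
import secrets

def provision_device(serial_number: str) -> dict:
    """Generate a unique random credential for one device at manufacture."""
    password = secrets.token_urlsafe(16)       # ~128 bits of randomness
    salt = secrets.token_bytes(16)
    # Store only a salted hash on the device; the cleartext goes on the
    # unit's label (or into the commissioning app) exactly once.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return {
        "serial": serial_number,
        "password_for_label": password,        # printed on the unit, not in firmware
        "stored_salt": salt.hex(),
        "stored_hash": digest.hex(),
    }

record = provision_device("SN-000123")
```

The design choice this illustrates is simple: even if one device is compromised, its credential is useless against the thousands of siblings on the same network.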

 

Preparing for interference

Your device probably uses one of many wireless connections. First, cover the basics of electromagnetic compatibility (EMC). Be sure the device passes relevant radiated-emission standards so it isn’t spewing RF that interferes with other devices. Also, ensure that your device isn’t susceptible to emissions from other sources.

 

It’s also important to consider the RF environment your device will live in—especially if unlicensed spectrum is being used, as in the case of Wi-Fi. The great thing about the unlicensed airwaves is that they’re free to everyone to use. The really bad thing about unlicensed spectrum is that everyone is using it and it’s a free-for-all. Thus, you’ll likely have to contend with other emitters in the same frequency span. See Chris Kelly’s post Preparing Your IoT Designs for the Interference of Things for some additional thoughts. My prescription: Think like an EMC engineer.
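As one small, hypothetical illustration of contending with other emitters, the sketch below picks the least-congested of the non-overlapping 2.4 GHz Wi-Fi channels (1, 6, 11) from airtime-utilization figures a site survey might report. The utilization numbers and the decision rule are illustrative assumptions, not measured data or a recommended algorithm.

```python
# Hypothetical example: choose the least-busy non-overlapping 2.4 GHz
# Wi-Fi channel from measured airtime utilization (fraction of time busy).
measured_utilization = {1: 0.72, 6: 0.35, 11: 0.58}

def pick_channel(utilization: dict) -> int:
    """Return the non-overlapping channel with the lowest reported utilization."""
    non_overlapping = [1, 6, 11]
    return min(non_overlapping, key=lambda ch: utilization.get(ch, 1.0))

print(pick_channel(measured_utilization))  # -> 6 in this toy survey
```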

 

Wrapping up

My key point: IoT devices may seem familiar and comfortable to engineers who work with embedded systems—but this is a trap. Avoiding this trap starts with a shift in perspective, viewing IoT devices as citizens on a network that has critical requirements for systems, security and RF behavior. Without this shift, we will expose our products—and ultimately our companies—to customer dissatisfaction, customer returns and product liability.

You have been tasked with leading a team to go after this 5G business. Your strategic imperatives include success and leadership in 5G and you are an essential part of making it happen. Your management, your team, your C-suite, your board of directors, and perhaps even your family, are all counting on you to deliver.

 

No pressure. Just do it, right?

 

Before drawing up your battle plans and starting the assault, press pause and ask a few crucial questions. After almost 32 years in high tech, mostly in engineering management, I have found that one’s team often has a clear sense of the success factors, enablers, and roadblocks on the road ahead. Check in and get their input. Even if you are two years down that road, there is still time to make course corrections.

 

Do you have the right background and expertise?

For many, 5G applications imply new technologies. This might mean designing for carrier frequencies and bandwidths that seem like black magic to most of your engineers. Have you considered all of the things they will have to learn or acquire? How many lessons will be learned the hard way—by making mistakes and discovering, too late, the dreadful impact on your schedule or costs? How quickly can your team pivot when it needs to gain new or different expertise?

 

Fifth-generation wireless technology may also require new business models. This could entail a foray into open-source software (or hardware) when your organization has never bothered with this sort of thing in the past. It might point you toward selling services or upgrades rather than making net-30 sales. How might you adjust your design team’s skills to suit this new paradigm?

 

Do you have the right tools?

Some of those new technical areas will require new tools for your team. Is it new hardware? Is it RF chambers of a different size? A new computer system focused on fast, efficient handling of much more data? Could you use EDA tools to reduce hardware turns? Which ones? And, with a nod to the preceding entry, how long will it take for your team to use them efficiently and effectively?

 

Are you closely connected to your key customers?

I am a fan of the Agile Manifesto for software development, especially the notion of putting designers close to their customers and providing rapid and frequent updates while embracing regular and rapid (if not capricious) changes in requirements.

 

In an industry driven by consumer fad, these elements are critical to all of your development activities—software, hardware and business. Without close ties to your most important customers (the ones with the most money, not the loudest voices), you will be too slow to address needs, anticipate changes, and respond.

 

Be sure you are talking to the right people in your key accounts. I once watched a major project fail because the team was getting its guidance from the wrong individuals. After significant investment on both sides, the ones who were really in charge appeared and unceremoniously shut down the entire program.

 

Is your timing consistent with theirs?

The recipe for a fabulously successful project is probably familiar to you: your project is synchronized with your two or three best lead customers and perfectly aligned with their project timing, and market demand is simultaneously growing. This magical combination is, to some extent, the result of luck—but we often make our own luck since the good lady “prefers the prepared mind.”

 

Do you have the support you need from your organization?

This is hardly unique to 5G, but we all need this reminder. I have occasionally been frustrated by subordinate managers who, after producing inadequate results, have said to me, “If only we had had A or B…,” to which my response was, “Why didn’t you ask?” I am embarrassed to admit I have made the same mistake myself.

 

Talk to your team and find out what they think would make the difference. They may be keenly aware of some bit of organizational support that will improve their chances of success; sometimes, it will be easy to get that help. Even if you can’t secure everything they want, the process will likely clear enough roadblocks to speed things along. (Keysight’s Sigi Gross offers a related perspective in point #3 of his 7/13/16 post.)

 

Where to go from here?

No pressure: just ask the right questions and act on the most actionable and effective answers. When it comes to overcoming the remaining challenges, there is no substitute for the innovation of a well-prepared, customer-connected, and motivated team.

 

Any other cautions, questions or stories you would care to share?

Future projections for the Internet of Things (IoT) are staggering: billions of devices around the world, with 500 residing in a typical home. That’s why the wireless designers I know are preparing their devices for the Interference of Things—and they’re doing it early in the development process. This is especially important for IoT widgets intended for mission-critical applications.

 

A recent customer visit drove this point home. They had released a new healthcare-related IoT device that uses Wi-Fi to send information to a centralized database. Even though testing in the lab went well, the device was consistently losing connections due to interference from myriad onsite devices—literally 913 devices on Wi-Fi alone in one hospital. When we met, our customer had an urgent need to resolve this problem and recover from the figurative black eye it had received from hospital administrators.

 

Troubleshooting with 20/20 hindsight

After the fact, avoidable problems in receivers, transmitters and the RF environment may suddenly seem obvious. In a receiver, blocking relates to its ability to tolerate strong nearby signals on adjacent channels that use the same standard or protocol. When an adjacent signal is too strong, the receiver becomes so desensitized that it can’t recognize an on-channel signal.

 

An ill-behaved transmitter has a poor adjacent-channel power ratio (ACPR) when too much of its signal spills into neighboring channels. That’s one reason why we see ACPR limits in virtually every wireless standard. In a noisy wireless environment like a hospital, better ACPR helps improve overall system behavior.
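To make the metric concrete, here is a minimal sketch of the idea behind ACPR: integrate the power spectral density over the main channel and over an adjacent channel, then take the ratio in dB. The band edges, toy spectrum, and function name are illustrative assumptions, not values from any standard or instrument.

```python
# Illustrative ACPR calculation: ratio of adjacent-channel power to
# main-channel power, expressed in dB (more negative means less leakage).
import numpy as np

def acpr_db(freq_hz, psd, main_band, adj_band):
    """Integrate the PSD over each band and return 10*log10(P_adj / P_main)."""
    def band_power(lo, hi):
        mask = (freq_hz >= lo) & (freq_hz <= hi)
        return np.trapz(psd[mask], freq_hz[mask])
    return 10 * np.log10(band_power(*adj_band) / band_power(*main_band))

# Toy spectrum: a carrier near 2.44 GHz plus a flat noise/regrowth floor.
f = np.linspace(2.40e9, 2.48e9, 4001)
psd = np.exp(-((f - 2.44e9) / 8e6) ** 2) + 1e-4   # arbitrary units

print(acpr_db(f, psd, main_band=(2.43e9, 2.45e9), adj_band=(2.45e9, 2.47e9)))
```

The same ratio is what a signal analyzer’s ACPR measurement reports; a transmitter with more spectral regrowth pushes that number toward zero and crowds its neighbors.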

 

In today’s airwaves, one issue keeps cropping up: the overabundance of standards- and non-standards-based devices clogging up the industrial, scientific and medical (ISM) band at 2.4 GHz. Devices that use different standards simply see each other as interference and thus don’t play well together.

 

Using foresight to design for interference

Long before a wireless device enters field trials, developers can explore potential issues using the latest design, simulation and test tools. These tools enable such extensive wringing-out—ranging from obvious flaws to insidious ones—that a manufacturer can send its prototype into the field with much greater confidence that it will perform as intended.

 

Of course, there’s an important reality check: Where does your IoT device reside on the continuum between mission-critical and throwaway? The economics change significantly if your device trends towards either end of that spectrum.

 

The IoT device in my story would have fared much better in the Interference of Things if it had been debugged early on using a more thorough approach to design, simulation, emulation, test and analysis. An integrated IoT solution is a crucial step toward overcoming likely IoT challenges—and setting up your IoT end users for success.

 

What approaches have you used to set yourself up for success in IoT and other wireless products?