
Some businesses view instrument calibration as a best practice; others view it as a compliance issue driven by industry standards, contractual agreements, or certification requirements. In all cases, there’s a decision to be made when it comes to instrument calibration: does it make more sense to do it yourself or to outsource it? I’ve discussed the pros and cons with manufacturing executives around the world, and while there’s a case to be made for both sides, I see six strong arguments for outsourcing equipment calibration services.

 

1. Faster turnaround

While an internal calibration lab can sometimes move faster than an outsource team, I find that outsourcing consistently gets faster results, especially when you do a true apples-to-apples comparison. For example, one of our large aerospace/defense customers had an in-house calibration lab with a two-week service-level agreement (SLA) covering all calibration services for internal customers. The lab appeared to meet that SLA consistently. But a deeper investigation revealed that only the simpler tools, such as torque wrenches, were being calibrated within the two-week window. More sophisticated instruments that measured RF and microwave systems were actually taking four to six weeks for lab calibration. By outsourcing, the customer was able to shorten the SLA to just one week for all instrument calibration services, even for the most complex instruments.

 

2. Lower OPEX

In the example above, when a complex instrument required four to six weeks for calibration, the internal calibration lab was forced to use spares and rentals to meet the two-week window specified in their SLA. Spares and rentals cost them millions of dollars per year. By outsourcing, they not only cut their turnaround time in half, but also reduced their annual operating expenses by about 20 percent by eliminating the cost of rentals, the storage space for spares, and the man-hours required to procure, manage, and maintain the excess inventory. I’ve seen the same scenario play out for other customers as well. In my experience, an OPEX savings of 10 to 25 percent is common when customers shift from in-house to outsourced instrument calibration services and compare all costs, not just specific calibration costs.

 

3. Lower CAPEX

Some companies excel at in-house calibration, but for most, it’s a departure from their core business. Dedicated outsource vendors, on the other hand, provide calibration services as a core competency and make a substantial capital investment to deliver those services. Whereas budget limitations might force an internal calibration lab to keep using older, slower hardware at the end of its lifecycle, a reputable calibration vendor will have the latest hardware and software, plus dedicated teams that know how to use it. This calibration-as-a-service business model removes CAPEX impacts as well as lifecycle concerns for manufacturers, and provides access to state-of-the-art calibration capabilities at all times.

 

4. Future-proofed test

Competitive manufacturers should be free to chase new opportunities without worrying about lab calibration capabilities or having to spend capital to upgrade a calibration lab as the business evolves. For example, many wireless manufacturers are now making the leap from 4G to 5G, which means a lot of next-generation test equipment is being introduced into design and manufacturing environments. Older calibration hardware and software aren’t designed for the higher frequency ranges of 5G, but in an outsourcing model, that’s not the manufacturer’s problem. The calibration vendor is on the hook to stay ahead of 5G test requirements. It’s a CAPEX benefit for manufacturers, since they don’t have to purchase and learn new calibration equipment. More importantly, it’s a competitive advantage, since they can invest their energy and resources in being first to market rather than trying to be calibration experts.

 

5. Improved audit compliance

Since most manufacturers now have global supply chains, it’s common for test equipment to be calibrated in one location but used in another. This sometimes makes it difficult to find records and documentation for ISO audits and other record-keeping functions. A reputable outsource partner will maintain records in a centralized database as part of the SLA between the outsource calibration lab and the customer. The records are available 24/7 regardless of where the instrument calibration services are performed, so compliance can always be proven and audits are less of a headache.
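
To make the record-keeping idea concrete, here is a minimal sketch of what a centralized calibration record and an audit query might look like. The schema, field names, and URL below are all invented for illustration; any real vendor’s database will differ.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical calibration record; real vendor schemas will differ.
@dataclass
class CalibrationRecord:
    instrument_id: str    # asset tag or serial number
    model: str            # instrument model description
    calibrated_on: date   # date the calibration was performed
    due_on: date          # next calibration due date
    lab_location: str     # where the work was done
    certificate_url: str  # link to the signed calibration certificate

def audit_overdue(records, as_of):
    """Return instruments whose calibration lapsed before the audit date."""
    return [r for r in records if r.due_on < as_of]

records = [
    CalibrationRecord("SN-1001", "signal analyzer",
                      date(2017, 3, 1), date(2018, 3, 1),
                      "Penang", "https://example.com/certs/SN-1001.pdf"),
]
print(audit_overdue(records, as_of=date(2017, 9, 1)))  # -> [] (all current)
```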

 

6. Scalable resourcing

With the natural attrition of an aging workforce, it can be hard to keep qualified calibration engineers on staff. And in many companies, engineers who retire are not replaced. Even if a new job requisition is opened, it’s nearly impossible to replace the specialized skills needed in a calibration lab. By comparison, outsource vendors maintain a pool of dedicated resources who are trained on the latest calibration equipment. Manufacturers gain access to a scalable resource pool without having to staff up or down as business conditions change.

 

I’ve had many conversations with calibration lab managers and manufacturing executives over the years, and I know that outsourcing is not for everyone. If your product line is stable and not evolving, if your SLAs are truly being met, or if CAPEX is not an issue, then an internal calibration lab is probably fine. But if emerging technologies or new business opportunities are creating a need for capital investments to upgrade your internal calibration lab, you should do a financial audit to determine the true cost of outsourcing versus keeping it in-house. Make sure you take into account not only hardware, floor space, and SLAs, but also the ongoing expertise required to write automated calibration procedures to get the turnaround you need. If you do decide to outsource, it pays to do your homework on the quality of the calibrations you’ll be receiving. Not all calibration partners are created equal. Choose one that knows its stuff, and good luck.
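
As a rough sketch of the kind of financial audit described above, the comparison can be as simple as summing all in-house cost categories against the vendor contract. Every figure below is a placeholder invented for illustration; substitute your own audited numbers.

```python
# Back-of-the-envelope comparison of in-house vs. outsourced calibration.
# Every number here is a placeholder; plug in your own audited figures.

in_house = {
    "labor": 900_000,               # calibration engineers and technicians
    "spares_and_rentals": 400_000,  # covering instruments stuck in the lab
    "floor_space": 120_000,         # lab footprint, utilities, environment
    "capex_amortized": 250_000,     # calibration standards, spread over 5 years
    "procedure_upkeep": 150_000,    # writing/maintaining automated procedures
}

outsourced = {
    "vendor_contract": 1_300_000,   # per-instrument fees under the SLA
    "logistics": 100_000,           # shipping, tracking, receiving
}

total_in_house = sum(in_house.values())
total_outsourced = sum(outsourced.values())
savings = (total_in_house - total_outsourced) / total_in_house

print(f"In-house:   ${total_in_house:,}")
print(f"Outsourced: ${total_outsourced:,}")
print(f"Savings:    {savings:.0%}")  # ~23% with these placeholder inputs
```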

 

Eric Taylor is Vice President & General Manager of the Managed Services Division – West. Read his bio at https://community.keysight.com/people/et3333.

5G has been picking up speed faster than a downhill racer, and it may seem similarly on the hairy edge of control. From Mobile World Congress (MWC) in Barcelona to the latest 3GPP meetings, the buzz is growing louder around topics ranging from fixed wireless to “what’s our next business model?” With apologies for going dark since late March, the following Top Five Hot Topics update is aimed at keeping you abreast of the latest.

 

1: Fixed-wireless systems are the first step to millimeter-wave 5G

Even with NEMs and device makers eager to be part of the 2018 winter showcase in South Korea, the first highly visible commercialization will be Verizon’s fixed-wireless system. For two years, VZW has been stalwart in its intent to commercialize in 2017 using its “pre-5G” 5G-TF specification. However, this spec is unique and has some fundamental differences from 3GPP’s New Radio (NR), so orphaned technology is a very real risk. On the plus side, VZW is gaining a first-mover advantage by learning how to enable commercial access without introducing the complexity of mobility.

 

In response, AT&T has also stated it will deploy a 5G mmWave fixed service, one that will be based on NR. This is why AT&T pushed very hard for the acceleration of the non-standalone (NSA) version of NR.

 

2: 3GPP’s acceleration of NSA enables NR before R15

“Before R15” means late 2017. With NSA, the 5G radio access network (RAN) will be controlled by the LTE core, so LTE can be used to set up calls, manage initial access, and even provide fallback capability if the 5G RAN link fails.
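
As a loose illustration of that control relationship, consider the toy model below. It is greatly simplified, with invented class and method names; the real NSA signaling procedures are far more involved.

```python
# Greatly simplified sketch of NSA (non-standalone) operation:
# LTE anchors the control plane; NR carries user data when its link
# is healthy, and traffic falls back to LTE when it is not.
# All names here are invented for illustration.

class NsaConnection:
    def __init__(self):
        self.control_plane = "LTE"  # NSA: signaling anchored on the LTE core
        self.user_plane = "LTE"     # until the NR leg is added

    def setup_call(self):
        # Initial access and call setup run over the LTE anchor.
        print(f"Call setup via {self.control_plane} core")

    def update_nr_leg(self, nr_link_ok: bool):
        # The LTE anchor adds (or drops) the 5G RAN leg for user data.
        self.user_plane = "NR" if nr_link_ok else "LTE"

conn = NsaConnection()
conn.setup_call()          # Call setup via LTE core
conn.update_nr_leg(True)   # data now rides the 5G RAN
conn.update_nr_leg(False)  # NR link failed: fall back to LTE
print(conn.user_plane)     # LTE
```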

 

The combined effect of Verizon’s decisions and Korea Telecom’s push to commercialize 5G in 2019 ultimately drove the industry to agree to the acceleration. The resulting commercial play will occur somewhat sooner but, in some ways, will be riskier than originally planned.

 

3: Greenfield spectrum between 2.8 and 5.0 GHz outside the US

5G wireless technology is not just millimeter-wave. Outside the US, there is open spectrum between 2.8 and 5.0 GHz that is getting serious attention from major operators in China, Japan, South Korea, and several European countries.

 

Because this is ideal territory for 5G Massive MIMO, much of this activity will occur in TDD spectrum. It is not clear how, when, or whether operators will reallocate 3G and 4G spectrum to NR, but the territory between 400 MHz and 3 GHz may undergo a change in the mid-to-late 2020s, depending on the relative success of NR in terms of its spectral and energy efficiency.

 

4: Rapid virtualization of networks will drive flexibility

In April 2016, AT&T announced that its evolved packet core (EPC) was all virtual. Demos at MWC suggested that virtualizing the RAN could reach all the way down to the upper MAC layer. This means network equipment becomes computers, wires (or fiber), and remote radio heads, likely with enough on-board ASIC power to manage the lower MAC and the Layer 1 PHY.

 

This can enable software-defined networks (SDN), self-organizing networks (SONs), and, further, network slicing. This last item was the subject of many compelling demos at MWC: it enables operators to create, perhaps in real time, network slices that manifest as software entities. These have different KPIs, depending on what each slice will do for its associated client and business model. For example, a high-reliability slice could take network resources from other slices in order to meet service-level agreement (SLA) criteria. While this will prompt serious looks at new business models, it appears to be a case of technology moving faster than policy, because one could suggest this is a step toward violating net neutrality (in those places where it still exists...).
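
To make the preemption idea concrete, here is a toy allocator in which a high-reliability slice pulls capacity from lower-priority slices until its SLA floor is met. All slice names, capacity units, and thresholds are invented; real slice orchestration is vastly more complex.

```python
# Toy model of SLA-driven slice preemption. All names/numbers invented.

slices = {
    # slice: allocated capacity units, SLA minimum, priority (0 = highest)
    "ultra_reliable":   {"alloc": 20, "sla_min": 40, "priority": 0},
    "mobile_broadband": {"alloc": 50, "sla_min": 10, "priority": 1},
    "massive_iot":      {"alloc": 30, "sla_min": 5,  "priority": 2},
}

def enforce_sla(slices):
    """Shift capacity from low-priority slices to any slice below its SLA."""
    for name, s in sorted(slices.items(), key=lambda kv: kv[1]["priority"]):
        deficit = s["sla_min"] - s["alloc"]
        if deficit <= 0:
            continue
        # Take capacity from lower-priority slices, down to their SLA floors.
        for dname, d in sorted(slices.items(),
                               key=lambda kv: -kv[1]["priority"]):
            if dname == name or deficit <= 0:
                continue
            take = min(d["alloc"] - d["sla_min"], deficit)
            d["alloc"] -= take
            s["alloc"] += take
            deficit -= take

enforce_sla(slices)
print({k: v["alloc"] for k, v in slices.items()})
# {'ultra_reliable': 40, 'mobile_broadband': 50, 'massive_iot': 10}
```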

 

5: Business cases for 5G wireless investment are still under scrutiny

There is still much skepticism about where the money will come from. Debates continue to rage about why any operator would significantly invest simply to develop better capability for a client base with declining average revenue per user (ARPU). Yet, new concepts abound: 

  • Third party pays: This is a big one that could expand mobile advertising (e.g., Google). One can envision combining this with entertainment schemes that could function only on a fifth-generation network (think “PokemonGo! 2020”).
  • Automotive/connected car: There are numerous ideas here, and many are associated with third-party pays, especially those related to infotainment and entertainment. Most of the hype centers on real-time navigation and automated driving; however, I believe this will happen more slowly than ad- or entertainment-related applications due to policy, technology, and sociological issues.
  • Industrial or municipal: IoT can save billions for municipalities and commercial entities that need to collect data remotely in real time. This could enable machine learning to manage even things as mundane as parking and garbage management, or open the door to serious data analytics.

Those are my Top Five Hot Topics. What other trends or developments are you seeing?