
The Internet of Things (IoT) is changing everything, and not just the interaction between humans and machines.

 

As with any new or emerging technology, the changes it creates often ripple through society in multiple ways. It may transform the way businesses operate and even the goods and services they deliver. It may transform existing industries with new use cases and dramatically improve processes. And it will undoubtedly transform workforces and the skillsets required of them. It’s happening today in the IoT.

 

The IoT comprises a vast worldwide ecosystem of devices, wireless communications, networks, and infrastructure. It spans many industries and encompasses many different vertical markets, including automotive, industrial, medical, smart city, smart home, and wearables. Each market has its own requirements and challenges, and so do the IoT applications within each of these markets.

 

Successfully navigating this dynamic ecosystem is no easy task. It requires designers, manufacturers, network operators, and service providers with skills finely tuned for the IoT.

 

They must know what challenges lie ahead, regardless of whether they are designing the next great wearable device or putting a network in place to support the many IoT devices within a hospital environment. They must understand the nuances of the environments in which their devices and networks will operate. And just as critically, they must learn how to use the resources at their disposal to overcome any challenges and meet any requirements to develop IoT devices, wireless communications and networks that can thrive in the real world.

 

With such diversity across the IoT, learning all this critical information can be difficult and, at times, overwhelming. Many designers, manufacturers, network operators, and service providers simply don’t know where to go for the information they need.

 

If you are faced with this problem, attending formal training courses and seminars, or reading up on the latest IoT-related articles, books, and eBooks, is always a good place to start. It’s also smart to reach out to your trusted solutions vendors to get their insight on challenges and requirements in the IoT. In addition, these vendors can help you learn what solutions are at your disposal for addressing those challenges.

 

However, if what you are really after is more specialized information on how to do specific IoT tasks, then the IoT Education Hub may be just what you need. The IoT Education Hub gives you free online access to the latest educational resources on IoT device test, wireless communications test, and network and system test. You'll gain access to valuable “how-to” information, including how to:

  • Measure current drain
  • Deal with the interference between medical IoT devices
  • Deliver consistent measurements for IoT designs
  • Build resilient security into your network
  • Unmask network and data evasions
  • And much more...

 

Check out the IoT Education Hub: Device Test, IoT Education Hub: Wireless Communications, and IoT Education Hub: Network and System today to jumpstart your development efforts. A little knowledge can go a long way in helping you more quickly realize the promise of the IoT.

I recently attended the Brooklyn 5G Summit 2018 with luminaries such as Thomas Marzetta, Arun Ghosh, and Marty Cooper. Over the two-day conference, these innovative minds and their peers impressed with insightful talks about the latest technology that will make 5G successful. I was much amused by the fact that, by the end of the conference, only one brave soul admitted to actually knowing what 5G was… and a significant number of attendees were already heralding 6G!

 

Using 5G to improve human existence

However we choose to define 5G, the conference reminded me that there are many different drivers that play a role in making 5G a reality. Some of these are commercial in nature and others revolve around the satisfaction of solving complex technical problems. Marty Cooper – the inventor of the first portable handset and the first person to make a call on a portable cell phone in April 1973 – shared his vision, which is that we will use the technology to improve human existence and solve problems associated with poverty, healthcare and education. He pointed out that thanks to the mobile industry with all its talented and driven engineers, 1 billion people in Africa have moved out of poverty in the past 20 years.

 

Creative minds love to solve tough challenges

But not everyone draws their energy from knowing that the technology they’re part of developing will help to improve people’s lives. Creative minds love to solve tough challenges, and 5G will definitely serve up some interesting technical challenges around mmWave, Massive MIMO, beamforming, etc.

 

Karolina Eklund with Thomas Marzetta, the originator of Massive MIMO

Thomas Marzetta, Distinguished Industry Professor of Electrical and Computer Engineering at the NYU Tandon School of Engineering, elaborated on this topic: ‘Massive MIMO, in its ultimate embodiment, is most likely to take place in the hugely valuable sub-5 GHz spectrum, driven by the staggering throughput and latency requirements of ubiquitous VR/AR. Although a complete reduction of Massive MIMO to commercial practice has not yet happened, already a fundamental question has emerged: Will future developments in wireless physical layer technology be limited to incremental improvements upon Massive MIMO, or are genuine breakthroughs still possible through some as-yet undiscovered principles of operation?’

 

Several speakers, including Melissa Arnoldi, President for Technology & Operations at AT&T, were excited about offering vertical industries (e.g., autonomous vehicles) and consumers (e.g., AR/VR) use cases only achievable with ultra-low latencies. These will be delivered by deploying ‘secure, flexible networks that operate at the edge’ by virtualizing the network and by deploying cloud technology and mobile edge computing (MEC). “In the short term, MEC is the key technology that will enable low latencies and in many cases also security. This will also bring machine learning based intelligence to wireless networks, which in our opinion will be the next big thing in mobile technologies,” commented Matti Latva-aho, Academy Professor at the University of Oulu.

 

New business opportunities in vertical industries

But you may not be that excited about the prospect of applying your insights, education and grey matter to solve complex technical challenges. Instead, the idea of creating consumer value through new use cases such as VR, AR, 5K video and the commercial possibilities these will bring to your business might get your juices going! Marcus Weldon, CTO & President of Nokia Bell Labs, discussed some of the commercial drivers that will make 5G a success. He claimed that 5G technology will increase productivity, lower costs and create new business opportunities in a wide range of vertical industries. Other speakers chimed in – vertical industries will be the ‘killer app’ that will make 5G successful!

 

Weldon pointed out that we are reaching the limits of TCO due to operational complexity. However, 5G technology will deliver fully automated network slicing across the RAN and core to address a diverse set of requirements, which will lower the cost of delivering high-speed data and in turn open up new applications and drive new use cases – many of which are not yet known. Mikael Hook, Research Area Director for Radio research within Ericsson Research, commented: ‘The first version of 5G is now materializing - a result of quite some years of research, standardization and trials of both Mobile Broadband use cases and proof-points from several vertical segments. Guided by insights from these activities, the first 5G release can support a wide range of industrial use-cases besides MBB. The future-proof design of NR and expected continuous evolution of 5G mean that we will be able to meet also future requirements emerging from use cases no one has thought of yet.’

 

With the launch of the 3GPP 5G NR specifications, we’re on the cusp of unleashing the full potential of 5G and the benefits this technology will deliver in terms of improving productivity, opening up new business models and opportunities, improving human existence, and increasing GDP in the countries that choose to invest in the technology. I left the conference feeling invigorated – whatever is driving 5G development and deployment, it has a true chance to be a force for good, and that is an exciting prospect to be a part of!

Here’s a question for you: How do you define mission-critical in the Internet of Things (IoT)? Some might say that a mission-critical IoT device or application is one with the potential to impact life or death. Power plants, water infrastructure, refineries, or medical devices might fall into this category. If a city’s smart water system shuts down, the health of its citizens is impacted. If a medical monitoring device fails to deliver a critical alert to a healthcare professional, a patient may die. You get the idea.

 

I personally think that the mission-critical IoT is much broader than this definition. In fact, I would argue that many IoT devices and applications once thought of as luxuries have today become “mission critical.” They are an integral part of our everyday life and we depend on them to work right, every time—even if their failure to do so doesn’t necessarily have a dire consequence.  

 

GPS is a prime example. It’s a technology virtually everyone is familiar with these days. But that wasn’t always the case. When GPS first entered the marketplace, it was in the form of large, bulky devices that you could carry with you as you walked, or attach to your car window to guide you to your destination. The devices were expensive and required constant software updates. They were a luxury item.

 

Over time, those devices got smaller and more accurate. More importantly, GPS made its way into two very popular IoT devices: smart phones and smart watches. Today, that combination enables some very mission-critical applications.

When driving, GPS-based mapping applications in smart phones guide us safely to our destinations. If the wireless connectivity in the smart phone cuts out or the application fails to work as expected, accidentally sending you in the wrong direction or into an unsafe area, you could easily find yourself driving into a ditch or, potentially, becoming the victim of a crime.

 

In smart watches, GPS helps to keep children safe via a geofence that establishes a virtual perimeter or barrier around a physical geographical area. When a child wearing a smart watch goes beyond that perimeter, a notification is sent to their parent or guardian. 
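
As a rough sketch of the idea (not any particular vendor's implementation), the geofence check itself can be as simple as comparing the distance between the watch's latest GPS fix and the center of the fence against the fence radius. The coordinates and radius below are made up purely for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def outside_geofence(fix, center, radius_m):
    """True if the reported fix lies beyond the virtual perimeter."""
    return haversine_m(fix[0], fix[1], center[0], center[1]) > radius_m

# Hypothetical example: home at (40.6944, -73.9865) with a 500 m perimeter
if outside_geofence(fix=(40.7003, -73.9920), center=(40.6944, -73.9865), radius_m=500):
    print("Send notification: child has left the geofence")
```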

 

These examples underscore the ongoing transition of more luxury, or consumer-based, IoT devices into the mission-critical arena. It's a trend that is already happening today and will only increase as the IoT proliferates. In the process, it is opening many new opportunities for designers and IoT device manufacturers traditionally developing products for the consumer market, and for network operators and service providers as well. By taking a broader view of the needs of individuals and the society around them, they can begin to identify innovative ways to tweak their products for use in the mission-critical IoT. A prime example might be a wearable device adapted to alert patients and healthcare professionals to health irregularities and to predict potentially significant incidents before they have the chance to occur.

 

For IoT device designers, manufacturers, network operators, and service providers looking to expand into the mission critical IoT, there will undoubtedly be challenges ahead. Requirements will need to be understood and best design practices adopted. The right design, test and monitoring solutions will also need to be selected. 

 

Two other important factors that will need to be taken into consideration are reliability and security. IoT devices that people count on to work right simply can’t fail. That means IoT devices, wireless communications, and networks have to be ultra-reliable. And because those devices capture immense amounts of data, all of which is vulnerable to attack, security is imperative, especially in the medical arena.

 

Security is crucial for IoT medical devices.

 

Security involves not just the IoT endpoint device, but the networks on which the data is transmitted. Any potential vulnerability in the chain could be catastrophic. A hacker gaining access to a smart infusion pump, for example, might change the timing or amount of medication dispensed to a patient, causing a life-threatening emergency. Think it can’t happen? In 2015, the FDA issued an alert, warning of just that possibility with the Symbiq Infusion pump.

 

Appropriate visibility and test solutions designed to validate the security posture of networks can go a long way in ensuring any potential vulnerabilities are identified and dealt with quickly. Likewise, solutions that allow the performance of IoT devices to be evaluated under real-world conditions can be quite valuable for identifying reliability issues before they can result in costly product redesigns or even a recall.

 

While many unknowns lie ahead for designers, manufacturers, network operators, and service providers on the road to the mission-critical IoT, it’s clear that the size of the opportunity in this rapidly evolving space will only continue to grow. For many, that makes it a journey well worth undertaking. If you’re interested in finding out more about the evolving mission-critical IoT and what you can do to leverage its growing opportunities, check out the white paper Key Technologies Needed to Advance Mission-Critical IoT at the Keysight Mission-Critical IoT webpage.


The millimeter-wave (mmwave) frequencies of 30-300 GHz offer incredible opportunities for innovation. As we reach the bandwidth limits of lower frequencies, engineers are looking towards mmwave frequencies to help accommodate the explosive growth of wireless devices. One of the goals of 5G is to decentralize mobile phone transmission from big cellular towers to numerous small hot spots, also known as cells. A single cellular tower can only support so many devices, so increasing the number of cells will relieve mobile traffic as more devices demand bandwidth. Millimeter-wave signals attenuate quickly in the air so multiple cells can use the same frequency without interfering with each other.

 

This is just one example of how the industry is moving towards higher frequencies. Gigabit WiFi devices, automotive radar, and secure military and aerospace radar will all depend on mmwave frequencies. Vector network analyzers typically have maximum frequencies of only 67 GHz, so frequency extenders are required to test at these higher mmwave frequencies.

 

Frequency extender modules allow vector network analyzers to characterize devices at frequencies up into the hundreds of GHz. But there are more considerations than just your instrument’s frequency capability when making mmwave measurements. How do you validate that your measurements are accurate and reliable? You need to minimize uncertainty by:

  • Minimizing cable loss
  • Stabilizing temperature
  • DC biasing

 

Minimizing Cable Loss

Cable loss increases significantly with frequency, as seen in Figure 1. Even a very good cable will lose more than 1.1 dB over 8 cm at 110 GHz and higher, which has a strong impact on measurements. To put that in perspective, a standard 0.5 m cable could lose over 9 dB.

 

 

Figure 1: Cable Loss and Frequency
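
To put those numbers in perspective, here is a quick back-of-the-envelope estimate. It assumes loss scales roughly linearly with length at a fixed frequency and reuses the ~1.1 dB per 8 cm figure quoted above; real assemblies add connector losses on top of this, so a 0.5 m cable can do even worse:

```python
# Rough cable-loss scaling at ~110 GHz, assuming ~1.1 dB per 8 cm
# and linear growth with length (connector losses not included).
loss_db_per_m = 1.1 / 0.08  # ~13.75 dB/m

for length_m in (0.08, 0.25, 0.5, 1.0):
    loss_db = loss_db_per_m * length_m
    power_fraction = 10 ** (-loss_db / 10)
    print(f"{length_m:4.2f} m cable: {loss_db:5.1f} dB loss "
          f"({power_fraction * 100:4.1f}% of the power reaches the far end)")
```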

 

This amount of loss is unacceptable when testing low-power devices. External frequency extenders can be placed right next to the DUT to minimize cable loss at high frequencies. This means that you only need to account for high frequency loss between the extenders and the DUT rather than the entire length between the DUT and the network analyzer.

 

 

Figure 2: Frequency extenders placed right next to the DUT, a wafer

 

Stabilizing Temperature

Frequency extension with poor temperature stability will bring down the performance of your entire system, regardless of how good your network analyzer is. When measuring wafers with thousands of devices, your measurement equipment will heat up. Smaller instruments like frequency extenders are more susceptible to ambient temperature changes than full-sized vector network analyzers. Higher temperatures agitate charge carriers and create thermal noise. We can see temperature’s direct contribution to the noise power in the following formula:

 

 

Noise power = kTB

where k is the Boltzmann constant in J/K, T is temperature in K, and B is the measurement bandwidth in Hz. In addition to the thermal noise, mechanical effects of heat can introduce measurement errors. Metal connectors expand and can shift when heated. This can lead to impedance mismatch and phase errors, especially at high frequencies. At a wavelength of 2.7 mm, it only takes 1.35 mm of movement in the measurement plane for a 180° phase shift.
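
As a quick sanity check on both effects, the short script below evaluates the thermal noise power kTB for a modest warm-up and the phase error caused by a shift of the measurement plane. The 10 Hz bandwidth and the temperature values are illustrative assumptions; the 1.35 mm shift at a 2.7 mm wavelength matches the example above:

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K

def noise_power_dbm(temp_k, bandwidth_hz):
    """Thermal noise power kTB, expressed in dBm."""
    p_watts = K_BOLTZMANN * temp_k * bandwidth_hz
    return 10 * math.log10(p_watts / 1e-3)

def phase_error_deg(shift_mm, wavelength_mm):
    """Phase shift caused by moving the measurement plane."""
    return 360.0 * shift_mm / wavelength_mm

# 10 Hz IF bandwidth: the noise floor rises as the hardware warms up
print(f"{noise_power_dbm(290, 10):.1f} dBm at 290 K")  # about -164.0 dBm
print(f"{noise_power_dbm(310, 10):.1f} dBm at 310 K")  # about -163.7 dBm

# 1.35 mm of connector movement at a 2.7 mm wavelength
print(f"{phase_error_deg(1.35, 2.7):.0f} degrees of phase error")  # 180
```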

 

A measurement setup is only as strong as its weakest link, so frequency extenders ideally have temperature stability on par with high-end VNAs. Rugged, high quality connectors and convection cooling help mitigate thermal errors. Figure 3 shows how a measurement setup featuring frequency extenders with good temperature stability (blue) compares to a setup with more temperature drift (red) after 8 hours of measurement.

 

 

Figure 3: Drift Impact on Calibrated Measurements

 

DC Biasing

Many RF devices are active, meaning they require a DC bias to operate. Some frequency extenders have an internal bias tee which adds a DC signal to the mmwave test signal. The equivalent circuit in Figure 4 shows us how this works. The inductor blocks AC and the capacitor blocks DC so the inputs cannot interfere with each other. The output contains both a DC bias and an RF test signal.

Figure 4: Bias Tee Equivalent Circuit
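
One way to see why the series inductor and capacitor keep the two inputs from interfering is to compare their reactances at DC and at a mmwave frequency. The 100 nH and 100 pF values below are purely illustrative, not taken from any specific bias tee:

```python
import math

def inductor_reactance(freq_hz, l_henry):
    """|X_L| = 2*pi*f*L: near zero at DC, very large at RF."""
    return 2 * math.pi * freq_hz * l_henry

def capacitor_reactance(freq_hz, c_farad):
    """|X_C| = 1/(2*pi*f*C): infinite at DC, tiny at RF."""
    return float("inf") if freq_hz == 0 else 1 / (2 * math.pi * freq_hz * c_farad)

L = 100e-9   # illustrative inductor on the DC arm
C = 100e-12  # illustrative capacitor on the RF arm

for f in (0.0, 50e9):
    print(f"f = {f:.3g} Hz: X_L = {inductor_reactance(f, L):.3g} ohm, "
          f"X_C = {capacitor_reactance(f, C):.3g} ohm")
```

At DC the inductor looks like a short and the capacitor like an open, so the bias flows only toward the DUT; at mmwave frequencies the roles reverse, so the RF test signal never sees the DC supply.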

 

Variations in bias current can lead to variations in measurements. The DC bias is susceptible to small leakage currents and the farther the bias is from the DUT, the more current will leak. Placing the bias tee within the frequency extender brings it as close to the DUT as possible, limiting current leaks to provide a consistent bias.

 

In Figure 5 we see several measurements of drain-source current vs gate-source voltage on a FET. Zooming in on the measurement reveals that there are actually different I-V curves. This is due to ground current leakage varying between measurements. Keeping the bias close to the DUT helps minimize errors like this.

 

 

Figure 5: FET I-V Curves

 

Summary

As we’ve seen, frequency extender modules help your mmwave vector network analysis in several ways.

The small size of the modules allows you to bring the measurement to the device. This reduces cable loss between your DUT and your instrument by minimizing the cable length the mmwave signals need to travel. The modules also minimize temperature drift errors with precision hardware and temperature regulation. Finally, the modules are able to bias active devices with a built-in bias tee.

 

The N5295AX03 frequency extender module uses all of these advantages to make accurate continuous sweeps from 900 Hz to 120 GHz. 

 

Get more mmWave resources!