Ask any R&D engineer these days if they are working on devices with wireless connectivity and they will likely say yes. Ask them again if they consider themselves Internet of Things (IoT) engineers and it’s a safe bet the answer will be no.

It’s easy to understand the disconnect. R&D engineers come out of college and enter the workplace ready to design widgets of all shapes, sizes, and functionality. Until recently, those widgets likely didn’t have to communicate wirelessly with other things, and that was just fine, because for most engineers, IoT wasn’t on the educational menu. In fact, it’s only recently that universities have begun incorporating IoT concepts and design practices into their curricula.

Here’s where things get interesting. Everything is becoming wirelessly connected these days. By 2020, there will be an estimated 50 billion connected devices around the world. That means one thing: it’s no longer a matter of if R&D engineers will have to work on IoT devices, but when.

What happens when these engineers, who have previously developed countless widgets that never had to communicate wirelessly with other things, find themselves designing a widget that has a radio? It needs to send and receive information, and it must work in an environment with lots of other widgets sending and receiving information at the same time, potentially interfering with its communication. What once was a relatively straightforward design project suddenly becomes a complicated mess.

So, while IoT may seem to some engineers like nothing more than an overhyped buzzword, it will soon be impacting every aspect of their design work—if it doesn’t already. And let’s be clear. There is a big difference between modifying a widget to communicate wirelessly and designing an IoT device to succeed in the real world.

Designers working on an IoT device design

Creating an IoT device that stands the test of time and the onslaught from competing products is quite tricky. Making devices “smart” requires advanced technologies, and that introduces new design and test challenges. The device may have to work unattended for long periods of time and in harsh environments, making long battery life and reliability absolutely essential. It may have to work in networks with lots of other devices and sources of interference, necessitating extensive coexistence testing. It must comply with industry standards and regulations. And it must be secure. How the device will be used in the real world also has to be considered, so its design can be properly optimized.
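To make the battery-life point concrete, here is a minimal back-of-the-envelope sketch in Python. The capacity, current, and duty-cycle numbers are purely illustrative placeholders, not measurements from any particular device, and real designs also have to account for effects such as battery self-discharge and temperature.

    # Rough battery-life estimate for a duty-cycled wireless sensor.
    # All numbers below are illustrative assumptions, not measured values.
    BATTERY_MAH = 1000.0          # usable battery capacity, mAh
    SLEEP_UA = 5.0                # sleep-mode current, microamps
    ACTIVE_MA = 20.0              # average current while awake and transmitting, milliamps
    ACTIVE_S_PER_HOUR = 2.0       # seconds of radio activity per hour

    def battery_life_years():
        active_frac = ACTIVE_S_PER_HOUR / 3600.0
        avg_ma = ACTIVE_MA * active_frac + (SLEEP_UA / 1000.0) * (1.0 - active_frac)
        hours = BATTERY_MAH / avg_ma
        return hours / (24 * 365)

    print(f"Estimated battery life: {battery_life_years():.1f} years")

Even this crude model shows why sleep current and radio duty cycle, not just the active transmit current, dominate the lifetime of an unattended device.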

Some of these concerns are commonplace for today’s R&D engineers, but many are not. Succeeding in this environment will require engineers to look beyond their job titles and come face to face with what it really means to design for the IoT.

For some, learning new IoT design skills will be in order. For others, it will simply be a matter of fine-tuning their existing skill set. Either way, make no mistake, designing IoT devices is a difficult task. It would be a serious misstep for any engineer to think otherwise and to assume that, because they don’t consider themselves IoT engineers, they don’t have to deal with IoT issues. Nothing could be further from the truth. Engineers will have to work hard to create designs that succeed in the IoT, and that hard work starts with building a strong IoT skill set supported by the right tools.

For more information on designing for the IoT and understanding the challenges you face, go to www.keysight.com/find/iot and www.keysight.com/find/missioncriticaliot. And to help you down the right path to improving your IoT design skills, check out these free webcasts: Maximizing IoT Device Battery Life, Overcoming IoT Device Wireless Design and Test Challenges, and Analyzing IoT Device Power Integrity.

 A significant milestone in the final sprint toward 5G commercialization

 

Last month the 3GPP (Third Generation Partnership Project) approved the 5G New Radio (NR) Release-15 standalone (SA) specification, paving the way for commercial introductions later this year.

 

Leading telecom operators, internet companies, and chipset, device, and equipment vendors contributed to this historic moment in 5G. Release-15 introduces a new end-to-end network architecture that enables many new high-throughput and low-latency use cases, opening the door to new business models and an era of everything connected.

 

Does this mean 5G specifications are done? Not quite.  3GPP is working on additions to the current release and has already started work on phase II with Release-16 expected in 2019. The specifications add new capabilities and technologies that require leading-edge innovations in new designs. I am personally excited about the technology growth and advancements we will see in the next three to five years as the specifications evolve through Release-16 and beyond.


3GPP Release-15 photo, courtesy of Keysight representative Moray Rumney.

 

The December 2017 announcement enabled operators to deploy 5G in non-standalone (NSA) mode, using the existing LTE evolved packet core (EPC) for the control plane. With the June 2018 Release-15, 5G can now operate in standalone (SA) mode using the completely new 5G RAN (radio access network) and core. This is a significant step forward because 5G can now support the many different use cases envisioned by IMT-2020, including high-throughput mobile broadband and low-latency applications. Enabling technologies such as flexible numerologies, massive MIMO with beam steering, and the use of mmWave spectrum dramatically change the design of devices and network infrastructure.
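As a concrete illustration of what “flexible numerology” means, the short Python sketch below prints how the NR subcarrier spacing and slot duration scale with the numerology index μ (subcarrier spacing of 15 kHz × 2^μ and slot duration of 1 ms / 2^μ). It is a minimal illustration of the scaling rule, not tied to any particular tool.

    # 5G NR numerology: subcarrier spacing scales as 15 kHz * 2**mu, and the slot
    # duration shrinks accordingly, which is one lever behind low-latency use cases.
    for mu in range(5):                       # mu = 0..4 covers 15 kHz through 240 kHz
        scs_khz = 15 * 2 ** mu                # subcarrier spacing
        slot_ms = 1.0 / 2 ** mu               # slot duration (14 OFDM symbols per slot)
        slots_per_subframe = 2 ** mu
        print(f"mu={mu}: SCS={scs_khz:>3} kHz, slot={slot_ms:.4f} ms, "
              f"{slots_per_subframe} slot(s) per 1 ms subframe")

Shorter slots at the higher numerologies are part of how NR trims over-the-air latency relative to LTE’s fixed 15 kHz spacing.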

 

Operators and network equipment manufacturers are already conducting 5G trials and plan initial mmWave fixed wireless introductions in select cities later this year, with mobile smartphone services following in 2019. Keysight has been working closely with these companies. Satish Dhanasekaran, senior vice president of Keysight Technologies and president of the Communications Solutions Group (CSG), shared his perspective on the achievement: “We are excited to enable the industry at a threshold of 5G acceleration and commercialization. The completion of the standalone (SA) 5G new radio (NR) specification marks a distinct milestone and offers a playbook for a connected ecosystem to move forward, in making 5G a reality and unlocking huge potential for society. Keysight is engaged with market leaders, contributing to the 3GPP standardization development, and providing scalable 5G test and measurement solutions from L1 all the way to L7.”

 

What’s next? After a brief self-congratulatory pause, the focus quickly turns to a late drop of Release-15, planned for later in 2018 to fix some known issues. There is also the lengthy “to-do” list for Release-16, which addresses critical challenges including reducing device power consumption, managing network interference, enhancing reliability for IoT use cases, and furthering the integration of licensed, unlicensed, and shared spectrum into 5G deployments. There’s a long road ahead for 5G, and it’s not going to be an easy one. To stay up to date on 5G New Radio, check back on this blog or go to the 5G NR webpage to access white papers, webinars, and other informative content on 5G NR.

 

You can get more information about Keysight Technologies’ 5G solutions here.

Network of lights

With the 5G NR Release-15 specification completed in June 2018, how soon can you expect to see 5G devices operating at mmWave frequencies? The current buzz is: sooner than you expect.

 

At the recent IMS 5G Summit, I learned about some of the timelines. Initial mmWave releases are expected to be point-to-point or point-to-multipoint, and not fully 5G NR compliant. But soon after, in the first half of 2019, operators and equipment makers are planning to introduce 5G devices with mmWave radios in select cities. That poses some significant challenges for designers: producing a mmWave mobile device that maintains the expected quality of service as it moves through the network.

 

mmWave isn’t new for wireless communications, but it is new for cellular communications. 5G NR specifies frequencies up to 52.6 GHz and new operating bands that open up almost 10 GHz of new spectrum:

 

  • Frequency Range 1 (FR1): 400 MHz to 6 GHz, adding 1.5 GHz of new spectrum in the bands 3.3–4.2 GHz, 3.3–3.8 GHz, and 4.4–5 GHz.

 

  • Frequency Range 2 (FR2): 24.25 to 52.6 GHz, adding 8.25 GHz of new spectrum in the bands 26.5–29.5 GHz, 24.25–27.5 GHz, and 37–40 GHz. Initial mmWave targets are 28 GHz and 39 GHz in Japan and the US. (The quick sketch below adds up these band totals.)
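Because some of the listed bands overlap, the new-spectrum totals above come from the union of the band edges rather than a straight sum. A small Python sketch, using the band edges copied from the bullets above, reproduces the 1.5 GHz and 8.25 GHz figures:

    def total_spectrum_ghz(bands):
        """Width of the union of (start, stop) intervals, in GHz."""
        total, current_stop = 0.0, None
        for start, stop in sorted(bands):
            if current_stop is None or start > current_stop:
                total += stop - start            # disjoint band: count its full width
                current_stop = stop
            elif stop > current_stop:
                total += stop - current_stop     # overlapping band: count only the extension
                current_stop = stop
        return total

    fr1_new = [(3.3, 4.2), (3.3, 3.8), (4.4, 5.0)]
    fr2_new = [(26.5, 29.5), (24.25, 27.5), (37.0, 40.0)]
    print(f"FR1 new spectrum: {total_spectrum_ghz(fr1_new):.2f} GHz")   # 1.50 GHz
    print(f"FR2 new spectrum: {total_spectrum_ghz(fr2_new):.2f} GHz")   # 8.25 GHz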

 

mmWave spectrum, with its much greater modulation bandwidth, is essential to meeting the extreme data throughput envisioned for 5G mobile broadband. However, establishing a mmWave communication link and tracking a mmWave device through the mobile network will be a challenge: mmWave signals just don’t behave the same as signals below 6 GHz.
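One simple way to see part of the difference: for fixed-gain antennas, free-space path loss grows with frequency, which is one reason mmWave links lean so heavily on beam steering and antenna array gain. Here is a minimal sketch of the Friis free-space formula, assuming isotropic antennas and an illustrative 100 m distance:

    import math

    def fspl_db(freq_hz, dist_m):
        """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
        c = 3.0e8
        return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

    # Compare a typical sub-6 GHz carrier with the 28 GHz and 39 GHz mmWave targets.
    for f_ghz in (2.6, 28.0, 39.0):
        print(f"{f_ghz:5.1f} GHz: {fspl_db(f_ghz * 1e9, 100.0):6.1f} dB free-space loss at 100 m")

The roughly 20 dB gap between 2.6 GHz and 28 GHz has to be made up with antenna gain, denser sites, or both, and that is before accounting for blockage and atmospheric effects.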

 

5G NR will use technologies like beam steering and new initial access procedures to establish a mmWave communication link, but transmitters and receivers must also be able to produce and demodulate high-quality signals in both the device and the base station. IQ impairments, phase noise, amplitude compression (AM-to-AM distortion), amplitude-to-phase conversion (AM-to-PM distortion), and frequency error can all cause distortion in the modulated signal. Phase noise is one of the most challenging factors in mmWave OFDM systems: too much phase noise causes each subcarrier to interfere with its neighbors, impairing demodulation performance. These issues are even more problematic at mmWave frequencies and the wider bandwidths used there.
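To make the phase-noise point concrete, here is a minimal, self-contained Python/NumPy sketch (illustrative symbol sizes, not a 5G NR waveform) that applies a random-walk phase-noise process to a stream of OFDM symbols and reports how the RMS EVM degrades. No common-phase-error correction is applied, so a real receiver using phase-tracking reference signals would fare somewhat better.

    import numpy as np

    rng = np.random.default_rng(0)
    n_sc, n_sym = 64, 200                          # subcarriers and OFDM symbols (illustrative)
    qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

    def evm_rms_pct(measured, reference):
        """RMS EVM as a percentage of the reference constellation's RMS power."""
        err = np.mean(np.abs(measured - reference) ** 2)
        ref = np.mean(np.abs(reference) ** 2)
        return 100 * np.sqrt(err / ref)

    for step_deg in (0.2, 1.0, 3.0):               # RMS phase step per sample, degrees
        tx = qpsk[rng.integers(0, 4, size=(n_sym, n_sc))]    # frequency-domain QPSK data
        time_domain = np.fft.ifft(tx, axis=1)                # OFDM modulation (no cyclic prefix)
        # Random-walk phase noise: accumulate small random phase steps across each symbol
        phase = np.cumsum(np.deg2rad(step_deg) * rng.standard_normal((n_sym, n_sc)), axis=1)
        rx = np.fft.fft(time_domain * np.exp(1j * phase), axis=1)   # receiver FFT, no CPE correction
        print(f"phase step {step_deg:3.1f} deg RMS -> EVM {evm_rms_pct(rx, tx):5.2f}%")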

 

Evaluating a signal’s modulation properties provides one of the most useful indicators of signal quality. Viewing the IQ constellation helps you identify and troubleshoot distortion errors, and a numeric error vector magnitude (EVM) measurement provides an overall indication of waveform distortion. As modulation density increases, so does the requirement for better EVM. Shown below are the 3GPP TS 38.101-1 EVM requirements for 5G UE (user equipment).

 

 

Modulation scheme for PDSCH    Required EVM
QPSK                           17.5%
16QAM                          12.5%
64QAM                          8%
256QAM                         3.5%
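As a quick illustration of how a numeric EVM checks against these limits, here is a small Python/NumPy sketch. The EVM definition below (RMS error relative to the reference constellation’s RMS power) is a simplified stand-in for the full TS 38.101-1 measurement procedure, and the noisy test signal is synthetic.

    import numpy as np

    # EVM limits from the TS 38.101-1 table above (PDSCH modulation -> required EVM, %)
    EVM_LIMIT_PCT = {"QPSK": 17.5, "16QAM": 12.5, "64QAM": 8.0, "256QAM": 3.5}

    def evm_percent(measured, reference):
        """Simplified RMS EVM as a percentage of the reference RMS power."""
        measured = np.asarray(measured, dtype=complex)
        reference = np.asarray(reference, dtype=complex)
        err_power = np.mean(np.abs(measured - reference) ** 2)
        ref_power = np.mean(np.abs(reference) ** 2)
        return 100.0 * np.sqrt(err_power / ref_power)

    # Toy check: ideal QPSK symbols plus a little additive noise
    rng = np.random.default_rng(1)
    ref = (np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2))[rng.integers(0, 4, 1000)]
    meas = ref + 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
    evm = evm_percent(meas, ref)
    print(f"EVM = {evm:.2f}% -> {'pass' if evm <= EVM_LIMIT_PCT['QPSK'] else 'fail'} against the QPSK limit")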

 

 

Measurements of the overall spectrum are also used to validate the signal’s RF performance.  

 

Test solutions don’t simply migrate from sub-6 GHz. The test equipment needs to operate at the higher mmWave frequencies and wider modulation bandwidths, and it must have better specifications than the device under test (DUT). When designing test solutions, you now need to pay even more attention to issues like adapters and cables, switching, over-the-air test, and system-level calibration. The measurement system needs to perform better than the DUT’s design goals, and a proper system-level calibration helps eliminate uncertainties due to test fixtures, which is especially valuable for very wide bandwidth signals.

 

To find out more about overcoming the challenges of mmWave device design and test, check out the white paper series at www.keysight.com/find/5GNR, which looks at many of the challenges you can expect with 5G NR, including the new flexible numerology, mmWave design considerations, MIMO and beamforming, and over-the-air testing.