
Enhancing the Senses of Autonomous Vehicles

Blog Post created by JungikSuh on Jun 8, 2017

In my previous post, I shared the exciting developments around autonomous driving and its potential to save lives by removing the human error that is blamed for more than 90% of fatal driving accidents.

 

Now let’s switch gears to a more detailed discussion of the technologies that enable autonomous driving.

 

In this post, we will focus on sensing technologies – among the most important enablers of autonomous driving. We will look at three major sensing technologies: radar (Radio Detection and Ranging), LiDAR (Light Detection and Ranging), and camera. This is an exciting area, with developers working hard to fine-tune the sensitivity and responsiveness of all sorts of automotive sensors.

 

Radar is not a brand-new technology: it was already widely used in the aerospace and defense industries before and during World War II, and the first automotive radar research projects date back to the 1970s.

 

However, the application of radar in the automotive consumer market is now growing fast. It started with high-end passenger cars, but these days more and more trucks and entry-level passenger cars carry automotive radar sensors, mostly for better safety as well as added convenience (e.g., stop-and-go assistance in traffic jams).

 

Automotive radar detects distance (range) and motion, including velocity and angle. Radar has several benefits: it works in almost every weather and lighting condition, its reflected radio waves can detect obstacles hidden behind other obstacles, and its signal-processing requirements are relatively modest. On the other hand, there are limitations. Radar cannot interpret what an obstacle is (e.g., a human, a dog, another car, a cardboard box, or a heavy refrigerator dropped on the highway); it simply detects the presence of an object without telling you what it is. This limited characterization of detected obstacles is a key reason why radar alone may not provide enough information for a fully autonomous driving system.
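To make those range and velocity measurements concrete, here is a minimal back-of-envelope sketch in Python. It assumes a generic FMCW (frequency-modulated continuous-wave) radar – the architecture most automotive radars use – with purely illustrative chirp and carrier parameters rather than figures from any specific sensor.

```python
# Illustrative FMCW radar arithmetic with hypothetical parameters.
# Range comes from the beat frequency of the linear chirp;
# radial velocity comes from the Doppler shift of the carrier.

C = 3.0e8  # speed of light, m/s

def range_from_beat(beat_hz, chirp_bw_hz, chirp_time_s):
    """Target range for a linear FMCW chirp: R = c * f_beat * T_chirp / (2 * B)."""
    return C * beat_hz * chirp_time_s / (2.0 * chirp_bw_hz)

def velocity_from_doppler(doppler_hz, carrier_hz):
    """Radial velocity from the Doppler shift: v = f_d * c / (2 * f_carrier)."""
    return doppler_hz * C / (2.0 * carrier_hz)

# Hypothetical numbers for illustration only.
r = range_from_beat(beat_hz=20e6, chirp_bw_hz=4e9, chirp_time_s=40e-6)
v = velocity_from_doppler(doppler_hz=5.1e3, carrier_hz=77e9)
print(f"range    ~ {r:.1f} m")    # ~30 m
print(f"velocity ~ {v:.1f} m/s")  # ~10 m/s (about 36 km/h)
```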

 

LiDAR provides much more intelligent 3D mapping using laser light. It scans 360 degrees around the autonomous car with a range of more than 100 meters. Some LiDAR systems provide as many as 64 channels and over a million scan points per second, giving the car enough information about its surroundings to decide how to react to the environment.

 

The downside of LiDAR-based sensing is that the sensors are still very expensive, although some leading LiDAR companies are announcing more economical versions. LiDAR also generates huge amounts of data, which requires tremendous signal-processing power and data-management subsystems.
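A rough, hypothetical estimate shows how quickly that data adds up. The figures below are illustrative only (real sensors and point formats vary widely), but they give a feel for the raw point-cloud throughput a 64-channel-class LiDAR can produce.

```python
# Rough, hypothetical estimate of a spinning LiDAR's raw data rate.
points_per_second = 1.3e6  # on the order of a million-plus returns per second
bytes_per_point = 16       # e.g. x, y, z coordinates plus intensity and timestamp

raw_bytes_per_second = points_per_second * bytes_per_point
print(f"~{raw_bytes_per_second / 1e6:.0f} MB/s of raw point-cloud data")
print(f"~{raw_bytes_per_second * 3600 / 1e9:.0f} GB per hour of driving")
```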

 

Camera technology is aimed at recognition and image classification. Compared to radar and LiDAR, it is the cheaper sensing technology, although the signal processing is still not cheap. Because cameras provide vision-based image data, with some signal processing they can read traffic signs such as speed limits and school-zone notices. The limitation of camera-based sensing is that cameras are affected by weather and other environmental conditions. For example, in a widely reported tragic accident, a vehicle operating in camera-based semi-autonomous driving mode failed to recognize a white truck crossing the road because of reflections from the truck.

 

 

Because each sensing technology has pros and cons, the industry currently cannot depend on a single one for autonomous driving. Most leading players use all three, or at least two, of the technologies discussed here to make sure their autonomous driving systems get enough data from all around the vehicle.

 

Radar, however, plays a larger role in Advanced Driver Assistance Systems (ADAS), which already offer a tangible way to save lives by using technology to mitigate human driver errors, and which serve as a good bridge until fully autonomous cars reach real life. ADAS applications include radar-based emergency braking, forward collision warning, blind-spot detection, rear collision warning, adaptive cruise control, and many more features that enable safer driving.

 

Currently, four key areas of radar technology development are driving numerous automotive applications:

  1. Higher frequencies, including 24, 77 and 79 GHz
  2. Wider bandwidths of 1, 2 and 4 GHz
  3. Accurate power control, which helps ensure the sensors transmit and receive radar signals with minimal interference from other vehicles
  4. In-vehicle Ethernet, which enables fast, accurate, and reliable in-vehicle communication of the large amounts of data captured by the radar sensor

 

The trend in automotive radar is toward higher frequencies and wider bandwidths. For example, more and more short-range radars will use 79 GHz instead of 24 GHz, because the wider bandwidth available there gives better resolution, allowing objects to be detected and differentiated more clearly. Operating at 79 GHz also guarantees wider bandwidth with fewer spectrum-occupancy and regulatory issues, as agencies worldwide continue to harmonize frequency allocations for vehicular radar in the 77 GHz to 81 GHz range.

 

As in the example from my previous post, illustrated below, the better-resolution signal from 4 GHz of bandwidth (on the right) clearly differentiates two obstacles, while the 1 GHz signal on the left does not. It is critical for the radar to detect both obstacles to avoid a risky situation. In addition, higher frequencies make smaller, lighter sensors possible, which helps car designers achieve more compact designs and better gas mileage.
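For readers who want the numbers behind that picture, the classic range-resolution relation ΔR = c / (2B) explains why the wider bandwidth separates the two obstacles. The short sketch below is illustrative only:

```python
# Radar range resolution as a function of bandwidth: dR = c / (2 * B).
# Two obstacles closer together than dR merge into a single detection.

C = 3.0e8  # speed of light, m/s

for bandwidth_hz in (1e9, 2e9, 4e9):
    resolution_m = C / (2.0 * bandwidth_hz)
    print(f"{bandwidth_hz / 1e9:.0f} GHz bandwidth -> ~{resolution_m * 100:.1f} cm resolution")

# 1 GHz -> ~15.0 cm, 2 GHz -> ~7.5 cm, 4 GHz -> ~3.8 cm: the 4 GHz signal can still
# separate two closely spaced objects that the 1 GHz signal blurs into one return.
```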

 

Engineers working on autonomous driving and sensing technologies are making our roads much safer and saving lives. They are real superheroes working behind the scenes, with the potential to save the more than 90% of the 1.2 million people killed in car accidents every year whose deaths are blamed on human error. These engineers save more people than Superman, Batman, or Wonder Woman, and their ‘weapons of choice’ are accurate test and measurement solutions – critical tools to make sure their life-saving projects work perfectly.

 


 

Follow the Automotive and Energy Solutions blog today and gain insight from our solution experts as they share their experiences, opinions, and measurement tips on the Connected Car, automotive radar, electric vehicles, and more.

 

About Keysight’s Automotive & Energy Solutions

 

The world is seeing a rapid convergence of automotive and energy technologies for safer, more energy-efficient, and more convenient driving. Engineers like you are the drivers, designing the latest automotive Ethernet, radar, 802.11p, and 4G/5G applications in the Connected Car, or pressing forward in the quest for greater energy efficiency by tapping solar power, conversion, and storage for vehicle electrification, including the Electric Vehicle (EV) and Hybrid EV.

 

Keysight is committed to helping bring your vehicle electrification innovations to market faster with design and test solutions across this energy-efficiency ecosystem. These range from powerful solar array simulator solutions that maximize your PV efficiency, to EV test solutions for vehicle electrification, to time- and space-saving Li-Ion cell and battery performance test solutions, as well as power circuit simulator tools that ensure the power devices behind all these innovations work seamlessly.

 

Stay tuned for the latest technology thought leadership in the automotive and energy efficiency space by following our blog.
