
Programming for Advanced Driver-Assistance Systems — Nightmare or Challenge?

Blog post by HweeYng | Jul 27, 2018

When you work with some very dedicated electronic design and test engineers, lunchtime conversations inevitably serve up different techie menus. Topics range from debugging the latest script to, on a less stressful day, regaling colleagues with gorgeous video footage shot not with the latest drone, but from the wings of a wind-powered kite.

Recently, the conversation took on a somber tone as news of a spate of fatal road accidents in Singapore dominated the local headlines. In three separate accidents that happened days apart, five lives were lost, and human judgment error was called into question. In one case, a taxi made a right turn at a major junction and an oncoming car slammed into its right side. The impact killed the taxi's back-seat passenger.

 

Manoeuvring through a major junction at peak hour -- would ADAS-equipped cars have made a difference? (Image source: The Straits Times)

The question was, would an advanced driver-assistance system (ADAS) have been able to alert both drivers in such a complex scenario with multiple elements, and proactively trigger preventive protocols to avert the accident?

The tragic accidents in Singapore happened just weeks after an Uber self-driving car drove straight into a pedestrian walking her bicycle across the road in Tempe, Arizona, killing her. The incident raised concerns about the readiness of self-driving vehicles, and a host of questions about guidelines for testing these vehicles in real-world traffic.

“Autonomous vehicles make decisions based on what their sensors detect,” Bart Selman, a computer science professor at Cornell University, was quoted by Scientific American as saying. “But if its sensors don’t detect anything, the vehicle won’t react.”

So, was it a sensor fault that led to the error or did the software fail to interpret the sensor input correctly?

“You need to distinguish the difference between people, cars, bushes, paper bags and anything else that could be out in the road environment,” Matthew Johnson-Roberson, an engineering professor at the University of Michigan, told Bloomberg. 

Multifaceted considerations

While companies may build the most sensitive radar sensors, the crux still lies in how powerful and comprehensive the software behind those devices is.

Would the onus be on the design and verification engineer to take on the mission-critical back-end task of simulating a plethora of scenarios to push ADAS to its limits? As my colleague remarked, “the weighty responsibility of the mission-critical engineer has extended from the specialized fields of aerospace and defense to the autonomous vehicle industry”.

These responsibilities would include thorough mapping of how the hardware, software, and vehicle interact and respond in an environment governed by a multitude of programming rules (a toy sketch of one such rule follows the list):

  • From cruise control to driver intervention, and eventually, fully autonomous driving.
  • Vehicular responses in “right of way” situations at intersections, which differ from country to country.
  • The decision tree for driving past double-parked vehicles, with potentially dangerous blind spots.
  • Should the system respond with strict adherence to traffic laws as its parameters, knowing drivers often break the rules?
  • How would the car respond to the proverbial, highly subjective parameter of human judgment error?
  • Nature’s random incidents: a flock of pigeons may take off upon detecting an approaching car, all except one bird. How would the sensor relay this dynamic information and trigger split-second responses involving the powertrain, steering, and braking systems?
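
To make the flavour of such rules concrete, here is a minimal, hypothetical sketch of one branch of that decision tree: yielding on an unprotected right turn across oncoming traffic, as in the Singapore junction accident above. The `Track` type, its field names, and the 2-second time gap are illustrative assumptions for this post, not any vendor's API:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    PROCEED = auto()
    YIELD = auto()
    BRAKE = auto()

@dataclass
class Track:
    """A detected object reported by a hypothetical sensor-fusion layer."""
    kind: str                 # "car", "pedestrian", "animal", "unknown"
    range_m: float            # distance to the object in metres
    closing_speed_mps: float  # positive when the object is approaching

def decide(tracks: list[Track], time_gap_s: float = 2.0) -> Action:
    """Toy right-of-way rule for an unprotected right turn (drive-on-left,
    as in Singapore): yield to any oncoming track whose time-to-arrival
    falls below the configured time gap; brake hard for unclassified
    objects that are almost upon the junction."""
    for t in tracks:
        if t.closing_speed_mps <= 0:
            continue  # moving away; not a conflict
        tta = t.range_m / t.closing_speed_mps  # naive time-to-arrival
        if t.kind == "unknown" and tta < 1.0:
            return Action.BRAKE
        if tta < time_gap_s:
            return Action.YIELD
    return Action.PROCEED

# An oncoming car 30 m away closing at 20 m/s is 1.5 s away -> YIELD
print(decide([Track("car", 30.0, 20.0)]))
```

A production ADAS would, of course, fuse radar, camera, and map data and weigh far more factors than a single time-to-arrival threshold; the point is that every bullet above ultimately becomes logic of this kind that someone must write and verify.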

Automotive radar sensors play a vital role in ADAS to avert road accidents.

Detecting pedestrians in real-life urbanscapes using automotive radar is challenging: the signal-to-clutter ratio (SCR) is low because of returns from surrounding cars, road asphalt, and buildings. Nevertheless, various ADAS technologies are deployed in today’s autonomous vehicles to help them “see” the road and assist drivers in averting accidents.
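
As a back-of-the-envelope illustration of why this matters, the SCR is simply a power ratio on a decibel scale. The numbers below are invented for illustration, but they show how a weak pedestrian echo can sit well below the urban clutter floor:

```python
import numpy as np

def scr_db(signal_power: float, clutter_power: float) -> float:
    """Signal-to-clutter ratio in dB: 10 * log10(P_signal / P_clutter)."""
    return 10.0 * np.log10(signal_power / clutter_power)

# A pedestrian return is weak (radar cross-section on the order of 1 m^2)
# against strong urban clutter from vehicles and building facades, so the
# SCR can easily go negative, i.e. the target sits below the clutter floor.
print(f"{scr_db(signal_power=1e-9, clutter_power=1e-8):.1f} dB")  # -10.0 dB
```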

While older automotive radar technology probably could not tell a deer from a large carton that fell off a truck, modern automotive radar with micro-Doppler processing refines object detection. It can also detect pedestrians from a moving vehicle: when pedestrians walk or run, they naturally move their arms, elbows, hands, knees, toes, and other body parts, which generate micro-Doppler shifts distinct from that of the torso (see Figure 1).

Figure 1. It is critical to understand that different body parts have different relative movements, shown as multiple Doppler shifts over time.
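
To see where a plot like Figure 1 comes from, here is a minimal simulation sketch (not Keysight's SystemVue library): a 77 GHz return modelled as a constant torso Doppler plus one sinusoidally swinging limb. All parameter values are illustrative assumptions:

```python
import numpy as np
from scipy.signal import stft

# Toy micro-Doppler model of a walking pedestrian seen by a 77 GHz radar.
# The torso contributes a near-constant Doppler shift; a swinging limb adds
# a sinusoidally varying Doppler component on top of it.
fs = 4000.0                     # slow-time sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)   # 2 s observation window
c, fc = 3e8, 77e9               # speed of light, carrier frequency

v_torso = 1.4                   # walking speed (m/s)
v_limb = 0.8                    # peak limb speed relative to torso (m/s)
f_gait = 1.8                    # gait cycle rate (Hz)

f_torso = 2 * v_torso * fc / c  # torso Doppler, roughly 719 Hz
f_limb = 2 * v_limb * fc / c    # peak micro-Doppler deviation from the limb
phase = (2 * np.pi * f_torso * t
         + (f_limb / f_gait) * np.sin(2 * np.pi * f_gait * t))
x = np.exp(1j * phase)          # complex baseband return (torso + one limb)

# The spectrogram shows a strong line near f_torso with a sinusoidal
# micro-Doppler ripple around it, the signature a classifier can use
# to tag the return as "pedestrian" rather than "carton".
f, tt, Z = stft(x, fs=fs, nperseg=256, return_onesided=False)
print(f"Torso Doppler ~ {f_torso:.0f} Hz, limb deviation +/- {f_limb:.0f} Hz")
```

A real pedestrian contributes many such limb components at once, which is exactly why the Doppler-time plot in Figure 1 shows multiple interleaved traces rather than a single line.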

 

In the past, some car companies used infrared technology to detect living beings, from man to moose. However, that technology only works when the car’s night vision is activated. As early as 2017, Volvo rolled out large-animal detection using a combination of radar and camera to avoid collisions with large animals. The U.S. Department of Transportation estimates that between 1 and 2 million cars collide with large animals each year, causing roughly 26,000 injuries to drivers, about 200 of them fatal.

Automotive radar can detect animals before the driver notices them on the road.

Nightmare or Challenge?

Whether the next program or simulation test is to save a man or a moose, automotive engineers know they have joined the league of mission-critical engineers, ensuring that each of their programs is zero-error, whether under a multitude of simulations on the bench or when the next autonomous vehicle rolls out onto the road.

“It can be quite a nightmare, if moving into Level 4 or Level 5 of autonomous driving, thinking a program I was involved in had followed a decision tree to avert hitting a young lady, and somehow in the swerve, killed my own mother,” remarked a co-worker.

A heavy note indeed on which to end our lunchtime musings, but one worthy of consideration not just at the next software simulation meeting; even leaders at the World Economic Forum have put it on their table. But that, perhaps, is fodder for another lunchtime discourse.

 

You may be interested in: Automotive Manufacturer Develops Mission-Critical Radar Sensors

Keysight’s W1908EP SystemVue Automotive Radar Library

World Economic Forum: Why we have the ethics of self-driving cars all wrong

 

About Keysight Automotive & Energy Solutions (AES)

 

The automotive and energy industries are synergistically paving the way for a future built on digital transformation and the electrification of everything. Engineers like you are pioneering this electrical revolution through the development of smarter, safer, and more efficient technologies, whether driving the latest advancements in communications for the Connected Car, creating new power electronics designs to facilitate renewable energy integration and vehicle electrification, or pressing forth in the quest to economize next-generation battery energy storage systems (BESS).

 

Keysight AES is committed to addressing the biggest design and test challenges faced by engineers in the automotive and energy industries. With fully integrated solutions combining leading-edge hardware and ultra-sophisticated software, Keysight AES removes technological barriers and streamlines innovation, helping to bring your breakthroughs to market faster, cheaper, and easier. These solutions range from powerful solar array simulator platforms for rapid optimization of modern photovoltaic (PV) system designs, to highly efficient regenerative hybrid-electric/electric vehicle (HEV/EV) test systems for putting onboard power converters through their paces, to battery performance characterization and high-volume Li-Ion cell production solutions that deliver unprecedented savings of time and space, not forgetting advanced power circuit simulator tools that ensure seamless integration of new, state-of-the-art wide-bandgap (WBG) devices.

 

Follow the Keysight AES blog to stay tuned in to the latest insights from thought leaders in design and test for automotive and energy applications.
