Car OEMs and Tier One companies are learning to use sensor technologies for active safety systems that act on behalf of and in unison with the driver. What's the role of radar?
Of the many dramatic advances taking place in the automotive industry, the most far-reaching is the development of sensor technologies that allow cars to “see.”
Right now, the technology is transitioning from primary use in warning systems, which alert drivers to potential hazards, to active safety systems that act on behalf of and in unison with the driver. Ultimately, it will provide the sensory input required for intelligent, autonomous systems. In this article, we’ll look at the role of radar as a primary component in the suite of sensor technologies that are changing how we drive.
In the current transition, automotive OEMs and suppliers are learning to balance several available technologies to create a sensory cocoon that meets near-, mid-, and long-range presence-detection requirements.
The suite of available sensors and general applications (Figure 1) includes ultrasound, lidar, camera, and radar. Each sensor technology has a complementary role in the overall system, from close-in vision and curb/object warning to broader location awareness. Radar's role is critical for detecting objects at short, mid, and long ranges, differentiating between multiple objects, and establishing the relative vectors of those objects under all weather conditions.
Radar (light green) is one of a suite of technologies driving the evolution of automotive sensing from warning to active safety systems. (Source: Infineon)
Over a two-decade development period that began in the 1970s, radar devices with operating frequencies from 16 GHz to 94 GHz were prototyped and tested for blind-spot detection, side detection/lane change assist, and adaptive cruise control (ACC).
The first road-ready systems for autos were deployed in the late 1990s for adaptive cruise control in the luxury segment. Continued miniaturization and cost reduction have brought the technology to a sharp upward inflection point (the "hockey stick" curve) of adoption as a mass-market technology for automotive applications. Infineon's contributions to this accelerated adoption include the first commercial-scale production of automotive-qualified 77-GHz SiGe ICs in 2008 and production of radar ICs in AEC-Q100-qualified eWLB packages beginning in 2012.
This advancement brought standard SMT assembly techniques to mm-wave electronics. Today, Infineon is a leading manufacturer of radar ICs in the two frequency bands, 24–26 GHz and 77–81 GHz, which fulfill the range and resolution requirements for the radar applications that comprise an auto’s “safety cocoon” (Figure 1).
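The link between band choice and resolution follows from a standard FMCW relationship: range resolution is set by the swept bandwidth, ΔR = c/(2B). A minimal sketch of that textbook formula (the bandwidth values below are illustrative assumptions, not Infineon device specifications):

```python
# FMCW range resolution is set by swept bandwidth: dR = c / (2 * B).
C = 3.0e8  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest range separation at which two targets can be distinguished."""
    return C / (2.0 * bandwidth_hz)

# Illustrative chirp bandwidths (assumptions, not device specifications):
print(range_resolution_m(200e6))  # ~0.75 m for a narrowband 24-GHz chirp
print(range_resolution_m(4e9))    # ~0.0375 m for a wideband 77-81-GHz chirp
```

The wider sweeps available around 77–81 GHz are what make centimeter-scale range resolution practical.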
The long-familiar picture of radar using a modulated pulse and capturing its echo with rotating antennas was subsumed years ago by electronics-enabled modulation and beamforming techniques. While the operating principle is consistent, today's Frequency Modulated Continuous Wave (FMCW) radar modules offer the small footprint, light weight, and low power consumption (typically in the low single-digit watts) needed for use in both combustion and electric vehicles (Photos 1–2 below).
Photos 1–2: A 77-GHz radar module for recognizing objects at up to 250 meters includes three ICs (transmitter, receiver, and power amplifier) in eWLB packaging (top view) and a companion digital interface chip on a board smaller than a postcard. (Image: Autoliv)
A precisely defined transmitted radio wave is reflected by one or more targets, and the reflection is detected by an array of receiving antennas. Frequency modulation determines distance with high accuracy, while the continuous wave allows velocity measurements based on the Doppler principle.
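In textbook form, the beat frequency between the transmitted and received chirps encodes range, and the Doppler shift encodes radial velocity. A hedged sketch of the two standard relationships (the chirp parameters in the comments are illustrative assumptions):

```python
C = 3.0e8  # speed of light, m/s

def target_range_m(f_beat_hz: float, chirp_slope_hz_per_s: float) -> float:
    # The beat frequency is proportional to round-trip delay: f_b = S * (2R / c),
    # where S is the chirp slope (swept bandwidth / chirp duration).
    return C * f_beat_hz / (2.0 * chirp_slope_hz_per_s)

def radial_velocity_mps(f_doppler_hz: float, carrier_hz: float = 77e9) -> float:
    # Doppler shift for a closing target: f_d = 2 * v / wavelength.
    wavelength_m = C / carrier_hz
    return f_doppler_hz * wavelength_m / 2.0
```

For example, a 4-GHz sweep over 40 µs gives a slope of 1e14 Hz/s, so a target at 100 m produces a beat frequency of roughly 67 MHz, comfortably within the reach of the on-module ADC and signal processor.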
Angular resolution, historically a significant obstacle in the evolution of automotive radar, has been improved by broadening the transmit and receive antenna arrays, essentially allowing the radar to "see a more detailed world." The concurrent increase in signal-processing load is addressed by more powerful microprocessors within the sensor module. Infineon AURIX family processors, for example, combine homogeneous lockstep cores with a dedicated radar signal-processing accelerator and large on-chip memory for radar image storage. Together with a fitting power-supply IC, this is the solution Infineon is developing: a chipset that allows state-of-the-art radar to be implemented with three core electronic components.
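Why a broader array helps can be seen from the diffraction-limit rule of thumb θ ≈ λ/D: angular resolution improves in direct proportion to the effective aperture D. A small illustrative calculation (the aperture sizes are assumptions for the sake of the example):

```python
import math

C = 3.0e8  # speed of light, m/s

def angular_resolution_deg(carrier_hz: float, aperture_m: float) -> float:
    # Rule-of-thumb beamwidth limit: theta ~ wavelength / aperture (radians).
    wavelength_m = C / carrier_hz
    return math.degrees(wavelength_m / aperture_m)

# Doubling the effective aperture (e.g., with a larger or virtual array) halves theta:
print(angular_resolution_deg(77e9, 0.04))  # ~5.6 degrees with a 4-cm aperture
print(angular_resolution_deg(77e9, 0.08))  # ~2.8 degrees with an 8-cm aperture
```

This is also one reason the 77–81-GHz band is attractive: the shorter wavelength yields finer angular resolution from the same physical aperture.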
In short, chip-level improvements in performance, efficiency, integration, and overall cost have made it feasible for OEMs and Tier One suppliers to install multiple radar modules and antenna arrays, creating a virtually 360-degree radar field of view and addressing the challenges of angular resolution and differentiation of multiple objects.
Importantly, radar is a robust and unobtrusive sensing technology for automotive applications. Micro- and millimeter waves are not blocked by moisture, particulates, or ambient light and even propagate through insulating materials. Thus, they are well suited for "always-available" safety applications and can be mounted invisibly within the car body. The potential for blockage by ice or water is readily addressed through physical countermeasures.
Additionally, radar performs better in poor weather than sensor technologies such as lidar and cameras. As more radar systems are deployed, the industry is developing mitigation strategies to manage mutual interference. Advances in chip performance that allow for greater variance in modulation schemes, multiple frequency-ramp measurements, and cooperative radar are among the strategies being used to address this remaining challenge.
To date, the majority of applications for radar sensors have been in the area of warning systems such as blind-spot detection, lane-change assist, and forward collision warning, which use radar as an extension of the driver’s eyes. They inform the driver of risk but do not initiate action. Active safety systems utilize sensor data to detect and classify environmental information and then react by actuating other vehicle systems, including brakes, engine, or steering.
Not surprisingly, the first radar-based system deployed in autos, ACC, was also the first to transition from passive to active functionality. Today, new functions like automatic emergency braking (AEB) are being added to a wide range of vehicle models. In 2016, 20 manufacturers committed to make AEB a standard feature on new cars no later than September 2022.
In the future, a network of car sensors will enable new ADAS features that let you drive more safely and comfortably, acting invisibly in the background to make you a "better driver," too. As illustrated in the table below, a long list of ADAS features is moving to active status, and the majority of the listed functions are supported by radar.
Many ADAS features evolving from warning to active safety systems are supported by radar. (Source: Infineon)
Clearly, the use of sensors to drive active safety and automated functions raises the safety design requirements for every part of the system. Best practices related to functional safety call for "specific techniques such as redundancy, diversity, and internal self-test to increase the product robustness against random and systematic failures."
The concept of diversity is well illustrated in areas such as lane-change assist, in which a combination of camera and radar systems creates a more resilient sensor stream. Pairing cameras and radar sensors is appropriate for many applications, such as automatic emergency braking today (and automatic emergency steering in the future). The camera augments the driver's vision but shares the limitations of human sight, while radar complements the human sensory suite. The more data the Electronic Control Unit (ECU) receives, and the higher its quality, the better the decisions it can make.
Design for diversity and redundancy also leads to the next major step in the adoption of sensor technology: active safety systems becoming ready for automated/autonomous functions. The performance increase required to understand and handle complex situations is enabled by sensor fusion, the real-time analysis of multiple sensor streams to accurately assess and react to the surrounding environment (see below: ADAS board).
Integrated radar and forward-looking camera board. (Image: Autoliv)
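As a toy illustration only (not any OEM's fusion architecture), the core idea can be sketched as associating radar tracks, which carry range and velocity, with camera detections, which carry classification, so the ECU works with a single, richer object list. All class names, fields, and thresholds here are hypothetical:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class RadarTrack:           # kinematics measured by the radar sensor
    range_m: float
    velocity_mps: float
    azimuth_deg: float

@dataclass
class CameraDetection:      # classification provided by the camera sensor
    azimuth_deg: float
    label: str              # e.g., "pedestrian", "vehicle"
    confidence: float

def fuse(radar: List[RadarTrack],
         camera: List[CameraDetection],
         max_azimuth_gap_deg: float = 2.0) -> List[dict]:
    """Associate each radar track with the nearest camera detection in azimuth,
    producing objects that combine radar kinematics with camera labels."""
    fused = []
    for track in radar:
        best: Optional[CameraDetection] = min(
            camera,
            key=lambda det: abs(det.azimuth_deg - track.azimuth_deg),
            default=None)
        if best and abs(best.azimuth_deg - track.azimuth_deg) <= max_azimuth_gap_deg:
            fused.append({"range_m": track.range_m,
                          "velocity_mps": track.velocity_mps,
                          "label": best.label,
                          "confidence": best.confidence})
    return fused
```

Real systems associate in full 3D with tracking filters and timestamp alignment; the point of the sketch is only that each sensor contributes the attribute the other lacks.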
While the processing task remains similar, the expectation is that the network topology and the volume of data exchanged will differ significantly from today's decentralized approaches. In fact, it will be a "revolution under the hood," and because there is no cookbook for fusion architectures today, every OEM is pursuing a different path to meet the challenge. But it is clear what next-generation system needs mean for radar sensors: more channels/antennas, higher bandwidth, lower power consumption, and high-speed network interfaces.
It’s often said that changes in the automotive industry in this decade are greater than those seen since the era of mass manufacturing began a century ago. When you consider that robust presence detection makes possible the machine intelligence that is ultimately the basis for autonomous vehicles, it certainly appears that “seeing is believing.”
— The above contributed piece was co-authored by two engineers at Infineon Technologies North America: Jeff Kelley, product marketing, safety & security, and Tom LeMense, applications engineering, safety & security.