It’s not easy to make a robocar see the road, read traffic signs, detect and classify objects, track its own speed and trajectory as well as those of other cars, and — more important — localize itself on a map so it knows exactly where it needs to be.
For highly automated vehicles to track their surroundings, they must rely on a lot of sensors, including cameras, radar, ultrasound, GPS antennas and lidar devices that measure distance using pulses of light.
Each sensor comes with its own weaknesses and strengths.
Figuring out how best to fill in each sensor’s inherent gaps comes first. The second, potentially more important step is to develop a strategy for combining disparate streams of data without losing critical information. Each sensor spitting out data at its own frame rate is problematic enough. Sensor fusion gets even more complicated because some sensors offer raw data, while others provide their own answers as object data.
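One piece of that fusion problem — sensors running at different frame rates — comes down to resampling streams onto a common timeline. Below is a minimal sketch (all names, rates, and values are hypothetical, not from any production stack) that linearly interpolates a slower radar stream onto a camera’s frame timestamps:

```python
from bisect import bisect_left

def interpolate_reading(timestamps, values, t):
    """Linearly interpolate a sensor stream (sorted timestamps) at time t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]           # before first sample: hold first value
    if i == len(timestamps):
        return values[-1]          # after last sample: hold last value
    t0, t1 = timestamps[i - 1], timestamps[i]
    w = (t - t0) / (t1 - t0)
    return values[i - 1] * (1 - w) + values[i] * w

# Camera frames at ~30 Hz, radar returns at 20 Hz:
# resample the radar range onto the camera timestamps.
camera_ts = [0.000, 0.033, 0.067, 0.100]        # seconds
radar_ts = [0.000, 0.050, 0.100]                # seconds
radar_range_m = [42.0, 41.5, 41.0]              # range to lead vehicle (m)

fused = [(t, interpolate_reading(radar_ts, radar_range_m, t))
         for t in camera_ts]
```

Real systems go further — hardware timestamping, motion compensation, and Kalman-style filtering — but time alignment of this kind is the prerequisite for any of it.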
In 2017, we saw a host of advancements in perception technologies. “Perception is a major domain within the AV stack and there is so much innovation going on here,” said Phil Magney, founder and principal of VSI Labs.
Tech companies, tier ones and OEMs have been busy snatching up sensor technologies they don’t have or they can’t develop on their own. Meanwhile, a host of perception-sensor startups have emerged over the last two years alone, with many of them eyeing an autonomous car market that’s still embryonic.
Intel buys Mobileye
The biggest automotive industry deal in 2017 was Intel’s $15.3 billion acquisition of Mobileye.
Considering the clear lead Mobileye already held in automotive vision for ADAS and autonomous cars, the acquisition firmly put Intel in pole position in the robocar race.
Because vision is the only sensor technology indispensable to robocars, the deal was particularly significant. Intel says it’s combining Mobileye’s “computer vision, sensing, fusion, mapping, and driving policy” with Intel’s “open compute platforms.”
Describing cameras as the “must-have sensor,” Magney explained that the ability to capture images in high resolution makes cameras better at classifying objects. Cameras also add color. Their weakness? Cameras “give up depth over lidar,” Magney added.
Lidar: ‘the hottest place to be’
Among all sensor technologies, lidar is the market that saw the most business transactions in 2017. For example, Ford bought Princeton Lightwave, General Motors acquired lidar company Strobe Inc., and Continental got the lidar business from Advanced Scientific Concepts (ASC) last year, explained Akhilesh Kona, senior analyst for automotive electronics and semiconductors at IHS Markit.
VSI Labs’ Magney called lidar “still the hottest place to be.” This is partly because there are so many uses for lidar in automated driving, he explained. “Highly automated vehicles require a base map with localization assets and nothing displaces Lidar for this,” he said. “This is where the high-end units compete.”
The hot lidar market can also be traced to the emergence of new laser technology. According to IHS Markit’s Kona, the industry sees on the horizon a new laser emitter technology — above 1,400-nm wavelength. This new wavelength promises higher resolution and longer range in lidar. Princeton Lightwave, Continental (through its acquisition of ASC), and Luminar Technologies are all working on the new lasers, he added.
Meanwhile, suppliers continue to improve the durability, size, and cost of lidars by developing various beam-steering technologies, said Kona. They range from mechanical to MEMS and solid-state.
Mechanical lidars, such as the Velodyne 128-channel device, are perfect for mapping, because they can produce a 360-degree point cloud, according to Magney.
But for deployment in production cars, lidars based on a solid-state device — whether MEMS or OPA (optical phased array) — work well, said Magney. They can also produce a point cloud within their field of view.
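The point cloud itself is simple geometry: each laser return is an angle pair plus a measured range, which converts to Cartesian coordinates. A rough sketch, with hypothetical channel elevations and a constant range chosen purely for illustration:

```python
import math

def polar_to_cartesian(azimuth_deg, elevation_deg, range_m):
    """Convert one lidar return (angles in degrees, range in meters) to (x, y, z)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# One full mechanical revolution: a few channels (fixed elevation angles)
# sweeping 360 degrees of azimuth in 1-degree steps.
elevations_deg = [-10.0, 0.0, 10.0]        # hypothetical channel angles
point_cloud = [
    polar_to_cartesian(az, el, 25.0)       # constant 25 m range for illustration
    for az in range(0, 360)
    for el in elevations_deg
]
```

A mechanical unit sweeps the full 360 degrees of azimuth this way; a solid-state device does the same conversion but over a limited azimuth window — its field of view.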
Lower-cost flash devices are also emerging. Some, costing well under $100, “are designed to be proximity detectors,” said Magney. The drawback is limited resolution that makes it impossible to classify an object, he explained.