TI’s ADAS SoCs, which combine CPUs, DSPs, and vision accelerators to deliver both programmability and processing power, reflect the “dynamic nature of ADAS,” noted Williams. While vision algorithms are constantly improving, ADAS RFQs from OEMs and tier-one suppliers are also changing rapidly.
Not done yet
Jeff Bier, president of BDTI, a digital signal processing consulting firm, and founder of the Embedded Vision Alliance, acknowledged that few chips available today are designed specifically for vision applications, let alone for automotive use.
Vision processing for ADAS remains a very hard problem to solve, said Bier, despite the many man-years spent developing a host of embedded vision algorithms. Describing his experience driving a car equipped with an aftermarket ADAS system, he cautioned, “This [ADAS] is not done.”
The aftermarket ADAS system he purchased, for example, “runs rock solid on highways,” said Bier. But once off the highway, where the car encounters random objects -- utility poles, pedestrians, and road signs -- under a variety of weather conditions, “it tends to sound off shrill false alarms,” he noted. “There is just an infinite variety of conditions ADAS needs to deal with, and they are hardly done.”
As TI’s ADAS SoCs illustrate, automotive vision processing demands “a lot of horsepower,” said Bier. The variety of cores integrated into TI’s SoCs is critical to achieving high performance while keeping power consumption low, but the downside, he added, is programming complexity.
TI claims that its ADAS-related Vision Software Development Kit (SDK) “enables customers to quickly and easily integrate multiple EVEs and DSP algorithms and then benchmark and partition them across multiple processing elements.”
Bier, however, suspects that things won’t be as easy as TI makes them sound. In pedestrian detection or lane-marking applications, for example, ADAS relies on vision algorithms such as contour and edge detection. Each algorithm involves a chain of kernels, sometimes as many as 15 to 20 distinct steps, explained Bier. Programmers must figure out which of those steps should run where -- on DSPs, CPUs, or EVE cores. “How to partition it” across the heterogeneous system architecture could become a big problem, he observed.
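To give a sense of the kind of multi-step kernel chain Bier describes, here is a minimal, illustrative sketch in Python using OpenCV -- not TI’s Vision SDK -- of a simplified edge-and-contour chain such as might feed a lane-marking detector. The image path, blur size, Canny thresholds, and area cutoff are hypothetical, and a production ADAS pipeline would run far more stages on dedicated hardware.

```python
# Illustrative only: a simplified edge/contour chain in OpenCV, not TI's Vision SDK.
# Each stage below corresponds to one "kernel" in the chain Bier describes; a real
# ADAS pipeline adds many more (denoising, perspective correction, temporal
# tracking, classification), each mapped to a CPU, DSP, or EVE core.
import cv2

def find_lane_candidates(image_path: str):
    frame = cv2.imread(image_path)                    # 1. acquire frame
    if frame is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)    # 2. color conversion
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)       # 3. noise suppression
    edges = cv2.Canny(blurred, 50, 150)               # 4. edge detection
    # 5. contour extraction (OpenCV 4.x return signature assumed)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # 6. crude area filter -- a stand-in for the classification and tracking
    #    stages a real lane-marking or pedestrian detector would need
    return [c for c in contours if cv2.contourArea(c) > 100.0]

if __name__ == "__main__":
    candidates = find_lane_candidates("road_frame.jpg")  # hypothetical input file
    print(f"{len(candidates)} candidate contours found")
```

Even this toy chain has half a dozen distinct stages; deciding which of them belong on a general-purpose CPU, a vector DSP, or a fixed-function vision accelerator is exactly the partitioning problem Bier points to.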
The upshot is that ADAS, done right and with improved technology, saves lives. “That would be really fantastic,” said Bier. TI cited statistics such as 1.2 million traffic deaths globally in 2012, and estimates that 93 percent of traffic accidents in the United States are due to human error.
Beyond that, he noted, “Once vision systems are placed in a car, all kinds of additional applications will be developed” that can address not only safety but also ride comfort. Examples include using a forward-scanning camera to anticipate road imperfections (as Mercedes-Benz demonstrated with its “Magic Body Control” feature), or detecting whether a child has been left in the car (and, if so, sounding an alarm).