SAN FRANCISCO -- The processing power of new-generation microcontrollers may soon give automobiles the anthropomorphic appeal of HAL, the talking computer depicted in "2001: A Space Odyssey."
In addition to recognizing road signs and pedestrians on a country road at night, and dishing out entertainment and navigational aids, cars of the near future will be able to recognize spoken-language commands and talk with your kids. This was the consensus of automotive experts assembled at the International Solid-State Circuits Conference here this week.
Microprocessors accounted for the lion's share of the $25 billion spent on automotive semiconductors in 2006, said Yasusi Sinojima, general manager of the electronics engineering division of Toyota Motor Corp. Electronic control units (ECUs) are currently used for engine control, emission monitoring and control, safety and passenger compartment control. Increasingly, the telematics console is required not only to provide docking capability for iPods but also to serve as the launching pad for digital audio and TV.
In-dash navigation systems not only calculate the shortest routes, but also display local landmarks in 3D graphics. Inevitably, user interfaces will include natural language communication between the driver, passengers and the telematics console. But this elevates processing requirements for onboard controllers.
There was little argument among the panelists that this would happen; the only question is when, and how many processors it will take.
Sinojima said new safety concerns are elevating the MIPS [millions of instructions per second] rate for automotive processors. In China, for example, where automobile sales are soaring, traffic deaths are increasing exponentially. Sinojima cited a requirement for the automobile to recognize objects in front of it, and aid (or supersede) the judgment of the driver in stopping.
Infrared sensors can now detect the presence of pedestrians on a darkened road as far as 150 meters ahead. In new automotive systems, driver night vision could be enhanced by projecting the image of the distant pedestrian onto the windshield. A redundant visual image is provided by a CMOS image sensor that detects objects up to 60 meters ahead and triggers a fast response loop, automatically tightening the driver's seat belt in anticipation and applying the brakes.
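The two-sensor response loop described above can be sketched in a few lines. This is an illustrative sketch only, not Toyota's implementation; the function name, action labels and per-cycle structure are hypothetical, while the 150-meter and 60-meter ranges come from the article.

```python
# Illustrative sketch (hypothetical, not Toyota's implementation) of the
# fast-response loop described above: a far-infrared sensor flags
# pedestrians up to 150 m out for a windshield overlay, while a CMOS
# image sensor confirms objects inside 60 m and triggers seat-belt
# pretensioning and braking.

IR_RANGE_M = 150    # far detection limit of the infrared sensor
CMOS_RANGE_M = 60   # confirmation range of the CMOS image sensor

def respond(ir_detection_m, cmos_detection_m):
    """Return the actions for one iteration of the response loop.

    Each argument is the distance (meters) to a detected object,
    or None if that sensor sees nothing this cycle.
    """
    actions = []
    if ir_detection_m is not None and ir_detection_m <= IR_RANGE_M:
        # Distant pedestrian: enhance night vision on the windshield.
        actions.append("project_windshield_overlay")
    if cmos_detection_m is not None and cmos_detection_m <= CMOS_RANGE_M:
        # Redundant close-range confirmation: pretension belt, brake.
        actions.append("pretension_seat_belt")
        actions.append("apply_brakes")
    return actions
```

The point of the redundant CMOS path is that the close-range confirmation, not the long-range infrared cue alone, gates the irreversible actions (belt pretensioning and braking).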
What Toyota calls its "judgment microprocessor" is a 32-bit, 80-MHz device. Its 2005 luxury sedan had 70 to 80 processors, Sinojima said. To provide the required number of MIPS, the judgment processor will likely be a 64-bit, 400-MHz device by 2010.
Toru Baji of Renesas described "microbrains" installed in future cars for engine control and for communicating with passengers. He predicted a typical sedan will have about 45 ECUs by 2013, though the Lexus LS 460 currently carries more than 100 such devices.
Current versions of Renesas' SH-Navi telematics and audio console controller run at 400 MHz and provide 2D and 3D navigation. The SH-Navi may have enough crude speech recognition capability to support command-and-control applications. Newer versions of the device will feature 600- and 800-MHz clocks and dual and quad cores, and may also have enough processing power for natural speech recognition.
Visual pattern recognition requires an image processor with more than 10 giga-operations per second (GOPS) of capability, said panelist Augusto de Oliveira, vice president and senior fellow of NXP's computing architectures group. Video cameras can see 60 meters ahead, he said, aiding night vision, pedestrian detection, traffic sign recognition, road condition monitoring and blind-spot coverage. Multiple cameras in each car are possible, he forecast, totaling 90 million in use by 2020.
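The 10-GOPS figure is easy to sanity-check with back-of-envelope arithmetic. The frame size, frame rate and per-pixel operation count below are illustrative assumptions, not figures from the panel:

```python
# Rough throughput estimate for single-camera visual pattern recognition.
# Frame size, frame rate and ops/pixel are illustrative assumptions.
width, height = 640, 480   # one VGA camera frame
fps = 30                   # frames per second
ops_per_pixel = 1200       # filtering + feature extraction + matching

gops = width * height * fps * ops_per_pixel / 1e9
print(f"{gops:.1f} GOPS")  # about 11.1 GOPS for a single camera
```

Even modest assumptions land in the 10-GOPS range for one camera, which is why multi-camera configurations push the requirement well beyond general-purpose automotive controllers.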
Stephan Ohr is research director for analog semiconductors at Gartner Dataquest Research.