LAS VEGAS — Nvidia came to the Consumer Electronics Show to persuade the world that a company that began life 25 years ago as a graphics chip vendor has grown into the world leader in “autonomous machine processors.”
During Sunday’s keynote sales pitch, Nvidia CEO Jensen Huang boasted that Nvidia today, backed by its GPU technology, is well positioned to go after the fastest-growing technology segments, including gaming ($100 billion), artificial intelligence ($3 trillion) and the transportation industry ($10 trillion).
Nvidia’s CEO Jensen Huang on stage Sunday at CES (photo: EE Times)
In particular, Huang sees his company’s latest Xavier processor, which came back from the foundry two weeks ago, as Nvidia’s ace in the hole. He calls it the “AI supercomputer for future autonomous transportation.”
Nvidia is now styling itself as the best friend, or, more importantly, the crucial enabler, for the emerging streetscape of robo-taxis. IHS Markit forecasts that fully automated vehicles for the fleet business will appear as early as 2019.
The biggest news unveiled Sunday at Nvidia’s press conference was a deal with Uber to power its ride-hailing fleets.
Nvidia CEO talks of Uber deal (photo: EE Times)
A consensus is coalescing around a vision of robo-taxis as the first market where fully automated vehicles will be applied. If this proves true, Nvidia’s agreement with Uber puts the chip company at the forefront of the automated vehicle race as a supplier of AV brains.
Danny Shapiro, Nvidia's senior director of automotive, told the media that the Xavier processor, with more than 9 billion transistors, originally announced in the fall of 2016, is already up and running. “We believe Xavier is at least two years ahead” of processors offered by competitors, he said.
The backbone of Nvidia’s AV strategy isn’t any one specific SoC, however. It is, rather, the company’s Drive PX platform, designed for the development of autonomous vehicles. The platform combines deep learning, sensor fusion and surround vision. An AV software stack built on the Drive PX platform can understand in real time what's happening around the vehicle, precisely locate itself on an HD map and plan a safe path forward.
This vertically integrated, “sensor-agnostic” AV strategy presented by Nvidia’s Drive PX platform is ideal for companies targeting Level 4/Level 5 robo-taxis today, observed Phil Magney, founder and principal of VSI Labs. Robo-taxi companies “will rely on an expensive fleet made of cars loaded with all the assets including lidar-based HD maps,” he said.
Indeed, Uber takes advantage of Nvidia's flexible Drive PX platform, which allows data from a number of sensors to be fused. Uber is using a host of different sensor technologies, shown in the picture above. The company reportedly uses neural networks to help with perception, turning raw sensor data into object data. Some of this is done by "fusing" multiple sensors; some is not. Uber is also said to use other neural networks at later stages, such as predicting what the car should do next.
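The pipeline described above — per-sensor perception, fusion of the resulting object data, then a later decision stage — can be sketched in miniature. This is a hypothetical illustration only: the class names, thresholds, and merge rules below are invented for clarity and have nothing to do with Uber's or Nvidia's actual software.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g. "camera" or "lidar" (hypothetical labels)
    label: str         # object class guessed by a per-sensor network
    distance_m: float  # estimated range to the object
    confidence: float  # perception confidence, 0..1

def fuse(detections):
    """Merge per-sensor detections of the same object class into one
    object record, preferring the lidar range estimate and keeping the
    highest confidence seen (an invented, simplified fusion rule)."""
    fused = {}
    for d in detections:
        obj = fused.setdefault(d.label, {"distance_m": d.distance_m,
                                         "confidence": d.confidence})
        if d.sensor == "lidar":
            obj["distance_m"] = d.distance_m
        obj["confidence"] = max(obj["confidence"], d.confidence)
    return fused

def plan(fused_objects):
    """Toy stand-in for the later decision stage: brake if any fused
    object is within 10 meters, otherwise keep cruising."""
    if any(o["distance_m"] < 10.0 for o in fused_objects.values()):
        return "brake"
    return "cruise"

detections = [
    Detection("camera", "pedestrian", 12.0, 0.7),
    Detection("lidar",  "pedestrian",  9.4, 0.9),
    Detection("camera", "car",        30.0, 0.8),
]
objects = fuse(detections)
print(plan(objects))  # pedestrian fused at 9.4 m, so this prints "brake"
```

The point of the sketch is the separation of stages: each sensor's network emits object hypotheses independently, fusion reconciles them into a single world model, and only then does a planning stage act on the fused result.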
Next page: Growing ecosystem