MADISON, Wis. -- Parsing out what exactly Google is accomplishing with its self-driving car project isn’t easy. The world waits as dribs and drabs of information trickle out occasionally through Google’s teasing blog posts.
The latest leak, earlier this week, was a blog post by Chris Urmson, director of Google’s self-driving car project. It offered a glimpse of how far the self-driving car has come as it takes driving lessons on the streets of Mountain View, Calif.
The video clip posted on Urmson’s blog also gives a sense of what the self-driving car’s machine vision is actually seeing as it tools along.
But what exactly have we learned? More important, what challenges are still ahead for Google (and the automotive industry as a whole) to move the self-driving car from an R&D project to a real product? We talked to a few industry analysts.
What computer vision sees
One thing that Urmson’s post makes very clear is Google’s ambition. It hopes to take its autonomous cars down every street in every city, over every terrain. Clearly, Google is eager to debunk the conventional assumption that autonomous cars will most likely be deployed first on freeways.
Some industry experts have speculated that self-driving cars won’t be used on regular streets for a long time, since surface streets -- often plagued with unexpected events -- would be too tough for self-driving cars to handle, especially without vehicle-to-vehicle and/or vehicle-to-infrastructure help. The video clip suggests otherwise.
Sure, Mountain View is no Mumbai. Street scenes in Mountain View are rather “antiseptic,” as Roger Lanctot, associate director at Strategy Analytics, describes them, compared with any city in India or China, where pedestrian throngs mill constantly among all types of vehicles -- pushcarts, scooters, electric bicycles, you name it.
Still, the most impressive thing about the video clip is how neatly the Google car’s computer vision organizes what it sees on the street into separate, independent boxes, Egil Juliussen, director of research, Infotainment & ADAS, at IHS Automotive, explains to us. He says, “Notice how neatly all the cars, bikes and pedestrians are marked on the map the computers see based on all the sensors?” In his view, the car’s computer eye sees objects on the street in a much more orderly fashion than probably “90% of drivers see while driving.”
Strategy Analytics’ Lanctot observes, “You see the Google Car is going rather slowly -- but very cautiously.” It’s learning the subtleties in its path as it moves along. One important thing to remember, he says, is that the Google Car is “a self-contained vehicle.”
In other words, the car doesn’t depend on so-called cloud data to drive. The Google self-driving car is driving and “drawing a real-time map” on its own, using the real-time data it has collected through its on-board sensors, including Velodyne’s lidar system on the rooftop, Lanctot explains.