MADISON, Wis. -- Parsing out exactly what Google is accomplishing with its self-driving car project isn’t easy. The world waits as dribs and drabs of information trickle out occasionally through Google’s blog.
The latest leak, earlier this week, was a blog post by Chris Urmson, director of Google’s self-driving car project. It offered a glimpse of how far the self-driving car has come as it takes driving lessons on the streets of Mountain View, Calif.
The video clip posted on Urmson’s blog also gives a sense of what the self-driving car’s machine vision is actually seeing as it tools along.
But what exactly have we learned? More important, what challenges are still ahead for Google (and the automotive industry as a whole) to move the self-driving car from an R&D project to a real product? We talked to a few industry analysts.
What computer vision sees
One thing that Urmson’s post makes very clear is Google’s ambition. It hopes to take its autonomous cars onto every street in every city, over every terrain. Clearly, Google is eager to debunk the conventional assumption that autonomous cars will most likely be deployed only on freeways.
Some experts in the industry have speculated that self-driving cars won’t be used for driving regular streets for a long time, since surface streets -- often plagued with unexpected events -- would be too tough for self-driving cars to handle, especially without vehicle-to-vehicle and/or vehicle-to-infrastructure help. The video clip shows otherwise.
Sure, Mountain View is no Mumbai. Streets in Mountain View are rather “antiseptic,” as Roger Lanctot, associate director at Strategy Analytics, describes them, compared with any city in India or China, where throngs of pedestrians mill constantly among all types of vehicles -- pushcarts, scooters, electric bicycles, you name it.
Still, what’s most impressive about the video clip is how neatly the Google Car’s computer vision organizes what it sees on the street into separate, independent boxes, explains Egil Juliussen, director of research for Infotainment & ADAS at IHS Automotive. “Notice how neatly all the cars, bikes, and pedestrians are marked on the map the computers see, based on all the sensors,” he says. In his view, the car’s computer eye sees objects on the street in a far more orderly fashion than probably “90% of drivers see while driving.”
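To make the idea concrete, here is a minimal sketch of how a perception stack might represent and group the labeled boxes seen in the video. The `Detection` class, its fields, and the category names are illustrative assumptions for this article, not Google’s actual data model.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object the vision system has boxed (fields are assumptions)."""
    label: str     # e.g. "car", "bike", "pedestrian"
    x: float       # object center, meters ahead of the vehicle
    y: float       # object center, meters left (+) / right (-)
    width: float   # box width in meters
    length: float  # box length in meters

def group_by_label(detections):
    """Organize raw detections into per-category lists, the way the
    on-screen map separates cars, bikes, and pedestrians."""
    groups = {}
    for d in detections:
        groups.setdefault(d.label, []).append(d)
    return groups

# A simulated street scene with four boxed objects.
scene = [
    Detection("car", 12.0, -1.5, 1.8, 4.5),
    Detection("bike", 6.0, 2.0, 0.6, 1.8),
    Detection("pedestrian", 4.0, 3.5, 0.5, 0.5),
    Detection("car", 25.0, 0.0, 1.8, 4.5),
]

groups = group_by_label(scene)
print({label: len(ds) for label, ds in groups.items()})
# prints {'car': 2, 'bike': 1, 'pedestrian': 1}
```

The point of the grouping is the orderliness Juliussen describes: every object on the street ends up in exactly one labeled category, with a position and extent attached.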
Strategy Analytics’ Lanctot observes, “You see the Google Car is going rather slowly -- but very cautiously.” It’s learning the subtleties in its path as it moves along. One important thing to remember, he says, is that the Google Car is “a self-contained vehicle.”
In other words, the car doesn’t depend on data in the cloud to drive. The Google self-driving car is driving and “drawing a real-time map” on its own, using the real-time data it has collected through its on-board sensors, including Velodyne’s lidar system on the rooftop, Lanctot explains.
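The “drawing a real-time map” idea can be sketched as building a local occupancy grid from range returns. Everything here is an assumption for illustration -- the 0.5 m cell size, the point format, and the simulated sweep have nothing to do with Velodyne’s actual interface or Google’s mapping pipeline.

```python
import math

CELL = 0.5  # grid resolution in meters (assumed for illustration)

def to_grid(points, cell=CELL):
    """Map (x, y) range returns, in meters relative to the car,
    onto the set of occupied grid cells."""
    occupied = set()
    for x, y in points:
        occupied.add((int(math.floor(x / cell)),
                      int(math.floor(y / cell))))
    return occupied

# One simulated sweep: a wall of returns 5 m ahead of the car,
# spanning 2 m to either side at 10 cm spacing.
sweep = [(5.0, y / 10.0) for y in range(-20, 21)]
grid = to_grid(sweep)
print(len(grid))  # prints 9 occupied cells
```

Rebuilding such a grid from scratch on every sensor sweep is what lets a self-contained vehicle react to what is actually there right now, rather than to a stale map downloaded from a server.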
@DrQuine, this is such an astute observation. How a Google Car perceives and understands other drivers' behaviors (or pedestrians' behaviors) and psychology is something that hasn't been discussed, and yet it is critical -- not just for road safety but, as you pointed out, for the efficiency of traffic.
Having made the mistake of driving in New York City in Easter pedestrian traffic, I'm compelled to ask: how does a Google car deal with a continuous flow of jaywalking pedestrians crossing a street despite a traffic light favoring the car? Human drivers typically advance slowly (and silently, because of the no-horn rule), signaling their intention to pass through the crowd of pedestrians until they have the light in their favor. This process seems to demand an understanding of human nature and psychology while also being diligent in ensuring that nobody gets hurt or scared. Sometimes it even hinges on looking each other in the eye (the hand gestures and shouting that sometimes accompany this negotiation are probably not reasonably emulated by an autonomous vehicle). If the autonomous vehicle cannot address this scenario, it may block traffic for hours until the pedestrians have dissipated.
Good point about credibility...I would love to see an independent assessment...also, waiting for 5G to come in 2020??? One has to admire a company willing to invest in technology that will be deployed 5+ years out! Rare...Kris
I think the clip shows something quite impressive that builds hope for autonomous vehicles in the not-too-distant future. However, I think it is important to keep in mind that this clip was made by Google's marketing department and is not an independent evaluation. The question is how many scary things this car does in a day of driving that Google is not showing.
I agree that the car does drive conservatively, and from what I saw people nonetheless gave it a lot of leeway. This is probably because they were gawking at it, but at some point people will get over the novelty and treat it like just another thing in their way. I suspect that the car will be much more accommodating of bad behavior than most drivers, which will encourage even more of that behavior. Potentially the sensors could be used to identify and report that behavior to the authorities; until then, I can imagine cab drivers, if no one else, hassling these cars mercilessly.
In other words, the car doesn't depend on so-called data in the cloud to drive.
I know the proponents prefer to pretend that this is true, because it makes their job appear more self-contained, but it's not so. The car needs to know enough about the road topology -- streets, railroad crossings, stop signs, etc. -- to be able to navigate to the destination the user wants. Plus, the car needs GPS.
It's indeed impressive how good the on-board sensors have become, but just as your own eyes and ears are NOT sufficient to drive safely, especially in congested areas or where you might have blind corners and the like, the same applies to any high tech on-board sensors.
The one reaction I had, watching the video, was: boy, would I HATE to get stuck behind a driver like that! The perfect formula for road rage. Of course, baby steps and all of that.
@Junko: the business and utility models for self-driving vehicles still leave a lot to be discussed with the proposed solution. This will have to happen in many city-hall-type discussions in the places where deployment is proposed.
I suspect Google may launch this as a service between its different Silicon Valley locations, for employees to move about. Citywide deployment would come next.
Level-4 type deployment may finally 'level' the field with many bad drivers around!
Should we expect a taxi / shuttle drivers' strike well before 2025?
@kuqiqogras, I do agree with your assessment of what computer vision can do that humans cannot.
I, too, was pretty amazed to find out what -- and how -- the Google Car's machine vision sees when driving surface roads, as the video clip shows.
As for the liability issue, it remains to be seen. We hear two sides of the argument. There are those who think insurance rates would go down if you drive a self-driving car, because it is fundamentally safer than human driving. Meanwhile, there are many who are worried about liability issues.
Hence one of the analysts' suggestions here: perhaps there is a need to recognize the driver of a self-driving car by issuing a separate driver's license.