MADISON, Wis. – Almost a year ago Elon Musk famously proclaimed: “I really consider autonomous driving a solved problem.”
Given all the advances in artificial intelligence and a rash of announcements about business and technology firms partnering to develop robo-cars, the self-driving promise seems self-evident.
Tech companies and carmakers are sticking to a self-imposed deadline to roll out their first Level 4/Level 5 autonomous cars sometime between 2019 and 2021. Nobody is publicly backpedaling — at least not yet.
The business and investment community understands — and encourages — these business aspirations for autonomous vehicles.
Under the hood, though, the engineering community is staring at multiple problems for which they don’t yet have technological solutions.
At the recent Massachusetts Institute of Technology-hosted event called the “Brains, Minds and Machines Seminar” series, Amnon Shashua, co-founder and CTO of Mobileye, spoke bluntly: “When people are talking about autonomous cars being just around the corner, they don’t know what they are talking about.”
But Shashua is no pessimist. As a business executive, Shashua said, “We are not waiting for scientific revolution, which could take 50 years. We are only waiting for technological revolution.”
Open questions
Given these parameters, what open questions still need a technological revolution to be answered?
Consumers have already seen pod cars scooting around Mountain View, Calif. An Uber car — in autonomous driving mode — recently collided with a left-turning SUV driven by a human in Arizona.
It’s time to separate the “science-project” (as Shashua calls it) robotic car — doing a YouTube demo on a quiet street — from the commercially viable autonomous vehicle that carmakers need but don’t have.
As EE Times listened to Mobileye's CTO, as well as several scholars, numerous industry analysts and an entrepreneur working on "perception" in robo-cars, the list of "open issues" hobbling the autonomous vehicle industry grew longer.
Some issues are closely related, but in broad strokes, we can squeeze them into five bins:
1) Autonomous cars' driving behavior (negotiating in dense traffic)
2) More specific and deeper "reinforcement learning" for edge cases
3) Testing and validation (can we verify safety on AI-driven cars?)
4) Security and anti-tampering (preventing a driverless car from getting hacked)
5) The more philosophical but important question of "how good is good enough" (because autonomous cars won't be perfect)
They can't see the road markings when covered in snow. The sensors get covered in ice. Oh no, road construction and cones? Can't deal with it. Oh, the county didn't stripe the lanes right? Can't deal with it. Oh look, a pothole? Can't detect it. Pow.
Another issue is legal liability.
When a defective part kills you, will you be able to sue the manufacturer into bankruptcy? They can't even design an airbag or ignition switch without killing someone. Think about the complexity of autonomy compared to an ignition switch. Lawyers will love this, because the manufacturer is now responsible for the safe operation of the vehicle!
I read a book - Version Control by Dexter Palmer - which is about a "causality violation device" (aka Time Machine) which has a central plot point based on an autonomous car accident, including a pretty good (and scary) explanation for why the accident happens. I'll leave the details out as I don't want to spoil it for anyone who hasn't read it yet. If you are into science fiction then I highly recommend it.
I would not rely only on deep "reinforcement learning" to get a well-behaved car. If deep learning is fine for managing complex situations but cannot guarantee that corner cases will be safe, add a guard built from programmed rules. These rules would simply say, "Don't do that."
BTW, this is how we humans learn to manage everyday tasks, and it is how I taught my kids to behave. Even as babies, they were almost free to roam the house, but when they started something dangerous (for themselves or for something fragile), my wife or I prevented it, using a specific tone of voice and removing them from the risky spot. Beyond that, it was up to them to find something interesting to do. It seems to have worked rather well (the younger kid is now 18, and I don't remember them breaking more stuff than I did).
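The guard idea in this comment can be sketched as an action filter: the learned policy proposes an action, and a layer of hand-written rules vetoes anything unsafe before it reaches the actuators. This is a minimal illustration, not how any real driving stack works; the state fields, the stand-in policy, and the thresholds are all hypothetical.

```python
# Minimal sketch of a rule-based safety guard wrapped around a learned policy.
# All names, state fields, and thresholds here are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class State:
    speed_mps: float        # current speed in m/s
    gap_ahead_m: float      # distance to the vehicle ahead, in meters
    lane_clear_left: bool   # is the left lane free to merge into?

@dataclass
class Action:
    accel_mps2: float       # requested acceleration
    lane_change: int        # -1 = left, 0 = stay, +1 = right

def learned_policy(state: State) -> Action:
    """Stand-in for a deep-RL policy: it always wants to speed up and pass."""
    return Action(accel_mps2=2.0, lane_change=-1)

def safety_guard(state: State, proposed: Action) -> Action:
    """Hard rules that simply say "don't do that", overriding the policy."""
    action = proposed
    # Rule 1: never accelerate when the headway is under 2 seconds.
    if state.gap_ahead_m < 2.0 * state.speed_mps and action.accel_mps2 > 0:
        action = Action(accel_mps2=-1.0, lane_change=action.lane_change)
    # Rule 2: never change into an occupied lane.
    if action.lane_change == -1 and not state.lane_clear_left:
        action = Action(accel_mps2=action.accel_mps2, lane_change=0)
    return action

# Tailgating at 20 m/s with the left lane blocked: both rules fire,
# so the guard brakes and cancels the lane change.
state = State(speed_mps=20.0, gap_ahead_m=15.0, lane_clear_left=False)
safe = safety_guard(state, learned_policy(state))
print(safe)
```

The point of the pattern is that the rules are auditable: each one can be reviewed and tested on its own, independently of whatever the opaque learned policy does.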
CaaS requires a certain level of utilization to be economically viable, and in rural areas high utilization is harder to achieve. Car sharing and CaaS with larger vehicles (minibuses and up) can grab considerable share, though. In locations where a larger percentage of the population lives in rural areas, income is usually much lower, and such locations are far more cost sensitive. Car sharing, minibuses and used cars are the likely winners there; the market for used cars will collapse once CaaS starts to take over in urban areas. The domino effect on new-car sales and ASPs will be quite interesting, and something that folks have yet to consider.
However, a dirt road is not really off-road and the robot can learn any such roads. If you consider agricultural robots, maybe off-road capabilities will arrive sooner than expected.
In urban areas there is no contest: consumers can save a lot of money with CaaS, and the CaaS provider can offer different types of vehicles that fit any and all needs. This area offers opportunities for innovation and should be interesting.
In America, we already have a driving culture that sees no value in building driver skills. We've added texting to many drivers' regular routine, and we are seeing the results in a reversal of the fatality rate, which is now going up.
Except for a driverless in-city car, autonomous vehicles will have to disengage and hand off the driving to a human, and this will happen in the worst situations, perhaps a blinding rain or even a snowstorm. So the computer hands off control at the worst possible time to a human with poor skills who, at best, hasn't been paying attention, may be reading, or is even asleep. And without the need to actually drive most of the time, that driver's poor skills will deteriorate even further.
In an airliner, the pilots use autopilot most of the time, but they have intense training and are always in a position to take manual control whenever necessary.
@HankWalker: I am thinking of all my friends in farming, ranching, oil and gas, etc., who spend a lot of time off-road for work. About 15% of the U.S. population is rural, which I will use as a proxy for off-road.
Fair enough. But how many of them will want a fully autonomous off-road vehicle?
Most off-road vehicles spend most time on a highway getting to where they will go off-road. Autonomous mode will be appropriate for highway driving, with the human taking over in the off-road portion.
The value for humans in the autonomous vehicle market is "Get in, tell the car where you want to go, and sit back and enjoy the ride." A potential roadblock is humans who want to drive the car rather than allow the car to do it for them.
A big selling point for fully autonomous cars is potential safety. How many accidents that have resulted in serious injury or death have been caused by a driver in impaired condition behind a wheel? Autonomous driving can dramatically reduce that, as the impaired human likely won't be driving.
And when you are off-road, what does "telling the car where you want to go" mean? How do you specify the destination? It's not like a city street address.