It would be interesting to understand what the Google car's algorithms would do if the car prompted the driver to take control and he/she did not.
Exactly. That's the crux of the issue. And that's the question Mr. Safety at Google did not want to give a direct answer to. His response, as I wrote in the story, was that we simply don't know how drivers react to such a situation. Therefore, we don't know how a self-driving car should respond to it.
Wireless links are an inherent risk at present because virtually all of them use low-power processors, such as the ARM7, that lack an MMU and the other security features needed to make them resistant to hacking. Even if a processor has an MMU, as the Intel i7 series does, every bit of code in the wireless link must be written to use it (built into all the applications and the RTOS/OS). Hardware-based security is even more complex and design-intensive to implement; however, given the huge number of these chips sold each year, a secure design would be a winner in the market.
@Tiger Joe. The government owns the roads/freeways today and it's not a huge problem.
I meant to convey that the government would be responsible for stopping a vehicle before it leaves the controlled road section if the driver does not respond to the prompt to take back manual control (since I'd suggest the government should own the automation, computer-control, and sensor infrastructure for autonomous driving).
It would be interesting to understand what the Google car's algorithms would do if the car prompted the driver to take control and he/she did not. Since it does not prompt the driver until after some event has triggered the need to change status, and one assumes that the changeover would take several seconds at least, at 60 mph it would seem unworkable.
...Is the prompt because a sensor failed? ...or is it outside its range of response capability? ...Would it simply stop? ...or slow down? ...how suddenly? ...would it try to clear the traffic lanes to stop?
I totally agree with you that implementation should begin with freeways and controlled road sections. This reduces the problem from its current "boil the ocean" size. If we can't make that work, then we are in trouble.
"If the infrastructure prompts the driver to take manual control again and they cannot (asleep, drunk, ill, or simply inattentive), then the system moves the car off the highway and parks. This puts the onus on the government, I know, but I think that is where the responsibility for control should rest."
That sends a strong message. There's no freaking way I'm going to hand over control of my vehicle to the government. Can you imagine? Some 'incident' gets some bureaucrat to push a button, bringing all traffic on the freeway to a standstill, and we're stuck in gridlock until the situation is resolved. Given the two-week shutdown we just got out of, I don't trust any quick resolution in the matter.
IMHO, the whole idea of self-driving vehicles is a 'walk before you run' problem. If we can't solve the problem even when use is confined to limited-access highways, forget it. By 'solve' I mean graduating to common consumer acceptance: we would hand over control of our car as soon as we got onto the freeway, as readily as we do when stepping onto a bus, train, or plane, and we would take that control back in our hands when reaching the off-ramp.
And the problem of driving on city streets, with signs to read, obstacles in traffic, lights, and more, is intractable by comparison.
Whilst actually driving, many drivers today seem willing to make telephone calls, send or receive texts, put on make-up and so on.
With self driving cars, they'll be finishing that report, having a meeting over a video link, having breakfast, or maybe a motor-home driver in the toilet. It'll happen (well, I guess the latter might not if it causes the vehicle to pull off the road because the driver has 'vanished').
I always thought that a self-driving car would be great: you could just relax and read the paper on the way to work. I guess that is never going to happen unless you ride the train or bus. When driving with cruise control, you still have to steer and watch for obstacles; if the car did those things for you, you would stop paying attention. It reminds me of the story of the Saudi prince who came to Texas and rented an RV. On the interstate in barren west Texas, he put it on cruise control and went in the back to get a beer. The RV rolled 8 times before coming to a stop.
@sheetal, Google has not announced a launch date, let alone a launch location. That said, many leading carmakers have publicly talked about their goals of rolling out their first "autonomous cars" by 2020. While I take that as more of an aspirational goal, some told us that self-driving cars may become a reality sooner than we think.