Automakers are jumping at the prospect of delivering affordable autonomous mobility -- many have already pledged self-driving vehicles by 2020.
Understandably, the first self-driving cars will work more like the autopilot in an airplane than the sleek Audi from the film I, Robot. You have probably seen pictures of self-driving car prototypes awkwardly sporting equipment that looks like it belongs on the International Space Station.
One of the greatest hurdles to clear before these cars reach the road is not whether we can build the technology to drive a car from point A to point B, but whether we can strike a balance in extending human driving abilities into the world of artificial intelligence.
Full autonomy still requires constant supervision: no system can yet match a human driver's ability to respond to the unexpected, and the last thing we want is to leave a confused car in control. There need to be mechanisms that let the car know when, and how much, control to hand back to the driver based on our physical and emotional states.
The car determines your concentration level
This presents us with a two-step equation to solve. The first part involves the car gathering data on the driver's status so that it can determine concentration, attention, and emotional state -- and, from that, whether the driver is fit to take over control under varying conditions.
Cars will need to be outfitted with a network of sensors -- seatbelts that monitor heart rates and breathing, steering wheels that measure skin temperature, eye movement tracking that senses gaze, pupil dilation, and even spatial orientation of the driver’s head. Voice recognition software will detect your mood as you interact with the car’s infotainment and navigation systems.
These biometric readings will be used to calculate the emotional state of the driver -- are you hysterical, angry, or sleepy? Drowsy driving causes more than 100,000 crashes every year, according to the National Highway Traffic Safety Administration.
Once the physical and emotional state of the driver has been determined by the car, it will engage an adaptive automation system allowing the car to activate or disengage Advanced Driver Assistance Systems features depending on the level of distraction or attentiveness of the driver.
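The adaptive automation step described above amounts to mapping the driver-state estimate onto a set of active assistance features. A minimal sketch, assuming a 0..1 drowsiness score and made-up feature names and thresholds:

```python
def adaptive_automation(drowsiness: float) -> set[str]:
    """Choose which ADAS features stay engaged for a given driver state.

    Feature names and thresholds are illustrative assumptions.
    """
    # Baseline assistance that is always on.
    features = {"lane_keeping", "adaptive_cruise"}
    if drowsiness > 0.3:
        # Driver attention is degraded: tighten the safety net.
        features.add("emergency_braking")
    if drowsiness > 0.7:
        # Driver may be unfit to take over at all.
        features.add("pull_over_assist")
    return features
```

The point of the sketch is the direction of the coupling: as the driver's measured attentiveness falls, the car engages more automation rather than less.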
The car wants your attention back
This brings us to the second part of this equation -- how to safely and effectively gain the driver’s attention to inform them that human control is required in a particular situation. The three most relevant human senses that will need to be engaged are sight, hearing, and touch.
This can be accomplished through lights and signals, sounds and vibrations -- but more important is the combination and intensity with which these signals are deployed, based on the individual driver and the immediacy of the situation.
A driver who likes to sleep in their self-driving car is going to need more of a jolt than someone sipping coffee and catching up on email. Similarly, entering a freeway warrants little more than an FYI, whereas the need to swerve requires an immediate and unmistakable alarm to ensure a timely maneuver.
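The escalation logic above -- sight, hearing, and touch combined according to urgency and driver state -- can be sketched as a small lookup. The urgency levels and channel names are hypothetical, invented for this example:

```python
def choose_alert(urgency: str, driver_asleep: bool) -> list[str]:
    """Pick the alert channels for a handover request.

    urgency: "info" (e.g. merging onto a freeway), "warning",
             or "critical" (e.g. an imminent swerve).
    """
    if urgency == "info":
        channels = ["dashboard_light"]                       # sight only
    elif urgency == "warning":
        channels = ["dashboard_light", "chime"]              # sight + hearing
    else:
        channels = ["dashboard_light", "alarm", "seat_vibration"]
    # A sleeping driver will miss lights and may miss sounds,
    # so always add a tactile channel for them.
    if driver_asleep and "seat_vibration" not in channels:
        channels.append("seat_vibration")
    return channels
```

Even this toy version shows why one-size-fits-all alerting fails: the same "merge ahead" event produces a quiet light for an attentive driver and a seat buzz for a sleeping one.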
Of course, because all drivers are different and each driving situation is unique, the car will need to learn when and how to notify each driver that it needs them to take control. Sights and sounds that are too flashy or loud can startle, causing hasty reactions and errors in judgment. It won't be a one-size-fits-all solution; no one wants to get blasted by sirens every time they enter a construction zone.
As self-driving cars begin to merge into the mainstream automotive industry, we will see as many sensors inside the car as outside. After all, the most important piece in the autonomous vehicle puzzle is us... the driver.
What do you think about your car watching your every move?
-- Davide Santo is ADAS Microcontroller Product Line Manager at Freescale.