Not everyone believes that a robo-shuttle/truck crash in Las Vegas was just a minor glitch. Did those involved in the operation and design of the AV shuttle carry out enough of an in-depth risk assessment of the area?
A delivery truck slowly backed up. An automated shuttle, right behind it, patiently waited to get hit by the truck.
Now, tell me. What’s wrong with this picture? Why did the self-driving shuttle just freeze like a deer in the taillights? Why didn’t it lean on the horn to warn the truck driver?
The aforementioned fender-bender took place in Las Vegas on Nov. 8th. A pod-like 8-seat automated shuttle, designed by French company Navya, collided with a human-driven truck. Inconveniently, the mishap occurred within a few hours of the ceremony that launched the new shuttle service, sponsored and operated by AAA, the Regional Transportation Commission of Southern Nevada, and French private transportation company Keolis.
Las Vegas police issued a misdemeanor citation to the truck driver for unsafe backing. The shuttle suffered a crumpled front fender. The automated shuttle bravely resumed service a day after the accident.
Not so fast. While the city of Las Vegas downplayed the crash, four investigators from the U.S. National Transportation Safety Board (NTSB) arrived anyway, on Friday, Nov. 10th.
The federal agency wants to know more about “how self-driving vehicles interact with their environment and the other human-driven vehicles around them,” according to NTSB. “While there have been other crashes of self-driving vehicles, this crash is the first of a self-driving vehicle operating in public service,” the agency said in a statement.
While the NTSB has yet to issue its report, eyewitness accounts have already gotten experts making their own deductions.
Meet Carlos Holguin, CEO of AutoKab.
Holguin is no ordinary AV startup CEO. He’s an expert on safety and urban integration for highly automated vehicles. He was a member of the French team that developed the world’s first automated pilot services on public streets in 2011, and then went on to define safety guidelines for the second, third, and fourth pilot programs.
After the Navya shuttle’s accident in Las Vegas, Holguin was the first to point out: “Everyone blames the human (driver). But is he really guilty?”
While acknowledging human fault in this case, Holguin concluded, “The fault of the truck (human) driver is relative, as we think everything that could be done to prevent such accident was likely not done, and the shuttle’s system designers were also (at least partly) at fault.”
Getting the AV off -- scot-free?
In press coverage of the crash, Las Vegas is thus far winning a public relations battle. In its own blogpost, the city veritably gloated over the automated shuttle’s official innocence and successfully promoted the following narrative:
The [automated] shuttle did what it was supposed to do, in that it’s sensors registered the truck and the shuttle stopped to avoid the accident.
Unfortunately, the delivery truck did not stop and grazed the front fender of the shuttle. Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided.
Word for word, the city’s PR blogpost ended up in every media outlet’s first report on the accident. This fostered a public perception that the Vegas crash was a minor glitch.
This brushoff feeds into a commonly accepted belief that every minor accident is just a part of the learning curve that society must accept as we progress toward the ultimate road safety – on roads purged of the human drivers who are not always fully aware of their surroundings.
I happened to interview Holguin in Versailles, France, on the same day the robo-shuttle service was launched in Las Vegas, but several hours before the accident.
As we exchanged emails after the crash, I learned that for someone like Holguin -- with years of AV experience on public streets -- the crash in Las Vegas was no minor glitch. It was important and, for Holguin, inexcusable.
How should the automated shuttle have been designed? Was there anything the AV could have done?
Of course, the shuttle stopped. It was programmed to do so. But the bigger challenge, as I see it, is that the shuttle wasn’t programmed to anticipate the truck driver’s next move. The robocar wasn’t thinking on its feet… well, tires.
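Holguin’s point can be caricatured in a few lines of code. What follows is a purely illustrative sketch -- none of these function names or behaviors come from Navya’s actual software -- of the gap between a vehicle that only detects and stops, and one that also warns the other driver or yields space:

```python
# Hypothetical sketch, NOT Navya's software: contrasting a passive
# "detect and stop" policy with a more anticipatory one.

def react_passive(obstacle_closing: bool) -> list:
    """Roughly what the Las Vegas shuttle reportedly did: stop and wait."""
    return ["stop"] if obstacle_closing else []

def react_defensive(obstacle_closing: bool, clear_behind: bool) -> list:
    """What a more anticipatory design might do (illustrative only):
    stop, sound the horn to alert the human driver, and back away
    if the space behind the shuttle is clear."""
    actions = []
    if obstacle_closing:
        actions.append("stop")
        actions.append("sound_horn")          # warn the backing truck's driver
        if clear_behind:
            actions.append("reverse_slowly")  # open up a gap instead of waiting
    return actions

print(react_passive(True))          # ['stop']
print(react_defensive(True, True))  # ['stop', 'sound_horn', 'reverse_slowly']
```

The difference is not sensing -- both policies “see” the truck -- but what the system is designed to do with that information.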
More importantly, what was missing in the shuttle operator’s plan for the AV pilot program?