Fender bender in Arizona illustrates Waymo’s commercialization challenge
A police report obtained by the Phoenix New Times this week reveals a minor Waymo-related crash that occurred last October but hadn’t been publicly reported until now. Here’s how the New Times describes the incident:
A white Waymo minivan was traveling westbound in the middle of three westbound lanes on Chandler Boulevard, in autonomous mode, when it unexpectedly braked for no reason. A Waymo backup driver behind the wheel at the time told Chandler police that “all of a sudden the vehicle began to stop and gave a code to the effect of ‘stop recommended’ and came to a sudden stop without warning.”
A red Chevrolet Silverado pickup behind the vehicle swerved to the right but clipped its back panel, causing minor damage. Nobody was hurt.
Overall, Waymo has a strong safety record. The company has racked up more than 20 million testing miles in Arizona, California, and other states, far more than any human being will drive in a lifetime. Its vehicles have been involved in a relatively small number of crashes, and those crashes have been overwhelmingly minor, with no fatalities and few if any serious injuries. Waymo says that a large majority of them were the fault of the other driver. So it's very possible that Waymo's self-driving software is significantly safer than a human driver.
At the same time, Waymo isn’t acting like a company with a multi-year head start on potentially world-changing technology. Three years ago, Waymo announced plans to buy “up to” 20,000 electric Jaguars and 62,000 Pacifica minivans for its self-driving fleet. The company hasn’t recently released numbers on its fleet size, but it’s safe to say that the company is nowhere near hitting those numbers. The service territory for the Waymo One taxi service in suburban Phoenix hasn’t expanded much since it launched two years ago.
Waymo hasn’t addressed the slow pace of expansion, but incidents like last October’s fender-bender might help explain it.
It’s hard to be sure if self-driving technology is safe
Rear-end collisions like this one rarely kill anyone, and Waymo likes to point out that Arizona law prohibits tailgating: in most rear-end crashes, the driver in the back is considered to be at fault. At the same time, it's obviously not ideal for a self-driving car to suddenly come to a stop in the middle of the road.
More generally, Waymo’s vehicles sometimes hesitate longer than a human would when they encounter complex situations they don’t fully understand. Human drivers sometimes find this frustrating, and it occasionally leads to crashes. In January 2020, a Waymo vehicle unexpectedly stopped as it approached an intersection where the stoplight was green. A police officer in an unmarked vehicle couldn’t stop in time and hit the Waymo vehicle from behind. Again, no one was seriously injured.
It’s difficult to know if this kind of thing happens more often with Waymo’s vehicles than with human drivers. Minor fender benders aren’t always reported to the police and may not be reflected in official crash statistics, overstating the safety of human drivers. By contrast, any crash involving cutting-edge self-driving technology is likely to attract public attention.
The more serious problem for Waymo is that the company can't be sure that the idiosyncrasies of its self-driving software won't contribute to a more serious crash in the future. Human drivers cause a fatality about once every 100 million miles of driving, far more miles than Waymo has tested so far. If Waymo scaled up rapidly, it would be taking a risk that an unnoticed flaw in its software could lead to someone getting killed.
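To put those numbers in perspective, here's a rough back-of-the-envelope calculation, my own sketch rather than anything Waymo has published. It assumes the one-fatality-per-100-million-miles human benchmark cited above and applies the standard "rule of three" for estimating a rate when zero events have been observed:

```python
import math

# Assumption from the article: human drivers cause roughly
# 1 fatality per 100 million miles.
HUMAN_FATALITY_RATE = 1 / 100_000_000  # fatalities per mile

def miles_needed(target_rate: float, confidence: float = 0.95) -> float:
    """Miles of fatality-free driving needed before we can say, at the
    given confidence level, that a fleet's fatality rate is below
    target_rate. With zero events observed in n miles, the upper
    confidence bound on the rate is -ln(1 - confidence) / n, which is
    about 3 / n at 95% confidence (hence the "rule of three")."""
    return -math.log(1 - confidence) / target_rate

print(f"Fatality-free miles needed to beat the human benchmark "
      f"at 95% confidence: {miles_needed(HUMAN_FATALITY_RATE):,.0f}")
# -> roughly 300 million miles

# For comparison: over Waymo's ~20 million test miles, human drivers
# would be expected to cause only about 0.2 fatalities.
print(f"Expected human-driver fatalities in 20 million miles: "
      f"{20_000_000 * HUMAN_FATALITY_RATE:.1f}")
```

In other words, a fleet could drive every one of Waymo's 20 million miles without hurting anyone, and the statistics alone still couldn't tell you whether its software is safer than a human driver.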
And crucially, self-driving cars are likely to make different types of mistakes than human drivers. So it’s not sufficient to make a list of mistakes human drivers commonly make and verify that self-driving software avoids making them. You also need to figure out if self-driving cars will screw up in scenarios that human drivers deal with easily. And there may be no other way to find these scenarios than with lots and lots of testing.
Waymo has logged far more testing miles than other companies in the US, but there’s every reason to think Waymo’s competitors will face this same dilemma as they move toward large-scale commercial deployments. By now, a number of companies have developed self-driving cars that can handle most situations correctly most of the time. But building a car that can go millions of miles without a significant mistake is hard. And proving it is even harder.