
NHTSA is investigating more Waymo incidents, but should it?

The National Highway Traffic Safety Administration (NHTSA) regulates the safety of vehicles sold in the United States. In early May, the agency sent Waymo a letter saying it was investigating 22 incidents involving Waymo vehicles. On May 23, it wrote that it would add 9 more. The incidents date back to August 2021, but most are recent. Most of them are surprisingly minor, with no injuries, only minor property damage, and a few traffic violations.

Twenty of those come from the requirement that Waymo report any incident that involves police, property damage, or personal injury. Eleven come from recent social-media reports of strange driving by Waymo vehicles, including a few where the vehicles used the oncoming lane for a significant distance to avoid traffic problems; NHTSA, it seems, reads Reddit and Twitter. The most concerning report involves an empty Waymo that crashed into a telephone pole in a narrow alley on its way to pick up passengers. Almost all of the reports can easily be classified as things that shouldn’t happen. The telephone pole incident could even be classified as “should be impossible.”

Waymo has so far declined to answer questions about these incidents and the investigation. While it is common for companies not to comment on matters that are the subject of official investigations, this is a change of course for Waymo, which has previously been much more open than Cruise, Tesla or other companies in this space.

Waymo vehicles make mistakes, and it’s understandable that NHTSA wants to know about them. On the other hand, a certain amount of error is to be expected from these vehicles, especially in the early years but also indefinitely into the future; nobody expects or plans for perfection. When a self-driving vehicle makes a mistake, the public reaction is often a quick declaration that it shouldn’t be on the roads if it makes mistakes. The phrase “not ready for prime time,” made famous by Saturday Night Live, is often heard. The California DMV took Cruise off the roads entirely after a serious mistake led to serious injuries and the problems surrounding the incident were covered up for two days, though it never said which of those two reasons mattered more, or whether either one was decisive.

If regulators are going to regulate, it’s time for them to decide on the terms of that regulation and make them public. Regulators are responsible for improving overall road safety. That means they need to analyze data, not individual crashes, unless those crashes are part of the data or help shed light on larger trends. When tracking crashes, the key questions should be the following (a rough sketch of how these criteria might be encoded appears after the list):

  1. Is the self-driving system at fault? (NHTSA appears to have filtered for this, as its databases contain many more incidents where the system is probably not at fault.) Current regulations do not provide for such an assessment.
  2. How serious or potentially serious is an incident?
  3. How likely is it that the incident will recur, taking its severity into account?
  4. Does the provider resolve any issues promptly so that they are unlikely to occur again?
  5. Is the provider open about problems and their solutions?
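Purely as an illustration, the five questions above could be captured in a simple triage record like the following; the field names and the scoring rule are hypothetical, not anything regulators actually use.

```python
from dataclasses import dataclass

@dataclass
class IncidentTriage:
    """Hypothetical record capturing the five questions above for one incident."""
    system_at_fault: bool         # 1. was the driving system to blame?
    severity: int                 # 2. actual or potential severity, 0 (trivial) to 5 (fatal)
    recurrence_likelihood: float  # 3. estimated chance of recurrence, 0.0 to 1.0
    fixed_promptly: bool          # 4. did the provider fix the issue quickly?
    provider_transparent: bool    # 5. was the provider open about the problem and fix?

    def needs_regulatory_attention(self) -> bool:
        # Illustrative rule only: serious, likely-to-recur faults by the system,
        # or faults that were not fixed or not disclosed, deserve a closer look.
        if not self.system_at_fault:
            return False
        risk = self.severity * self.recurrence_likelihood
        return risk >= 2.0 or not (self.fixed_promptly and self.provider_transparent)
```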

Pretty minor

Looking at the incidents under investigation, there are things like hitting road debris, brushing parking barriers and chains, hitting spikes, hitting construction, and hitting a parked car at low speed, but also two collisions and an intersection error that caused a moped rider to spin out trying to avoid the Waymo. There was also driving in the oncoming lane, using a transit lane, and getting stuck when a detour forced cars onto the highway (which Waymo doesn’t do with passengers). Most amusingly, one vehicle got confused while following a truck hauling a tree in a trailer, apparently unsure what a tree was doing in the middle of the road. Most surprising, though, is hitting a pole, reminiscent of the time a Cruise rammed the back of a transit bus; that should be nearly impossible.

Waymo recently announced that it has completed one million trips and is doing 50,000 more per week, and that it has also completed 10 million autonomous miles. The vast majority of those trips and miles are recent, and 10 million miles represents roughly 15 human lifetimes of driving. (Waymo announced its first 1 million miles in February 2023.) Honestly, if that’s all NHTSA found worth investigating in that period, that’s actually pretty good news for Waymo. In 10 million miles, human drivers would likely cause around 100 minor accidents (most of them never reported to insurance) and around 20 police-handled accidents, including multiple injuries and roughly a 15 percent chance of a fatality.
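As a back-of-the-envelope check on those figures, here is a minimal sketch assuming illustrative per-mile rates roughly in line with the article’s numbers (about one minor crash per 100,000 miles, one police-reported crash per 500,000 miles, about 1.3 fatalities per 100 million miles, and about 650,000 miles driven in a human lifetime); the exact rates vary by source.

```python
# Back-of-the-envelope comparison of 10 million autonomous miles against
# assumed human-driver crash rates.  The rates below are illustrative
# assumptions, not official figures.
MILES = 10_000_000

MINOR_CRASHES_PER_MILE = 1 / 100_000      # assumed; many never reported
POLICE_REPORTED_PER_MILE = 1 / 500_000    # assumed
FATALITIES_PER_MILE = 1.3 / 100_000_000   # assumed, near the US average
LIFETIME_DRIVING_MILES = 650_000          # assumed per-person lifetime total

print("human driving lifetimes:", MILES / LIFETIME_DRIVING_MILES)             # ~15
print("expected minor crashes:", MILES * MINOR_CRASHES_PER_MILE)              # ~100
print("expected police-reported crashes:", MILES * POLICE_REPORTED_PER_MILE)  # ~20
print("expected fatalities:", MILES * FATALITIES_PER_MILE)                    # ~0.13
```

Under these assumed rates, the expected number of fatalities over 10 million miles is about 0.13, which corresponds to roughly a 12 to 15 percent chance of at least one fatality, consistent with the figure above.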

That’s what regulators should be focusing on: averages and data. Unlike human-caused accidents, every single incident listed has very likely been fixed in the software shortly after it occurred and is now tested in simulation with each new release. When regulators look at specific incidents, they can only look at the past and deal with problems that have already been resolved. If a human on the road makes a mistake, it’s a sign that other humans might make that mistake too. If a robot makes a mistake, no robot in that fleet will make the same mistake again. Regulators therefore need to look at the statistics, not individual incidents, to determine whether the fleet is reaching an unacceptable level of risk.
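To make “look at the statistics” concrete, here is a minimal sketch of one way such a comparison could be framed, assuming incidents arrive roughly as a Poisson process over miles driven; the function, the benchmark rate, and the counts are hypothetical placeholders, not real regulatory figures.

```python
from scipy.stats import poisson

def exceeds_risk_threshold(observed_incidents: int,
                           fleet_miles: float,
                           acceptable_rate_per_mile: float,
                           alpha: float = 0.05) -> bool:
    """Return True if the observed incident count is statistically worse
    than what the acceptable rate would predict.

    Assumes incidents arrive as a Poisson process over miles driven."""
    expected = acceptable_rate_per_mile * fleet_miles
    # Probability of seeing this many or more incidents if the fleet
    # actually met the acceptable rate (one-sided p-value).
    p_value = poisson.sf(observed_incidents - 1, expected)
    return p_value < alpha

# Hypothetical numbers: 31 investigated incidents over 10 million miles,
# against an assumed benchmark of 1 incident per 200,000 miles.
print(exceeds_risk_threshold(31, 10_000_000, 1 / 200_000))
```

With these placeholder numbers the benchmark predicts about 50 incidents, so 31 observed incidents would not trip the threshold and the check returns False; the point is that the judgment rests on rates and exposure, not on any single event.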

Hitting a pole

But hitting a pole is unusual. A good robocar like Waymo’s has redundant systems meant to ensure that even if bugs in the software steer the car toward a pole, other, simpler systems prevent the collision, because the obstacle is plainly obvious to the sensors. The same goes for the back of a bus. Something in the Waymo should have screamed, like the forward collision warning in human-driven cars, that there was a pole in front of the car and that it should stop abruptly, since there was no one in it or behind it. NHTSA is right to ask why that didn’t happen, because while mistakes happen, you also want them handled well, and that didn’t happen here.
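As an illustration of the kind of redundancy described above, here is a minimal sketch of an independent last-resort check running alongside a planner; the names, structure, and threshold are hypothetical and not a description of Waymo’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float        # range to the obstacle along the planned path
    closing_speed_mps: float # positive if the vehicle is closing on it

def emergency_stop_required(obstacles: list[Obstacle],
                            min_time_to_collision_s: float = 1.5) -> bool:
    """Independent last-resort check: ignore the planner entirely and
    brake if any sensed obstacle on the current path is about to be hit.

    Hypothetical sketch; real systems use far more sophisticated logic."""
    for obs in obstacles:
        if obs.closing_speed_mps <= 0:
            continue  # not closing on this obstacle
        time_to_collision = obs.distance_m / obs.closing_speed_mps
        if time_to_collision < min_time_to_collision_s:
            return True
    return False

# The value of such a layer is that it runs separately from the planner,
# so a planning bug that aims the car at a pole can still be caught here.
print(emergency_stop_required([Obstacle(distance_m=3.0, closing_speed_mps=4.0)]))
```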

Recent changes?

Waymo’s error record is so good that some wonder whether there has been a recent increase in problems, while others suggest this may simply be because Waymo is driving a lot more, so there are more incidents in absolute terms. The incidents of driving in the oncoming lane to avoid traffic suggest some sort of new programming in the system, since a map-based system like Waymo’s doesn’t confuse the two sides of the road the way a Tesla might; the decision to drive on the wrong side is deliberate, but potentially incorrect. This could be due to the use of more machine learning in Waymo’s “planner,” the part of the system that chooses the vehicle’s path.

Machine learning systems are very powerful and more general, but they can also make more human-like errors. Unfortunately, Waymo has also declined to comment on these incidents, and had done so even before the investigation began. Again, regulators should be trying to understand the frequency and severity of incidents, not fix past problems. It’s worth noting that NHTSA doesn’t regulate the rules of the road; that’s a state matter. It violated that principle when it ordered Tesla not to conduct rolling stops, and it may continue to do so here.