All of your arguments are already addressed in the article itself, and its conclusions still hold based on the publicly available data.
The 3x figure in the title is based on a comparison of the Tesla reports with estimated average human driver miles without an incident, not based on police report data. The comparison with police-report data would lead to a 9x figure instead, which the article presents but quickly dismisses.
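To make the arithmetic concrete, here's a quick sketch. The mileage figures are purely hypothetical placeholders (the article's actual numbers would slot in the same way); they're chosen only to show how the two multipliers come from dividing by two different human baselines:

```python
# All figures hypothetical -- only the structure of the comparison matters.
tesla_miles_per_incident = 62_000      # hypothetical: robotaxi miles / 9 incidents
human_miles_any_incident = 190_000     # hypothetical: avg human miles per incident of any kind
human_miles_police_reported = 560_000  # hypothetical: avg human miles per police-reported crash

print(human_miles_any_incident / tesla_miles_per_incident)     # ~3x worse
print(human_miles_police_reported / tesla_miles_per_incident)  # ~9x worse
```

Because police reports capture only a subset of incidents, humans go more miles per *reported* crash, so the police-report baseline produces the larger multiplier against Tesla.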
The denominator problem is made up. Tesla Robotaxi has only been launched in one location, Austin, and only since July (well, 28th June, so maybe there's a few days' discrepancy). So the crash data and the miles data can only refer to this same period. Furthermore, if the miles driven actually cover some additional length of time, the picture gets even worse for Tesla: the true denominator for those 9 incidents is smaller than the published miles, so the true per-mile incident rate is higher.
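A toy example of why a mismatched window would help rather than hurt the article's case (numbers hypothetical):

```python
incidents = 9
published_miles = 300_000     # hypothetical: might include extra weeks of driving
crash_window_miles = 250_000  # hypothetical: miles within the crash window only

print(incidents / published_miles)     # naive rate, flattering to Tesla
print(incidents / crash_window_miles)  # true rate is higher, i.e. worse
```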
The analysis indeed doesn't distinguish between the types of incidents, but this is irrelevant. The human driver estimates for miles driven without an incident don't distinguish between types of incidents either, so the comparison is still fair (unless you believe people intentionally tried to get the Tesla cars to crash, which makes little sense).
The comparison to Waymo is also based on incidents that both companies report under the same requirements to the same federal agency (NHTSA). That already harmonizes crash definitions and reporting practices across the two, at least to a good extent.
Overall, there is no way to look at this data and draw a conclusion significantly different from the article's: Tesla is bad at autonomous driving and has a long way to go before it can be considered safe on public roads. We should also remember that these robotaxis aren't even fully autonomous: each car has a human safety monitor ready to step in and take control of the vehicle at any time to avoid incidents, so the real incident rate, if the safety monitor weren't there, would certainly be even worse than this.
I'd also note that 5 months of data is not that small a sample, despite you trying to make it sound so ("only 9 crashes").
I agree with most of your points and your conclusion, but to be fair, OP was asserting that human drivers under-report incidents, which I believe. Think of super minor bumps where the drivers get out, determine there's barely a scratch, and drive on, or solo low-speed collisions with garage walls or trees.
I don’t think it invalidates the conclusion, but it seems like one fair point in an otherwise off-target defense.
To add to this: more data from more regions makes the estimate of average human miles without an incident more accurate, simply because a larger sample is more likely to be representative.
> The 3x figure in the title is based on a comparison of the Tesla reports with estimated average human driver miles without an incident, not based on police report data. The comparison with police-report data would lead to a 9x figure instead, which the article presents but quickly dismisses.
I think OP's point still stands here. Who would people report minor incidents to, other than the police, that would produce publicly available data? The numbers had to come from somewhere, and police reports are the only source that makes sense to me.
If I bump my car into a post, I'm not telling any government office about it.
Statistically, 9 crashes is enough to draw reasonable inferences from. Modeling crashes as a Poisson process, if Tesla had the expected human rate of 3 incidents over the period in question, the chance of getting into 9 or more accidents would be about 0.4%. And mind you, that's with a safety driver. It would probably be much worse without one.
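For anyone who wants to check that tail probability, here's a quick sketch (assuming the Poisson model above; the expected count of 3 is taken from the comment itself):

```python
# Chance of seeing 9 or more incidents when 3 are expected,
# modeling crashes as a Poisson process with mean lam = 3.
from math import exp, factorial

lam = 3  # expected incidents at the human rate over the same period
p_at_most_8 = sum(exp(-lam) * lam**k / factorial(k) for k in range(9))
print(1 - p_at_most_8)  # ~0.0038, i.e. about 0.4%
```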