From the Waymo blog...
> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
"from behind a tall SUV, "
I look for shadows underneath stationary vehicles. I might also notice pedestrians "vanishing". I have a rather larger "context" than any robot effort.
However, I am just one example of a human. My experience of never managing to run someone over is just an anecdote ... so far. The population of humans as a whole manages to run each other over rather regularly.
A pretty cheap instant human-presence sensor might be Bluetooth/BLE, noting phones/devices in near range. Pop a sensor in each wing mirror and one on the top and bottom. The thing would need some processing power, but probably nothing that the built-in Android dash screen couldn't handle.
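As a rough sketch of the idea: BLE scanners report a signal strength (RSSI) per device, and a standard log-distance path-loss model can turn that into a crude range estimate. The 1-metre reference power and path-loss exponent below are assumed illustrative values (real ones vary per device and environment), and the device list is made up:

```python
def rssi_to_distance_m(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Log-distance path-loss estimate; tx_power_dbm is the assumed RSSI at 1 m."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def nearby_devices(scans, threshold_m=5.0):
    """scans: list of (device_id, rssi_dbm) pairs from a BLE scan.
    Returns the ids whose estimated range is within threshold_m."""
    return [dev for dev, rssi in scans if rssi_to_distance_m(rssi) <= threshold_m]

# A strong signal (-55 dBm) estimates to well under a metre;
# a weak one (-85 dBm) to roughly 20 m, outside the threshold.
print(nearby_devices([("phone-a", -55), ("phone-b", -85)]))  # → ['phone-a']
```

RSSI ranging is notoriously noisy (bodies, pockets, and reflections all attenuate the signal), so at best this flags "a phone is somewhere close", not a precise position, and kids without phones would still be invisible to it.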
There are lots more sensors that car manufacturers are trying to avoid for cost reasons, that would make a car way better at understanding the context of the world around it.
I gather that Tesla insist on optical (cameras) only and won't do LIDAR. My EV has four cameras and I find it quite hard to see what is going on when it is pissing down with rain, in the same way I do if I don't clean my specs.
It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
An honest account of this situation would place at least some blame on there being a tall SUV blocking visibility.
These giant SUVs really are the worst when it comes to child safety
AV’s with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits, and when there are kids about people may go even slower. At the same time, the general rule in CA for school zones is 25 mph. Clearly the car had some level of caution, which is good.
It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents to see how "most people" would have acted in the minutes leading up to the event as well as the accident itself
For me it would be interesting to know if 17 mph was a reasonable speed to be driving in this environment under these conditions to begin with. In my school zones that's already close to the maximum speed allowed. What was the weather, were there cars parked which would make a defensive driver slow down even more?
I suspect the robotaxi may have done better than a human.
Human reaction times are terrible, and lots of kids get seriously injured, or killed, when they run out from between cars.
So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)
This is the fault of the software and company implementing it.
Can’t trust a private company.
Where is the video recording ?
> I honestly cannot imagine a better outcome or handling of the situation.
It's the "best outcome" if you're trying to go as fast as possible without breaking any laws or ending up liable for any damage.
German perspective, but if I told people I'd been going 30 km/h next to a school with poor visibility as children were dropped off around me, I would be met with contempt for that kind of behavior. I'd also at least face some partial civil liability if I hit anyone.
There's certainly better handling of the situation possible, it's just that US traffic laws and attitudes around driving do not encourage it.
I suspect many human drivers would've driven slower, law or no law.
> reducing speed from approximately 17 mph
Isn't the speed limit normally 15 mph or less in a school zone? Was the robotaxi speeding?
When I was a boy, I ran into the street from between two parked cars. I did not notice the car coming, but he noticed me popping out from nowhere, and screeched to a stop.
I was very very lucky.
I'm picturing a 10-second clip showing a child with a green box drawn around them, and the positions of the gas and brake pedals, updating with superhuman reactions. That would be the best possible marketing that any of these self-driving companies could hope for, and Waymo probably now has such a video sitting somewhere.
> remained stopped, moved to the side of the road
Stopped or moved? Is it allowed in CA to move the car at all after a serious accident?
This is great.
what about all the traffic violations though?
I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars especially those huge SUV or pickup trucks that have big covers on the back. You can't see anything incoming unless you stick your head out.
It’s great handling of the situation. They should release a video as well.
Take that particular Waymo car off the road. Seems absurd, but they still hit someone.
Waymo driver? The vehicles are autonomous. I otherwise applaud Waymo's response, and I hope they are as cooperative as they say they will be. However, referring to the autonomous vehicle as having a driver is a dangerous way to phrase it. It's not passive voice, per se, but it has the same effect of obscuring responsibility. Waymo should say we, Waymo LLC, subsidiary of Alphabet, Inc., braked hard...
Importantly, Waymo takes full ownership for something they write positively: Our technology immediately detected the individual.... But Waymo weasels out of taking responsibility for something they write about negatively.
We should take their reporting with a grain of salt and wait for official results.
In fact I would call that “superhuman” behavior across the board.
The vast, vast majority of human drivers would not have been able to accomplish that braking that quickly, and then would not have been able to manage the follow-up so quickly.
I have watched other parent drivers in the car pickup line at public schools for the last 16 years, and people are absolutely trash at navigating that whole process; parents drive so poorly it's absurd. At least half the parents I see are on their phones while literally feet away from hitting some kid.
> I honestly cannot imagine a better outcome or handling of the situation.
> From the Waymo blog
Yeah, like, no shit Sherlock. We'd better wait for some videos before forming our opinions.
> I honestly cannot imagine a better outcome or handling of the situation.
If it can yell at the kid and send a grumpy email to the parents and school, the automation is complete.
Most humans in that situation won't have the reaction speed to do shit about it, and it could result in a severe injury or death.
Yup. And to add:
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.