Hacker News

Waymo robotaxi hits a child near an elementary school in Santa Monica

266 points by voxadam today at 2:08 PM | 470 comments

Comments

BugsJustFindMe today at 3:28 PM

From the Waymo blog...

> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.

> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.

> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.

I honestly cannot imagine a better outcome or handling of the situation.

show 26 replies
aanet today at 6:53 PM

This is the classic Suddenly Revealed Pedestrian test case, which, afaik, most NCAP programs (EuroNCAP, Japan NCAP, etc.) include in their standard testing protocols.

Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2], has been textbook compliance. (I'm not defending their performance... just their response to the collision.) This test/protocol is hard for any driver (including human-driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons: pedestrian occlusion, late pedestrian detection, late braking, slick roads, insufficient braking, etc.

Having said all that, full collision avoidance would have been the best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.

This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.

[1] Yes, I'm an AV safety expert

[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...

(edit: verbiage)

show 2 replies
maerF0x0 today at 4:58 PM

Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.

I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_.

> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”

Meanwhile, in my area of the world, parents are busy, stressed, and on their phones, pressing the accelerator hard because they're time-pressured and feel like that will make up for being 5 minutes late on a 15-minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for its lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.

A story: my grandpa drove for longer than he should have. Yes, him losing his license would have been the optimal case. But pragmatically, that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.

show 2 replies
dlg today at 6:12 PM

I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.

While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid ran behind a car and ran right out into the street.

If those rumors are correct, I'd say it's the kid's/family's fault. That said, I think autonomous vehicles should probably go extra slowly near schools, especially during pickup and dropoff.

show 4 replies
bhewes today at 7:46 PM

The "a human would do it better" people are hilarious, given how many times I have been hit by human drivers on my bike and watched others get creamed by cars. One time in Boulder, at a flashing crosswalk, a person ran right through it and the biker they creamed got stuck in the roof rack.

show 2 replies
Zigurd today at 6:19 PM

Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.

aucisson_masque today at 10:16 PM

> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post.

The issue is that I don’t trust a private company’s word. You can’t even trust the president of the USA nowadays… release the video footage or get lost.

elzbardico today at 11:24 PM

I don't even like the very idea of self-driving cars, but based on the description of the accident, I think the machine passed this with flying colors.

Zopieux today at 7:18 PM

Cheers to cities pedestrianizing school streets even in busy capitals (e.g. Paris). Cars have no place near school entrances. Fix your urbanism and public transportation.

Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:

* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,

* safe, separated lanes for biking/walking when that's an option.

aimor today at 6:10 PM

The school speed limit there is 15 mph, and that wasn't enough to prevent an accident.

https://www.yahoo.com/news/articles/child-struck-waymo-near-...

https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac

show 4 replies
Bukhmanizer today at 4:00 PM

Personally, in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don’t know if there has been a change to the algorithm lately to make them more aggressive, but it was pretty jarring to see it mess up that badly.

show 2 replies
simojo today at 3:46 PM

I'm curious as to what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based on trained models, but I'm curious whether their controllers have any formal guarantees under certain conditions, and whether the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or violated it, making their control stack switch to some other "panic" controller.
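
To make that concrete: the pattern I'm imagining is a runtime-assurance ("simplex") style supervisor, where a small, analyzable fallback controller takes over whenever the scene leaves the envelope the nominal controller was validated for. A toy sketch in Python; every name, bound, and number here is invented, and nothing is claimed about Waymo's actual stack:

    # Hypothetical runtime-assurance ("simplex") supervisor -- all names,
    # bounds, and numbers are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        gap_m: float        # longitudinal distance to the obstacle (m)
        closing_mps: float  # closing speed (m/s); positive = approaching

    # Envelope within which the nominal (learned) controller is validated.
    MIN_VALIDATED_GAP_M = 5.0
    MAX_VALIDATED_CLOSING_MPS = 8.0

    def nominal_accel(speed_mps: float, obstacle: Obstacle) -> float:
        """Placeholder for the comfort-optimized planner's command."""
        return 0.0  # hold speed

    def panic_accel() -> float:
        """Simple certified fallback: brake near the friction limit."""
        return -8.0  # m/s^2

    def select_accel(speed_mps: float, obstacle: Obstacle) -> float:
        # Hand control to the fallback the moment the scene leaves the
        # validated set, e.g. a pedestrian emerging too close, too fast.
        if (obstacle.gap_m < MIN_VALIDATED_GAP_M
                or obstacle.closing_mps > MAX_VALIDATED_CLOSING_MPS):
            return panic_accel()
        return nominal_accel(speed_mps, obstacle)

    # A child stepping out 4 m ahead of a car doing 7.6 m/s (~17 mph):
    print(select_accel(7.6, Obstacle(gap_m=4.0, closing_mps=9.0)))  # -8.0

The appeal of the split is that the nominal policy can be arbitrarily complex and learned, while the worst case stays bounded by the simple, certifiable fallback.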

This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.

show 2 replies
CaliforniaKarl today at 5:52 PM

For reference, here's a link to Waymo's blog post: https://waymo.com/blog/2026/01/a-commitment-to-transparency-...

moktonar today at 9:43 PM

The Waymo driver tech is impressive. That said, an experienced driver might have recognized the pattern where a stopped big vehicle occludes part of the road, leading to exactly this kind of situation, and might have stopped or slowed down almost to a halt before passing. The Waymo driver reacts faster but is not able to predict such scenarios by filling in the gaps, simulating the world to inform decisions. Chapeau to Waymo anyway.
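
That "slow for the occlusion" instinct can even be written down: cap your speed so you could stop within the distance you can actually see past the blockage. A back-of-the-envelope sketch (assumed braking rate, zero reaction time):

    import math

    def max_safe_speed_mps(visible_m: float, decel_mps2: float = 6.0) -> float:
        """Speed from which you can stop within the visible gap,
        from v^2 = 2*a*d (constant deceleration, zero reaction time)."""
        return math.sqrt(2 * decel_mps2 * visible_m)

    # If a tall stopped SUV hides everything closer than ~5 m:
    v = max_safe_speed_mps(5.0)                  # ~7.7 m/s
    print(f"{v:.1f} m/s = {v * 2.237:.1f} mph")  # ~17 mph

Interestingly, with those (made-up) numbers the zero-reaction-time limit lands right around the 17 mph Waymo reported; add any reaction delay, human or machine, and the truly safe passing speed drops below it.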

show 2 replies
NoGravitas today at 5:44 PM

That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming the speed limit in the school zone was 20 or 25), it braked as much as possible, and the company took over all the things a human driver would have been expected to do in the same situation. It could have been a lot worse, and probably wouldn't have been any better with a human driver (I'm just going to ignore, as no-signal, Waymo's models that say an attentive human driver would have been worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.

Veserv today at 7:42 PM

Absent more precise information, this is a statistical negative mark for Waymo, putting their child pedestrian injury rate at ~2-4x the US human average.

US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4]. That means they average 1 child pedestrian injury per ~100-200 million miles. That is an injury rate ~2-4x higher than the human average.
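
A quick sanity check of that arithmetic:

    human_miles_per_year = 3.3e12        # [1]
    child_injuries_per_year = 7_000      # [2]
    miles_per_injury = human_miles_per_year / child_injuries_per_year
    print(f"humans: 1 injury per {miles_per_injury:.3g} miles")  # ~4.71e+08

    # Waymo: 1 known child pedestrian injury over its autonomous miles [3][4]
    for waymo_miles in (100e6, 200e6):
        print(f"at {waymo_miles / 1e6:.0f}M miles: "
              f"{miles_per_injury / waymo_miles:.1f}x the human rate")
    # -> ~4.7x at 100M miles, ~2.4x at 200M miles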

However, the child pedestrian injury rate is only an official estimate (possible undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (the operational domain might not be comparable, though this could easily swing either way). But absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.

Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:

A. Their systems are not actually robustly 10x better than human drivers. Waymo's claims are incorrect or non-comparable.

B. There are child-specific risk factors that humans account for and Waymo does not, causing a 20-40x differential risk around children relative to normal Waymo driving.

C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.

[1] https://afdc.energy.gov/data/10315

[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...

[3] https://www.therobotreport.com/waymo-reaches-100m-fully-auto...

[4] https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto...

[5] https://waymo.com/safety/impact/

show 3 replies
WarmWash today at 3:56 PM

Oddly I cannot decide if this is cause for damnation or celebration

Waymo hits a kid? Ban the tech immediately, obviously it needs more work.

Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.

show 1 reply
pmontra today at 5:37 PM

Who is legally responsible in case a Waymo hits a pedestrian? If I hit somebody, it's me in front of a judge. In the case of Waymo?

show 3 replies
RomanPushkin today at 10:11 PM

Will post it here:

> In October 2025, a Waymo autonomous robotaxi struck and killed KitKat, a well-known bodega cat at Randa's Market in San Francisco's Mission District, sparking debates over self-driving car safety

It's a child now. All I wanna ask is: what should happen so they stop killing pets and people?

show 1 reply
Dlanv today at 6:02 PM

Basically, Waymo just prevented a kid's potential death.

Had any other car been there, probably including a Tesla, the poor kid would have been hit with 4-10x more force.
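
As a rough proxy (impact energy rather than force, since crash forces depend on a lot more): kinetic energy scales with v², so comparing the reported 17 mph pre-braking speed to the 6 mph at contact:

    v_unbraked_mph = 17.0  # Waymo's reported speed before braking
    v_contact_mph = 6.0    # reported speed at contact
    print(f"{(v_unbraked_mph / v_contact_mph) ** 2:.1f}x")  # ~8.0x the energy

That's ~8x more impact energy with no braking at all, squarely in that 4-10x ballpark.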

show 1 reply
Archio today at 6:25 PM

It's hard to imagine how any driver could have reacted better in this situation.

The argument that questions "would a human be driving 17mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones before, and human drivers routinely drive above 17mph (in some cases, over the typical 20mph or 25mph legal limit). It feels like, in deconstructing some of these incidents, critics imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident that they know will happen in advance, rather than facing the reality of what human drivers are actually like on the road.

jeffrallen today at 11:07 PM

When is enough, enough? Software devs working on autonomous driving: look in your soul and update your resume.

koolba today at 6:21 PM

> Waymo said its robotaxi struck the child at six miles per hour, after braking “hard” from around 17 miles per hour. The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”

As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?

show 4 replies
insane_dreamer today at 7:41 PM

Who is liable when FSD is used? In Waymo's case, they own and operate the vehicle so obviously they are fully liable.

But for a human driver with FSD on, are they liable if FSD fails? My understanding is yes, they are. Tesla doesn't want that liability. And to me this helps explain why FSD adoption is difficult. I don't want to hand control over to a probabilistic system that might fail and leave me at fault. In other words, I trust my own driving more than the FSD (I could be right or wrong, but I think most people will feel the same way).

show 1 reply
IAmBroom today at 8:30 PM

The statistically relevant question is: How many human drivers have hit children near elementary schools, since Waymo's last accident?

If Waymo has fewer accidents where a pedestrian is hit than humans do, Waymo is safer. Period.

A lot of people are conjecturing how safe a human is in certain complicated scenarios (pedestrian emerging from behind a bus, driver holds cup of coffee, the sun is in their eyes, blah blah blah). These scenarios are distractions from the actual facts.

Is Waymo statistically safer? (spoiler: yes)

show 1 reply
fortran77 today at 6:29 PM

I'm a big fan of Waymo and have enjoyed my Waymo rides. And I don't think Waymo did anything "bad" here.

> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”

BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.

Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!

show 2 replies
qwertyuiop_ today at 8:29 PM

Couldn’t be any more callous and clinical. This press release alone makes me not want to use their service.

> “Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene,” Waymo wrote in the post.

bpodgursky today at 3:36 PM

A human driver would most likely have killed this child. That's what should be on the ledger.

show 5 replies
xnx today at 5:56 PM

Alternate headline: Waymo saves child's life

show 1 reply
tekno45 today at 6:18 PM

Can we just get Waymo tech in buses?

Big vehicles that demand respect, aren't expected to turn on a dime, and have known stops.

henning today at 3:49 PM

Q: Why did the self-driving car cross the road?

A: It thought it saw a child on the other side.

show 1 reply
whynotminot today at 4:04 PM

I’m actually pretty surprised Waymo doesn’t, as a general rule, completely avoid driving in school zones unless it’s absolutely unavoidable.

Any accident is bad. But accidents involving children are especially bad.

show 1 reply
ripped_britches today at 5:54 PM

Wow, this is why I feel comfortable in a Waymo. Accidents are inevitable at some point, and this handling was well-rehearsed and highly ethical. Amazing company.

alkonaut today at 3:44 PM

And before the argument "Self driving is acceptable so long as the accident/risk is lower than with human drivers", can I please get this out of the way: No, it's not. Self-driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans, we will never accept it. Because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident, or have personal liability. We accept the risks with humans because those humans accept risk. Self-driving abstracts the legal risk, and removes the physical risk.

I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.

show 11 replies
joshribakoff today at 3:31 PM

> The vehicle remained stopped, moved to the side of the road

How do you remain stopped but also move to the side of the road? That's a contradiction. Just like Cruise.

show 2 replies
jsrozner today at 6:36 PM

So many tech lovers defending Waymo.

If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or better than a drunk driver, does not mean that when I hit someone it's less bad. A car is a giant weapon. If you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate -- probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that Waymo gets a pass.

Waymos have definitely become more aggressive as they've been successful. They drive the speed limit down my local street. I see them and I think, wtf, that's too fast. It's one thing when there are no cars around. But if you've got cars or people around, the appropriate speed changes. Let's audit Waymo. They certainly have an aggressiveness setting. Let's see the data on how it's changing. Let's see how safety buffers have decreased as they've changed the aggressiveness setting.

The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.