Hacker News

blibble · today at 12:24 AM

> But dollars to doughnuts someone will try something like this on a waymo taxi the minute it hits reddit front page.

And once this video gets posted to Reddit, an hour later every Waymo in the world will be in a ditch.


Replies

theamk · today at 5:35 AM

Given that Waymos don't actually connect LLMs to the wheels, they are pretty safe.

Even if you fool the sign-recognizing LLM with a prompt injection, it'll be the equivalent of a wrong road sign. And a Waymo is not going to drive into a wall even if someone places a "detour" sign pointing at it.
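The argument here is an architectural one: perception output (like a detected sign) is an untrusted hint, and the planner gates it behind independent safety checks from map and sensor data. A minimal sketch of that pattern, with all names and structure hypothetical (this is not Waymo's actual code or architecture):

```python
# Hypothetical sketch: a planner that treats a detected sign as an
# untrusted suggestion, only followed if an independent drivability
# check (map + obstacle data) passes. Names are illustrative.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class DetectedSign:
    kind: str             # e.g. "detour" -- whatever perception reports
    suggested_route: str  # the route the sign points the vehicle toward

def plan_route(sign: Optional[DetectedSign],
               route_is_drivable: Callable[[str], bool]) -> str:
    """Follow the sign's suggestion only when the independent
    drivability check approves it; otherwise keep the default route."""
    if sign is not None and route_is_drivable(sign.suggested_route):
        return sign.suggested_route
    return "default_route"

# A spoofed sign pointing at a wall fails the drivability check, so the
# planner ignores it: the worst case is a wrong-but-safe route choice.
drivable = lambda route: route != "into_wall"
print(plan_route(DetectedSign("detour", "into_wall"), drivable))   # default_route
print(plan_route(DetectedSign("detour", "side_street"), drivable)) # side_street
```

The key design choice is that fooling the sign classifier can only select among routes the safety layer already considers drivable, which is why a prompt-injected sign degrades to the same failure mode as an ordinary wrong sign.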

skybrian · today at 12:45 AM

Alternatively, it happens once, Waymo fixes it, and it's fixed everywhere.
