
scoofy · yesterday at 9:38 AM · 2 replies

>The cars already know those are intersections with lights.

That's not how any of this works. You can anthropomorphize all you like, but they don't "know" things. They can only respond predictably to situations that resemble their training data, and a blackout scenario is not in the training data.


Replies

Dylan16807 · yesterday at 6:53 PM

Even ignoring the observations we can make, the computers have maps programmed in. Yes, they do know the locations of intersections; no training necessary.
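
To make the "maps programmed in" point concrete, here's a toy sketch (Python, every name invented for illustration; nothing to do with Waymo's real stack). The intersection locations are just data the car ships with, so "am I approaching a signalized intersection?" is a lookup, not something that has to be learned:

    from dataclasses import dataclass
    import math

    @dataclass
    class Intersection:
        lat: float
        lon: float
        has_traffic_signal: bool

    # Hypothetical pre-built HD-map data, loaded long before the drive starts.
    HD_MAP = [
        Intersection(37.7793, -122.4193, has_traffic_signal=True),
        Intersection(37.7801, -122.4150, has_traffic_signal=False),
    ]

    def approaching_signalized_intersection(lat, lon, radius_m=50.0):
        """True if any mapped signalized intersection is within radius_m."""
        for node in HD_MAP:
            # Rough equirectangular distance; plenty accurate for a 50 m check.
            dx = (node.lon - lon) * 111_320 * math.cos(math.radians(lat))
            dy = (node.lat - lat) * 110_540
            if node.has_traffic_signal and math.hypot(dx, dy) <= radius_m:
                return True
        return False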

And the usual setup of an autonomous car is an object recognition system feeding into a rules system. If the object recognition system says an object is there, and that object is there, that's good enough to call "knowing" for the purpose of talking about what the cars should do.

Or to phrase things entirely differently: Finding lights is one of the easy parts. It's basically a solved problem. Cutting your speed when there isn't a green or yellow light is table stakes. These cars earn 2 good boy points for that, and lose 30 for blocking the road.
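Rough sketch of what I mean by perception feeding a rules layer. Everything here (LightState, plan_speed, the crawl speed) is made up for illustration, not Waymo's actual code; the point is that "no green or yellow detected at a mapped signal" can fall through to a conservative hand-written rule, with no special blackout training data needed:

    from enum import Enum

    class LightState(Enum):
        GREEN = "green"
        YELLOW = "yellow"
        RED = "red"
        DARK = "dark"        # detector sees the fixture but no lit lens
        NOT_FOUND = "none"   # detector sees no fixture at all

    def plan_speed(current_speed_mps, at_mapped_signal, light):
        """Pick a target speed from the detected light state plus the map."""
        if not at_mapped_signal:
            return current_speed_mps          # nothing to react to
        if light in (LightState.GREEN, LightState.YELLOW):
            return current_speed_mps          # proceed
        if light == LightState.RED:
            return 0.0                        # stop at the line
        # DARK or NOT_FOUND at a place the map says is signalized:
        # conservative fallback, e.g. treat it like an all-way stop.
        return min(current_speed_mps, 2.0)    # creep up, then stop and yield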

floxy · yesterday at 6:03 PM

>They're only able to predictably respond to their training data. A blackout scenario is not in the training data.

Is there any way to read more about this? I'm skeptical that there aren't any human-coded traffic laws in the Waymo software stack and that it just infers everything from "training data".