That's not how Waymo works, though. Waymo doesn't imitate humans. Waymo is trained to obey traffic laws and avoid collisions.
Waymo does imitate humans, insofar as its neural net was trained to avoid collisions on millions of miles of video footage and LIDAR data from roads shared with human drivers, and that training produces human-like behavior.
It's likely manually programmed not to (incorrectly) turn the wheel to the left while stopped and waiting for an opportunity to turn: if you get rear-ended with the wheel already turned, you'll be pushed into the lane of oncoming traffic. It's certainly programmed to use its turn signals to indicate when it is going to turn. But after driving around thousands of cars with no turn signal on but their wheels pointed left, it "knows" to predict that they're about to turn, and might imitate humans by anticipating that action and moving to pass the stopped car on the right.
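The kind of cue-based intent prediction described here can be sketched as a toy heuristic. This is purely illustrative (the function, thresholds, and labels are my own invention, not Waymo's actual prediction model): a stopped car with its wheels angled left gets flagged as waiting to turn, even with no signal on.

```python
# Hypothetical sketch of cue-based intent prediction. Thresholds and labels
# are invented for illustration; this is not Waymo's actual model.

def predict_intent(speed_mps, wheel_angle_deg, left_signal_on):
    """Return a coarse intent label for an observed vehicle."""
    stopped = speed_mps < 0.5          # effectively not moving
    wheels_left = wheel_angle_deg < -10  # negative = steered left
    if stopped and (left_signal_on or wheels_left):
        return "waiting_to_turn_left"
    return "proceeding_straight"

# Stopped, wheels pointed left, no signal: still predicted to turn.
print(predict_intent(0.0, -25.0, False))  # waiting_to_turn_left
# Stopped with wheels straight: no turn predicted.
print(predict_intent(0.0, 0.0, False))    # proceeding_straight
```

A learned predictor would pick up the same correlation from data rather than from a hand-coded rule, which is the point of the comment above: the cue exists in human driving whether or not anyone programs it in.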
Waymo has published a ton about the imitation learning they've been using since 2018. They're not imitating random cars but their own drivers, who are paid to drive around and follow traffic laws.
Imitation alone isn't enough, so they layer heavy reinforcement learning and other techniques on top, but it's still a huge foundation to build on.