If you just met a dog, you can't :) - my guess is LLMs are somewhere in between. It would be cool to see what happens if somebody tried to build an LLM with genuine ethical principles (rather than bolted-on guardrails) that is much less eager to please.
The stochastic parrot LLM is driven by nothing but eagerness to please. Fix that, and the parrot falls off its perch.