And even if you could guarantee that it asks permission before doing X, LLMs aren't reliable narrators of their own actions: the model's account of what it asked and what it did may not match the tool calls it actually made.
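The practical consequence is that the permission check has to live in the harness, keyed off the structured action about to execute, rather than in the model's self-report. Here is a minimal sketch of that idea in Python; the tool names, the `ToolCall` type, and the confirmation policy are all hypothetical illustrations, not any particular framework's API:

```python
# Sketch: permission gating enforced by the harness, so the guarantee
# does not depend on the model truthfully narrating its own actions.
# All names here (ToolCall, delete_file, REQUIRES_CONFIRMATION) are
# hypothetical, chosen for illustration only.

from dataclasses import dataclass
from typing import Callable


@dataclass
class ToolCall:
    name: str
    args: dict


def delete_file(path: str) -> str:
    # Stub for illustration; a real tool would perform the side effect.
    return f"deleted {path}"


TOOLS: dict[str, Callable[..., str]] = {"delete_file": delete_file}

# The policy lives in code the model cannot rewrite, not in the prompt.
REQUIRES_CONFIRMATION = {"delete_file"}


def dispatch(call: ToolCall) -> str:
    """Execute a tool call, pausing for human approval on risky actions.

    The check keys off the structured call the harness is about to run,
    never off the model's description of what it claims to be doing.
    """
    if call.name in REQUIRES_CONFIRMATION:
        answer = input(f"Agent wants {call.name}({call.args}). Allow? [y/N] ")
        if answer.strip().lower() != "y":
            return "denied by user"
    return TOOLS[call.name](**call.args)


if __name__ == "__main__":
    print(dispatch(ToolCall("delete_file", {"path": "/tmp/example.txt"})))
```

With this shape, even a model that insists it already asked (or that it never called the tool) can't bypass the gate, because the confirmation happens at the point of execution rather than in the conversation.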