If piping unfiltered user into exec() is a security nightmare, so is piping unfiltered user input into an LLM that can interact with your systems, except in this case you just have to ask it nicely for it to perform the attack, and it will work out how to do the attack for you