No. Absolutely not. The opposite, in fact. Your bash script is deterministic: you can send it to 20 AIs or have someone fluent in shell read it, and then you can be confident it's safe.
An LLM will run whatever command is probabilistically likely each time. It's like Excel's ridiculous feature that lets Copilot populate a cell directly, rather than having the AI generate a deterministic formula you can inspect.
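To make the contrast concrete, here's a minimal sketch. The script's commands are fixed in the file and reviewable before anything runs; the commented-out alternative (using a hypothetical `llm` CLI, named here only for illustration) executes whatever the model happens to sample on that run, so no prior review can tell you what it will do.

```bash
#!/usr/bin/env bash
# Deterministic: every command is written down and can be audited
# (by a person or by 20 AIs) before it ever executes.
set -euo pipefail

backup_dir="$HOME/backups/$(date +%F)"
mkdir -p "$backup_dir"
rsync -a --delete "$HOME/projects/" "$backup_dir/"

# Probabilistic alternative (hypothetical "llm" command, for contrast only):
# the command actually run depends on what the model samples this time,
# so reviewing this line tells you nothing about what will execute.
# eval "$(llm 'write a shell command that backs up my projects directory')"
```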