I'd say it's more like intentionally choosing naive string interpolation for SQL queries over a trusted library's parameter substitution. Both work.
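For reference, the SQL analogy in concrete form — a minimal sketch using Python's stdlib `sqlite3`, showing how interpolation lets input become code while parameter substitution keeps it as data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "alice' OR '1'='1"  # attacker-controlled string

# Naive string interpolation: the input is parsed as part of the SQL
injected = conn.execute(
    f"SELECT * FROM users WHERE name = '{user_input}'"
).fetchall()
print(len(injected))  # 1 -- the OR clause matched every row

# Parameter substitution: the input stays data, never becomes SQL
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(len(safe))  # 0 -- no user is literally named that
```

The point of the reply below is that an LLM prompt has no such data/code boundary to enforce.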
There is no "parameter substitution" equivalent possible. Prompt injection isn't like SQL injection, it has no technical solution (that isn't AGI-complete).
Prompt injection is "social engineering" applied to LLMs. It's not a bug; it's fundamentally a facet of the general nature of the system, whether LLM or human. Mitigations can be applied, but at the cost of the system's generality and utility.