
dathina · last Wednesday at 5:48 PM

> works only as a [generative] information retrieval

But even if, for the sake of argument, we assume that is true without question, LLMs are still here to stay.

Think about how junior devs with (in programming) average or less skill work: they "retrieve" the information about how to solve the problem from Stack Overflow, tutorials, etc.

So giving all your devs some reasonably well-done AI automation tools (not just a chat prompt!!) is like giving each of them a junior dev to delegate all the tedious simple tasks to, too. Without having to worry about that task not letting the junior dev grow and learn. And to top it off, if there is enough tooling in place (static code analysis, tests, etc.), the AI tooling will do the write code -> run tools -> fix issues loop just fine. And the price for that tool is what, maybe 1/30th of a junior dev? That means more time to focus on the things which matter, including teaching your actual junior devs ;)
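The write -> run tools -> fix loop described above can be sketched as a small driver. This is a minimal illustration, not any particular product's implementation: `run_checks` and `propose_fix` are hypothetical stand-ins for the real static analysis / test runner and the actual LLM call.

```python
# Sketch of the "write code -> run tools -> fix issues" loop.
# run_checks and propose_fix are hypothetical stand-ins: in a real
# setup, run_checks would invoke linters/tests (e.g. via subprocess)
# and propose_fix would prompt an LLM with the failure output.

def run_checks(code: str) -> list[str]:
    """Return a list of issues found by the tooling (stubbed here)."""
    issues = []
    if "TODO" in code:
        issues.append("unresolved TODO marker")
    return issues

def propose_fix(code: str, issues: list[str]) -> str:
    """Stand-in for an LLM call that rewrites code given the issues."""
    return code.replace("TODO", "done")

def fix_loop(code: str, max_rounds: int = 3) -> tuple[str, bool]:
    """Iterate run-tools/fix until the checks pass or rounds run out."""
    for _ in range(max_rounds):
        issues = run_checks(code)
        if not issues:
            return code, True
        code = propose_fix(code, issues)
    return code, not run_checks(code)
```

The point of the loop is exactly what the comment argues: with good enough tooling as the oracle, the model only has to converge on something the checks accept, not be right on the first try.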

And while I would argue AI isn't fully there yet, I think the current foundation models _might_ already be good enough to get there with the right ways of wiring them up and combining them.


Replies

slt2021 · last Wednesday at 10:09 PM

Programming languages are created by humans, and the training dataset is complete enough to train LLMs with good results. Most importantly, natural language text is the native domain of programming code.

Whereas in biology, the natural domain is the physical/chemical/biological reactions occurring between organisms and molecules. The laws of those interactions were not created by humans but by Creator(tm), so the training dataset captures barely a tiny fraction of the domain's richness and interactions. Because of this, any model will be inadequate.