
kburman · today at 6:59 AM

An LLM is optimized for its training data, not for newly built formats or abstractions. I don’t understand why we keep building so-called "LLM-optimized" X or Y. It’s the same story we’ve seen before with TOON.
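For context, TOON (token-oriented object notation, as I understand it) is one of those "LLM-optimized" formats: reshape JSON-like data so it costs fewer tokens. A minimal sketch of the idea in Python, assuming the `tiktoken` package is installed and hand-approximating the TOON layout rather than using an official encoder:

```python
import json

import tiktoken  # assumption: OpenAI's tokenizer library is installed

records = [
    {"id": 1, "name": "Alice", "role": "admin"},
    {"id": 2, "name": "Bob", "role": "user"},
]

# Standard JSON: keys repeated in every record.
as_json = json.dumps(records)

# TOON-style tabular layout (hand-written approximation):
# declare the fields once, then emit one comma-separated row per record.
as_toon = "users[2]{id,name,role}:\n  1,Alice,admin\n  2,Bob,user"

enc = tiktoken.get_encoding("cl100k_base")
for label, text in [("json", as_json), ("toon-like", as_toon)]:
    print(f"{label}: {len(enc.encode(text))} tokens")
```

The tabular form usually tokenizes shorter because the keys appear once instead of per record, but the model has seen orders of magnitude more plain JSON than any such format during training, which is the point.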


Replies

ImJasonH · today at 1:41 PM

Yeah, FWIW I agree. I was impressed by how well the agents were able to understand and write their invented language, but fundamentally they're only able to do that because they've been trained on "similar" code in many other languages.