Hacker News

sornaensis — yesterday at 11:52 PM (3 replies)

I'm curious what issues you had with Haskell? I've had the opposite experience and find LLMs dreadful at Java et al.

Surely, denser languages should be better for LLMs?


Replies

hgoel — today at 12:58 AM

The context window also limits how deeply the model can "think", and it does this thinking in natural language. So a language suited to LLMs would have balanced density: if it's too dense, the model spends many tokens working through the logic; if it's too sparse, it spends many tokens reading and writing the code.

I think in the context of already-trained LLMs, the languages most suited to LLMs are also the ones most suited to humans. Besides just having the most code to train on, humans face similar limitations: if the language is too dense, they have to be very careful in considering how to do something; if it's too sparse, the code becomes a pain to maintain.
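To make "density" concrete, here's a rough sketch. The two snippets (a "sum of squares" in Haskell point-free style vs. an equivalent Java method) and the regex tokenizer are illustrative assumptions on my part; real LLM tokenizers split text differently, but the ratio comes out similar:

```python
import re

# The same "sum of squares" logic, once dense, once verbose.
haskell = "sumSquares = sum . map (^2)"
java = """
int sumSquares(int[] xs) {
    int total = 0;
    for (int x : xs) {
        total += x * x;
    }
    return total;
}
"""

def rough_tokens(src):
    # Crude stand-in for an LLM tokenizer: words are one token each,
    # and every punctuation character counts as its own token.
    return re.findall(r"\w+|[^\w\s]", src)

print(len(rough_tokens(haskell)), len(rough_tokens(java)))
```

The dense version costs a few tokens to read but packs a lot of logic per token; the verbose one is cheap to reason about per line but burns context just restating structure. That's the balance being described above.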

danpalmer — today at 12:04 AM

Density is a double-edged sword. On the one hand you want to minimise context usage, but on the other hand more text on the page means more that the LLM can work with.

zem — today at 1:07 AM

My (uninformed) speculation is that you want resilience and error correction, which implies some level of redundancy rather than pure density.