Hacker News

Maxatar yesterday at 8:35 PM

I have not found this to be the case. My company has some proprietary DSLs, and when we provide the spec of the language along with examples, the model picks it up and uses it in a very idiomatic manner. The total context needed is 41k tokens. That's not trivial, but it's also not that much, especially now that ChatGPT Codex and Gemini offer context lengths of 1 million tokens. Claude Code is very likely to offer 1 million tokens soon as well, and by this time next year I wouldn't be surprised if context windows reach 2-4x that amount.

The vast majority of tokens are not used for documentation or reference material but rather for reasoning/thinking. Unless you somehow design a programming language that is drastically different from anything that currently exists, you can safely bet that LLMs will pick it up with relative ease.


Replies

joshstrange yesterday at 9:24 PM

> Claude Code is very likely to soon offer 1 million tokens as well

You can do it today if you are willing to pay (API or on top of your subscription) [0]

> The 1M context window is currently in beta. Features, pricing, and availability may change.

> Extended context is available for:

> API and pay-as-you-go users: full access to 1M context

> Pro, Max, Teams, and Enterprise subscribers: available with extra usage enabled

> Selecting a 1M model does not immediately change billing. Your session uses standard rates until it exceeds 200K tokens of context. Beyond 200K tokens, requests are charged at long-context pricing with dedicated rate limits. For subscribers, tokens beyond 200K are billed as extra usage rather than through the subscription.
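One plausible reading of the quoted billing rule (standard rates up to 200K tokens, long-context rates for tokens beyond that) can be sketched as a small cost calculator. The rates below are made-up placeholders, not Anthropic's actual pricing, and the split-at-threshold interpretation is an assumption based on the quote:

```python
def context_cost(total_tokens: int,
                 standard_rate: float,
                 long_context_rate: float,
                 threshold: int = 200_000) -> float:
    """Estimate the cost of a request under a tiered scheme: tokens up to
    `threshold` bill at the standard rate, tokens beyond it at the
    long-context rate. Rates are dollars per million tokens and are
    hypothetical, not real Anthropic prices."""
    standard_tokens = min(total_tokens, threshold)
    long_tokens = max(total_tokens - threshold, 0)
    return (standard_tokens * standard_rate
            + long_tokens * long_context_rate) / 1_000_000

# Example with placeholder rates of $3/M standard and $6/M long-context:
print(context_cost(150_000, 3.0, 6.0))  # entirely below the 200K threshold
print(context_cost(300_000, 3.0, 6.0))  # 200K standard + 100K long-context
```

The point of the tiering is visible in the second call: only the 100K tokens past the threshold pick up the higher rate, so a session "uses standard rates until it exceeds 200K tokens of context," as the docs put it.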

[0] https://code.claude.com/docs/en/model-config#extended-contex...