
hnfong · yesterday at 11:06 PM

Reproducibility of results is also important in some cases.

There is consumer-ish hardware that can run large models like DeepSeek 3.x, albeit slowly. If you're using LLMs for a specific purpose that is well served by a particular model, you don't want to risk AI companies deprecating it in a couple of months and pushing you onto a newer model (which may or may not work better in your situation).

And even if the AI service providers nominally serve the same model, there are cases where keeping results highly reproducible requires running the same inference software, or even the same hardware.
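When you host the model yourself, you get to pin everything that affects the output. A minimal sketch with llama-cpp-python, assuming a locally downloaded GGUF file (the file name and parameters below are illustrative): the weights, quantization, sampling settings, and library version are all under your control, and the only remaining variables are the inference code and hardware themselves.

    # Sketch: pin the things that affect the output when self-hosting.
    # llama-cpp-python is assumed; the model file name is hypothetical.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./deepseek-v3-q4_k_m.gguf",  # exact weights + quantization on disk
        seed=42,                                 # fixed RNG seed
        n_ctx=4096,
    )

    # temperature=0.0 -> greedy decoding, so repeated runs give the same text
    out = llm("Classify this support ticket: ...", temperature=0.0, max_tokens=64)
    print(out["choices"][0]["text"])

Even then, bit-for-bit identical outputs can still depend on the library version and the hardware, which is exactly why controlling the whole stack matters.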

If you're just using OpenAI or Anthropic, you simply don't get that level of control.