> Many of the best models are open source, just too big to run for most people.
You can find the open models hosted across different providers and pay per token to try them out.
I just don't see the open models as being at the same quality level as the best from Anthropic and OpenAI. They're good, but in my experience they're not as good as the benchmarks would suggest.
> $10k is a lot of money, but that ability to fit these very large models & get quality results is very impressive.
This is why I only appreciate the local LLM scene from a distance.
It’s really cool that this can be done, but $10K to run lower-quality models at slower speeds is a hard sell. I can rent a lot of hours on an on-demand cloud server for far less than that, or I can pay $20-$200/month and get great performance and good quality from Anthropic.
I think the local LLM scene is fun where it intersects with hardware I would buy anyway (a MacBook Pro with a lot of RAM), but spending $10K to run open models locally is a very expensive hobby.