> then eventually it will go down enough that we will run on our LLMs locally and Anthropic will go out of business.
I want robust local LLMs as much as the next person: a 3.2GB Gemma E2B model handles my word completions as I type. It's gotten to the point where it knows what I'm going to type before I do!
But I don't see Anthropic going out of business anytime soon. As good as some of the open-source LLMs are, we're still a long way from being able to run frontier models at home.