In the past month local models have been ramping up in a major way, meanwhile the namesake providers have upped prices, gone offline randomly, and started doing slimier and slimier things.
I really think the future is local compute. Or at least self-hosted models.
What's the rough equivalent of a local model? Are we talking GPT-4?
The hosted ones still have the advantage of being able to search the internet for live info rather than being limited to a knowledge cutoff date.