Hacker News

2ndorderthought · today at 12:15 AM · 2 replies

In the past month, local models have been ramping up in a major way, while the name-brand providers have raised prices, gone offline at random, and done slimier and slimier things.

I really think the future is local compute, or at least self-hosted models.


Replies

SchemaLoad · today at 12:22 AM

The hosted ones still have the advantage of being able to search the internet for live info rather than being limited to a knowledge cutoff date.

CSMastermind · today at 12:24 AM

What's the rough equivalent of a local model? Are we talking GPT-4?
