
hu3 (today at 3:15 AM)

Indeed but:

1) That is comparatively slow.

2) It can also be done, even more simply, with SoTA models over an API.


Replies

yogthos (today at 3:35 AM)

Right, this works with any model. To me, the most interesting part is that you can use a smaller model running locally and get results comparable to SoTA models. Ultimately, I'd far prefer running locally, even if it's slower, for the simple reason of having sovereignty over my data.
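As a rough sketch of the point above: many local model servers (for example, llama.cpp's server or Ollama) expose an OpenAI-compatible chat endpoint, so moving from a hosted SoTA model to a local one is largely a matter of changing the base URL and model name. The endpoint and model names below are made up for illustration.

```python
def chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion request payload."""
    return {
        "url": f"{base_url}/v1/chat/completions",
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Hosted SoTA model (hypothetical provider and model names):
hosted = chat_request("https://api.example.com", "sota-large",
                      "Summarize this file.")

# The same request shape against a locally hosted smaller model:
local = chat_request("http://localhost:8080", "local-7b",
                     "Summarize this file.")
```

Only the URL and model name differ between the two requests; the message payload and client code stay the same, which is what makes a local model a drop-in replacement.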

Being reliant on a service means you have to share whatever you're working on with that service, and the provider decides what you can do and can change their terms of service on a whim.

If locally run models reach the point where they can serve as a daily driver, that solves the problem.