Hacker News

maxloh, today at 10:59 AM (2 replies)

In my experience with Gemini, most of its capabilities stem from web searching rather than from anything it has already "learned." Even if you could obtain the model weights and run them locally, the quality of the output would likely drop significantly without that live data.

For local LLMs to really become "good enough for 99% of use cases," we would essentially depend on Google's blessing in the form of search APIs our local models could call. I don't think they have any interest in providing that.


Replies

amelius, today at 11:11 AM

That's not my experience at all. The AI component (as opposed to the knowledge component) is really what makes these models useful, and you could add search as a tool. Of course, that makes you dependent on a search provider; that much is true.
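For what "search as a tool" looks like in practice, here is a minimal sketch of a tool-dispatch loop: the model emits a JSON tool call, the runtime routes it to a local search function, and the result goes back into the model's context. Everything here (`run_search`, the tool-call shape, the canned index) is a hypothetical illustration, not any particular framework's API.

```python
import json

def run_search(query: str) -> list[str]:
    """Stub search backend; a real setup would call a search
    provider's API here (the dependency mentioned above)."""
    index = {
        "local llms": ["Local models can call external tools via JSON tool calls."],
    }
    return index.get(query.lower(), [])

# Registry mapping tool names the model may emit to local functions.
TOOLS = {"search": run_search}

def dispatch(tool_call_json: str) -> str:
    """Route a model-emitted tool call to the matching local function."""
    call = json.loads(tool_call_json)
    result = TOOLS[call["name"]](**call["arguments"])
    # In a real loop this would be appended to the conversation
    # as a tool message before the model generates its next turn.
    return json.dumps({"tool": call["name"], "result": result})

# A local model would emit something like this as its tool call:
print(dispatch('{"name": "search", "arguments": {"query": "local LLMs"}}'))
```

The point is that the model itself stays local; only the search backend is an external dependency you can swap out.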

kavalg, today at 11:34 AM

Unless you can provide a (community-)curated list of sources to search through (e.g. via MCP). Then I think local models could become really competitive.
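The curated-list idea above can be sketched as a simple allowlist filter between the search backend and the model: only results from community-vetted domains ever reach the model's context. The domain list and URLs below are illustrative assumptions, not a real curated list.

```python
from urllib.parse import urlparse

# Hypothetical community-curated allowlist of trusted source domains.
ALLOWED_DOMAINS = {"en.wikipedia.org", "docs.python.org"}

def filter_results(urls: list[str]) -> list[str]:
    """Keep only search results whose host is on the curated allowlist."""
    return [u for u in urls if urlparse(u).netloc in ALLOWED_DOMAINS]

results = [
    "https://en.wikipedia.org/wiki/Large_language_model",
    "https://example-spam.com/seo-page",
]
print(filter_results(results))  # only the Wikipedia URL survives
```

An MCP server wrapping a search tool could apply exactly this kind of filter server-side, so every client of that server inherits the curation.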