If the hardware changes significantly and those sites don't exist in the future, wouldn't that mean Gemini would degrade in quality because it has nothing to pull from?
We've all tried to ask the LLM about something outside of its training data by now.
In that situation, they give the (wrong) answer that sounds the most plausible.
> because it has nothing to pull from?
Chat rooms produce trillions of tokens per day now, interactive tokens, where AI can poke and prod at us, and have its ideas tested in the real world (by us).
This then becomes the hardware manufacturer's problem. If their new hardware fails for too many users, it will no longer be purchased. If they externalize their problem solving like so many companies do, they won't be able to gain market share.
This creates financial incentives to pay the companies running the new version of search. You're thinking of this as a problem for these companies, when in reality it's a financial incentive.
Yea, so I’ve had an issue getting video output after boot on a new AMD R9700 Pro. None of the (albeit free-tier) models from OpenAI/Google/Anthropic have really been helpful. I found the pro drivers myself; the models never mentioned them.
That's not to say AI is bad. It’s great in many cases. It's more that I’m worried about what happens when the repositories of new knowledge get hollowed out.
Also my favorite response was this gem from Sonnet:
> TL;DR: Move your monitor cable from the motherboard to the graphics card.
Right, that success story is only because there was "organic" (for lack of a better term) information from an original source. What happens when all information is nth generation AI feedback with all links to the original source lost?
Edit: A question from AI/LLM ignorance: Can the source database for an LLM be one-way, in that it does not contain output from itself or other LLMs? I can imagine a quarantined database used for specific applications that remains curated, but this seems impossible on the open internet.