Hacker News

idle_zealot today at 7:36 AM (2 replies)

> and only calling out to large models when they actually need the extra knowledge

When would you want a lossy encoding of lots of data bundled together with your reasoning? If it is true that reasoning can be done efficiently with fewer parameters, it seems like you would always want the model operating normal data search and retrieval tools to access knowledge, rather than risking hallucination.

And re: this discussion of large data centers versus local models, do recall that we already know it's possible to make a pretty darn clever reasoning model that's small and portable and made out of meat.


Replies

dryarzeg today at 9:51 AM

> we already know it's possible to make a pretty darn clever reasoning model

There is a problem, though: we know that it is possible, but we don't know how to do it (at least not yet, and as far as I am aware). So we know the answer to the "what?" question, but we don't know the answer to the "how?" question.

adrianN today at 8:05 AM

I wouldn't call brains, with the support infrastructure they need, small.