Hacker News

littlestymaar · today at 8:07 AM

The latency argument is terrible. Of course frontier LLMs are slow and costly. But you don't need Claude to drive a natural language interface, and an LLM with fewer than 5B parameters (or even <1B) is going to be much faster than this.


Replies

Al-Khwarizmi · today at 8:19 AM

And it's highly circumstantial, as LLM efficiency keeps improving as the tech matures.