Hacker News

IX-103 · yesterday at 11:21 AM · 1 reply

Yes, but they're also slower. As LLMs start to be used for more general-purpose tasks, they are becoming a productivity bottleneck. If I get a mostly right answer in a few seconds, that's much better than a perfect answer in five minutes.

Right now the delay for Google's AI coding assistant is long enough that humans context-switch and do something else while waiting. That's especially costly given that one of the main selling points of AI code assistants is rapid iteration.


Replies

janalsncm · yesterday at 2:14 PM

Anecdotally, Gemini Pro is way faster than GPT-5 Thinking, and Flash is faster still. I have no numbers, though.