
kilroy123 · yesterday at 3:58 PM

Thank you. All these people applauding Apple for not jumping on the bandwagon.

When in reality, they _wanted_ to, but have become so organizationally dysfunctional that they weren't able to. Kind of funny how that worked out.

I still think they're really dropping the ball. They could have local models running on devices, interfacing with a big cloud partner (Google, OpenAI, etc.). Make Siri awesome. But no.


Replies

user34283 · yesterday at 5:16 PM

There is no use case for local models.

See Gemini Nano. It is available in custom apps, but the results are so bad that factual errors and hallucinations make it useless. I can see why Google did not roll it out to users.

Even if it were significantly better, on-device inference would still be slow. Adding a few milliseconds of network latency to contact a server and get a vastly superior result is going to be preferable in nearly all scenarios.
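The latency tradeoff can be sketched with a back-of-the-envelope calculation. All numbers below are illustrative assumptions (an on-device model decoding ~15 tokens/s versus a datacenter model at ~80 tokens/s behind a ~100 ms round trip), not measurements:

```python
# Rough comparison of on-device vs. server inference time.
# All figures are assumptions for illustration, not benchmarks.

def response_time(tokens: int, tokens_per_sec: float, network_rtt_s: float = 0.0) -> float:
    """Total seconds to generate `tokens`, including any network round trip."""
    return network_rtt_s + tokens / tokens_per_sec

LOCAL_TPS = 15.0    # assumed on-device decode speed
SERVER_TPS = 80.0   # assumed datacenter decode speed
RTT = 0.1           # assumed 100 ms network round trip

tokens = 200  # a short answer
print(f"local:  {response_time(tokens, LOCAL_TPS):.1f}s")        # ~13.3s
print(f"server: {response_time(tokens, SERVER_TPS, RTT):.1f}s")  # ~2.6s
```

Under these assumptions, the fixed network cost is dwarfed by the per-token speed gap, which is the point being made above; a much faster local model or a much longer round trip would shift the balance.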

Arguments can be made for privacy or for offline use, but those probably do not matter to most people.
