Hacker News

Sevii | yesterday at 8:18 PM

Apple's goal is likely to run all inference locally. But models aren't good enough yet and there isn't enough RAM in an iPhone. They just need Gemini to buy time until those problems are resolved.
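A rough back-of-envelope sketch of the RAM point (assuming 4-bit quantized weights and the ~8 GB of RAM in current high-end iPhones; both figures are my assumptions, not anything Apple has stated):

    # Rough estimate: on-device LLM weight memory vs. iPhone RAM.
    # Assumes 4-bit quantization (~0.5 bytes per parameter) and ignores the
    # KV cache, activations, and OS/app footprint, which all tighten the budget.
    def weight_memory_gb(params_billions: float, bytes_per_param: float = 0.5) -> float:
        # params_billions * 1e9 params * bytes_per_param bytes / 1e9 bytes-per-GB
        return params_billions * bytes_per_param

    IPHONE_RAM_GB = 8  # assumption: current high-end iPhones
    for size in (3, 8, 30, 70):
        need = weight_memory_gb(size)
        print(f"{size}B params -> ~{need:.1f} GB of weights (fits in RAM: {need < IPHONE_RAM_GB})")

By this estimate only models in the single-digit-billion-parameter range fit comfortably on a phone, which is why anything Gemini-class has to run server-side for now.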


Replies

kennywinker | yesterday at 8:25 PM

That was their goal, but in the past couple of years they seem to have given up on client-side-only AI. Once they let that go, it became next to impossible to claw back to client-only, because as client-side AI gets better, so does server-side, and people's expectations scale with the server side. And everybody for whom this was a dealbreaker has already left the room.

O5vYtytb | yesterday at 11:03 PM

Well, DRAM prices aren't going down anytime soon, so I see this as quite the push away from local inference.