Hacker News

mschuster91 · today at 8:35 AM · 1 reply

There are multiple problems here.

For one, not everyone in this world lives on a high-bandwidth, unmetered connection. In Germany, a lot of people are still on 16 Mbit/s ADSL; that's half an hour of full load just for AI garbage. At the average 50 Mbit/s, it's still 10 minutes. For those running on hotspots - be it a phone with often 10 GB or less on an average data plan, or a train hotspot that cuts you off after 200 MB - the situation is similarly dire.
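The back-of-the-envelope math checks out if you assume a download of roughly 3.6 GB - a size inferred here from the "half an hour at 16 Mbit/s" figure, not stated anywhere in the thread:

```python
# Download times for a large on-device AI model at common German link rates.
# MODEL_SIZE_GB is an assumption back-calculated from "30 min at 16 Mbit/s";
# actual model sizes vary.
MODEL_SIZE_GB = 3.6

def download_minutes(mbit_per_s: float, size_gb: float = MODEL_SIZE_GB) -> float:
    """Minutes to fetch size_gb at a sustained link rate of mbit_per_s."""
    size_mbit = size_gb * 8 * 1000  # GB -> megabits (decimal units)
    return size_mbit / mbit_per_s / 60

for link in (16, 50):
    print(f"{link} Mbit/s: {download_minutes(link):.0f} min")
# 16 Mbit/s: 30 min, 50 Mbit/s: 10 min
```

A download that size would also blow through a 200 MB train-hotspot cap roughly eighteen times over.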

The other thing is storage. I have a nominally 256 GB MacBook Air. Of those 256 GB, easily 50 GB are already gone to macOS itself, swap, Recovery, and everything macOS doesn't store on the immutable system partition (such as, you guessed it, its own AI models). Taking up 2% of the disk without consent is definitely Not Cool.
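That 2% understates the hit, since it's 2% of the nominal disk, not of what's actually left after the system's share. A quick sketch using the commenter's own numbers (the 50 GB system figure is their estimate):

```python
# Disk budget on a nominal 256 GB MacBook Air, per the commenter's numbers.
NOMINAL_GB = 256
SYSTEM_GB = 50                 # macOS, swap, Recovery, bundled models (estimate)
model_gb = 0.02 * NOMINAL_GB   # "2% of the disk" -> ~5 GB unsolicited download

usable_gb = NOMINAL_GB - SYSTEM_GB
share_of_usable = model_gb / usable_gb
print(f"model: {model_gb:.1f} GB, {share_of_usable:.1%} of usable space")
```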


Replies

keyringlight · today at 9:07 AM

Another angle is the processing cost: I assume Google is seeking to offload the computation for whatever features this covers from its own data centers to end users. At the scale of billions of users that's probably measurable and, from Google's side, worth doing whether the user is paying for the service or not - while each of those users gets higher power usage and somewhat reduced battery life on portable devices. At that scale I'd also wonder about overall efficiency, depending on what proportion of end users actually use the AI features, and whether they run on CPU, GPU, or NPU.