Hacker News

phendrenad2 · yesterday at 10:28 PM

I don't really understand why AI providers don't charge like the electric company, or AWS: instead of raising usage limits, just charge less for off-peak use.
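The electric-company model being suggested is time-of-use pricing. A minimal sketch of what that could look like for token billing (all rates, hours, and function names here are invented for illustration, not any provider's actual scheme):

```python
# Hypothetical time-of-use pricing for API tokens, modeled on
# electricity tariffs. Rates and peak hours are made up.
from datetime import datetime, timezone

PEAK_RATE = 10.0          # $ per million tokens during peak hours
OFF_PEAK_RATE = 4.0       # $ per million tokens off-peak
PEAK_HOURS = range(8, 20)  # 08:00-19:59 UTC counts as peak

def token_cost(millions_of_tokens: float, when: datetime) -> float:
    """Charge for a request, discounted outside peak hours."""
    rate = PEAK_RATE if when.hour in PEAK_HOURS else OFF_PEAK_RATE
    return millions_of_tokens * rate

# A 2M-token batch job submitted at 03:00 UTC gets the off-peak rate.
night = datetime(2025, 1, 1, 3, tzinfo=timezone.utc)
print(token_cost(2.0, night))  # → 8.0
```

In practice the "off-hours" window would have to track the provider's actual load curve rather than a fixed UTC range.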


Replies

lxgr · yesterday at 10:41 PM

LLM inference is much more geographically fungible than electricity, so maybe it's just not worth the complexity yet, and there's enough (not highly latency-sensitive) load on average globally.
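The fungibility point can be made concrete: unlike electricity, a latency-insensitive inference job can simply be routed to whichever region has spare capacity at that moment, flattening the load curve without any pricing signal. A toy sketch (region names and utilization figures are invented):

```python
# Toy illustration of geographic load balancing for inference:
# send a batch job to the region with the most spare capacity.
# Region names and utilization numbers are made up.

def pick_region(utilization: dict[str, float]) -> str:
    """Return the region with the lowest current utilization."""
    return min(utilization, key=utilization.get)

load = {"us-east": 0.92, "eu-west": 0.35, "ap-south": 0.60}
print(pick_region(load))  # → eu-west
```

Electricity grids can't do this cheaply across continents, which is why they need time-of-use tariffs to shift demand instead.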