Hacker News

Sol- (yesterday at 6:29 PM)

With stuff like this, might be that all the infra build-out is insufficient. Inference demand will go up like crazy.


Replies

RGamma (yesterday at 7:02 PM)

Unlocking the next order of magnitude of software inefficiency!

Though I do hope the generated code will end up being better than what we have right now. It can't afford to get much worse; we can't afford all that RAM.

kylehotchkiss (yesterday at 6:30 PM)

It'd be nice if CC could figure out all the required permissions upfront and then let you queue the job to run overnight.

Der_Einzige (yesterday at 6:53 PM)

Anyone paying attention has known that demand for all types of compute that can run LLMs (i.e. GPUs, TPUs, hell even CPUs) was about to blow up, and it will remain extremely large for years to come.

It's just HN that's full of "I hate AI" or wrong contrarian types who refuse to acknowledge this. They will fail to reap what they didn't sow and will starve in this brave new world.
