Hacker News

klysm last Friday at 10:53 PM

It's definitely interesting that some neural nets can reduce compute requirements, but that's certainly not making a dent in the LLM part of the pie.


Replies

lukeschlather yesterday at 12:07 AM

Sam Altman has made a lot of grandiose claims about how much power he's going to need to scale LLMs, but the evidence suggests the power required to train and operate them is far more modest than he'd have you believe. (DeepSeek-V3, for example, was reportedly trained for about $5.6M.)
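
For a rough sense of scale, here's a back-of-envelope sketch. The DeepSeek-V3 technical report cites ~2.788M H800 GPU-hours at an assumed $2/GPU-hour rental price, which is where the ~$5.6M figure comes from; the ~700 W per-GPU power draw below is my own assumption, and it ignores cooling and other datacenter overhead (PUE):

```python
# Back-of-envelope estimate of DeepSeek-V3's training cost and energy use.
# GPU-hours and $/GPU-hour come from the DeepSeek-V3 technical report;
# the per-GPU power draw is an assumption (~700 W board power for an H800).

GPU_HOURS = 2.788e6        # reported H800 GPU-hours for the full run
COST_PER_GPU_HOUR = 2.00   # USD, rental-price assumption from the report
GPU_POWER_KW = 0.7         # assumed ~700 W per GPU; excludes cooling/PUE

cost_usd = GPU_HOURS * COST_PER_GPU_HOUR
energy_gwh = GPU_HOURS * GPU_POWER_KW / 1e6  # kWh -> GWh

print(f"Estimated cost:   ${cost_usd / 1e6:.1f}M")  # ~$5.6M
print(f"Estimated energy: {energy_gwh:.1f} GWh")    # ~2 GWh
```

Roughly 2 GWh for the whole run, which is small next to the gigawatt-scale datacenter buildouts being pitched for LLM training.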
