Hacker News

pembrook · yesterday at 10:07 PM

My guess is that's off by a bit, but sure, let's assume it's true.

Now measure the amount of electricity the same prompt will use in 6 years when both algorithmic efficiency and 3-4 generations of silicon lower that by 95% (or more).

Will your microwave become 95% more efficient over the next 6 years? No.

Also, how many video prompts will the average person run in a given year? Almost certainly zero. I use AI heavily every day, and I've played with AI video fewer than four times, ever.

Yet the average person will certainly use 20,000-100,000 microwave minutes over their lifetime. I run my microwave for 2-3 minutes every day at lunch, for example.
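The comparison above can be sketched as back-of-envelope arithmetic. The microwave figures come from the comment itself; the per-prompt energy for AI video is a placeholder assumption, not a measured number:

```python
# Illustrative comparison of annual microwave energy vs. AI video prompts.
# MICROWAVE_WATTS and VIDEO_PROMPT_WH are assumed values for the sketch.

MICROWAVE_WATTS = 1_100        # assumed typical microwave power draw, W
DAILY_MICROWAVE_MIN = 2.5      # "2-3 minutes every day at lunch"
VIDEO_PROMPT_WH = 100          # hypothetical energy per AI video prompt, Wh

# Daily minutes -> hours, times power, times days per year
microwave_wh_per_year = MICROWAVE_WATTS * (DAILY_MICROWAVE_MIN / 60) * 365

# How many video prompts would match a year of microwave use?
equivalent_prompts = microwave_wh_per_year / VIDEO_PROMPT_WH

print(f"Microwave: {microwave_wh_per_year:,.0f} Wh/year")
print(f"Equivalent video prompts: {equivalent_prompts:,.0f}")
```

Under these assumptions, a year of daily microwave use lands in the tens of kWh, comparable to a few hundred video prompts; swap in your own per-prompt figure to test the claim either way.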

From first principles, the idea that electricity use = bad is wrong. If your electricity comes from burning coal or lignite, then yes, using it obviously has bad externalities.

But a French person running their microwave on a nuclear-powered grid? That's fine. Dirty energy sources are the problem, not energy use itself.


Replies

adrr · yesterday at 10:32 PM

Are these companies going to toss a $500B+ infrastructure investment away in the next 6 years? What's the average lifespan of an AI compute node?
