Your electricity bill alone could justify the cost of a new computer if you're not shutting the old one down after every session.
An interesting point. Some random measurement puts it at 49W idle[1], which is probably close enough. I don't constantly compile stuff or stream video. At my local electricity rate of $0.072/kWh that works out to about $31 USD/year.
New systems idle at something like 25W according to a lazy search, so the difference is 49-25=24W, which works out to $15/year hypothetically saved by going to a newer system. But I live in a cold climate where the heating season runs about half the year, and I pay roughly half as much for gas heat as I would for electric, so the old machine's waste heat isn't entirely wasted. Knock a quarter off: 15-(15/4)=$11.25 USD hypothetically saved per year. I will leave it here as I don't know how much the hypothetical alternative computer would cost and, as already mentioned, I don't care.
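A quick sketch of the arithmetic above (the wattages and rates are the figures quoted in this thread, not measurements of mine; the post's $11.25 comes from rounding the savings to $15 before subtracting the quarter):

```python
def annual_cost_usd(watts, usd_per_kwh):
    """Cost of drawing `watts` continuously for a year."""
    return watts / 1000 * 24 * 365 * usd_per_kwh

rate = 0.072                                 # local rate, $/kWh
old_idle = annual_cost_usd(49, rate)         # ~$30.91/year for the old box at idle
delta = annual_cost_usd(49 - 25, rate)       # ~$15.14/year saved by a newer system
# heating season is ~half the year and gas heat costs ~half as much as electric,
# so the old machine's waste heat offsets about a quarter of the savings
adjusted = delta * (1 - 0.5 * 0.5)           # ~$11.35/year
```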
[1] https://forums.anandtech.com/threads/athlon-ii-x2-250-vs-ath...
65W TDP? Let's say we want to keep running a PC, so we switch to a newer low-end Ryzen with a 35W TDP and assume that's a 30W difference for the whole system. Let's say the system runs 24/7 with the CPU pulling its full TDP constantly. The average US residential electricity price is $0.18/kWh.
0.03 kW * 24 h * 365 d * $0.18 = $47.30/year
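The same arithmetic in code form (the 30W delta and the $0.18/kWh average are the assumptions stated above, and full-TDP-24/7 is deliberately a worst case):

```python
delta_kw = 0.030                       # assumed whole-system difference (65W vs 35W TDP)
rate = 0.18                            # average US residential $/kWh
annual = delta_kw * 24 * 365 * rate    # ~$47.30/year at constant full load
```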