
jeffbee · yesterday at 4:18 PM · 1 reply

Seems to be a mental mishmash. For one thing, they are taking it as given that temperature is relevant to device lifetime, but Google's FAST 2007 paper said "higher temperatures are not associated with higher failure rates".

Second weird thing: it says cooling accounts for 40% of data center power usage, but this comes right after discussing PUE without grounding PUE in any concrete numbers. State-of-the-art PUE is below 1.1. The article then links to a pretty flimsy source that actually says server loads are 40% ... which implies a PUE of 2.5. That could be true for global IT loads including small commercial server rooms, but it hardly seems relevant when discussing new builds of large facilities.
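For anyone following the arithmetic: PUE is total facility power divided by IT equipment power, so "server loads are 40% of the total" implies PUE = 1/0.4 = 2.5. A minimal sketch in Python (figures from this comment, not the article):

    def implied_pue(it_share):
        # PUE = total facility power / IT power; if IT loads are it_share
        # of the total, PUE is simply the reciprocal.
        return 1.0 / it_share

    print(implied_pue(0.40))  # 2.5 -- implied by "server loads are 40%"
    # At a state-of-the-art PUE of 1.1, non-IT overhead (cooling, power
    # distribution, lighting) is 1 - 1/1.1, roughly 9% of total power.
    print(1 - 1 / 1.1)        # ~0.09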

Finally, it's irritating when these articles are grounded in equivalents of American homes. The fact is that a home just doesn't use a lot of energy, so it's a silly unit of measure. These figures should be based on something that actually uses energy, like cars or aircraft or something.
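To put numbers on why homes are such a small unit, a rough back-of-the-envelope in Python (the ~10,500 kWh/year household figure is a ballpark US-average assumption, and the 100 MW facility is hypothetical):

    HOURS_PER_YEAR = 8760
    home_avg_kw = 10_500 / HOURS_PER_YEAR  # ~1.2 kW continuous draw per home

    datacenter_mw = 100  # hypothetical large facility
    homes_equivalent = datacenter_mw * 1000 / home_avg_kw
    print(round(home_avg_kw, 2), round(homes_equivalent))  # ~1.2 kW, ~83,000 homes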


Replies

dijit · yesterday at 4:36 PM

> Seems to be a mental mishmash. For one thing, they are taking it as given that temperature is relevant to device lifetime, but Google's FAST 2007 paper said "higher temperatures are not associated with higher failure rates".

Google has been wrong a couple of times, and this is one area where I think their claim (made 18 years ago, btw) might since have had some time to meet the rubber of reality.

Google also famously chose to disavow ECC as mandatory[0] but then quietly changed course[1].

In fact, even within the field of memory, higher temperatures cause more errors[2], and leakage is more common at higher temperatures in dense lithographic electronics (memory controllers, CPUs)[3].

Regardless: thermal expansion and contraction will degrade basically any material I can think of, so if you can utilise the machines at a consistent load and maintain a steady temperature, then maybe the hardware doesn't age as aggressively as our desktop PCs that play games, assuming there's no voltage leakage going on to crash things.
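For a concrete version of the steady-temperature effect (not the expansion/contraction cycling above, which is usually modeled separately, e.g. Coffin-Manson), reliability engineering commonly uses the Arrhenius acceleration model. A sketch in Python; the 0.7 eV activation energy is an illustrative textbook value, not a measured one:

    import math

    BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

    def acceleration_factor(t_cool_c, t_hot_c, ea_ev=0.7):
        # Multiplier on the modeled failure rate when running at t_hot_c
        # instead of t_cool_c. Activation energy varies widely by component
        # and failure mode; 0.7 eV is just a common illustrative value.
        t_cool = t_cool_c + 273.15
        t_hot = t_hot_c + 273.15
        return math.exp((ea_ev / BOLTZMANN_EV) * (1 / t_cool - 1 / t_hot))

    print(acceleration_factor(25, 45))  # ~5.5x per this model for 20 C hotter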

[0]: https://www.cs.toronto.edu/~bianca/papers/sigmetrics09.pdf

[1]: https://news.ycombinator.com/item?id=14206811

[2]: https://dramsec.ethz.ch/papers/mathur-dramsec22.pdf

[3]: https://www.researchgate.net/publication/271300947_Analysis_...
