There's a fallacy often repeated about computers: "It's lasted a long time, so it's going to keep lasting a long time." The thing is, failure of computer hardware is often due to manufacturing flaws. There are many components that could have flaws, and they're subject to varying environmental stresses (both at build time and at run time), so there are many failure modes.
It's difficult to know exactly when a server might fail. It might be within a month of its build; it might be 50 years. But what's clear is that failure isn't less likely as the machine gets older, it's more likely. There are outliers, but they're rare. The failure modes for these things are well documented, and the whole machine is designed to fail within a certain number of hours: if it's not the hard drive, it's the fan, the CPU, the memory, the capacitors, the solder joints, etc. It doesn't get better as it ages.
But environmental stress is often a predictor of how long it lives. If the machine is cooled properly, kept in a low-humidity environment, jostled less, and run at low capacity (fans not spinning as hard, temperatures not as high, disks not written to as much, etc.), then it tends to live longer. So you can decrease the probability of failure, and it may live longer. But it also might drop dead tomorrow, because, again, there may be manufacturing flaws.
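The "more likely to fail as it ages" claim is the statement that the hazard rate is increasing. A common way to model wear-out is a Weibull hazard with shape parameter greater than 1; a minimal sketch, where the shape/scale numbers are illustrative assumptions and not measured hardware data:

```python
def weibull_hazard(t, shape, scale):
    """Instantaneous failure rate at age t, for a Weibull lifetime model.

    h(t) = (shape / scale) * (t / scale) ** (shape - 1)
    With shape > 1 the hazard rises with age (wear-out);
    with shape < 1 it falls (infant mortality from manufacturing flaws).
    """
    return (shape / scale) * (t / scale) ** (shape - 1)

# Wear-out dominated model: shape > 1, so hazard grows with age.
# Ages and scale are in arbitrary units (say, years); values are made up.
young = weibull_hazard(1.0, shape=2.0, scale=10.0)
old = weibull_hazard(8.0, shape=2.0, scale=10.0)
assert old > young  # failure gets more likely, not less, as the machine ages
```

Lower environmental stress can be thought of as stretching the scale parameter (the characteristic life), which lowers the hazard at every age but never drives it to zero.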
If given the choice, I wouldn't buy an old machine, because I don't know what kind of stress it's had, and the math is stacked against it.
> But what's clear is that failure isn't less likely as the machine gets older, it's more likely.
Is this true? Doesn't most hardware have a dip in failure rate in the middle of its average lifespan?
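The dip the question describes is the classic "bathtub curve": early failures (flawed units dying young) decline, random faults stay roughly constant, and wear-out rises late in life, so the combined rate is lowest in the middle. A toy sketch, assuming a sum of two Weibull hazards plus a constant (all numbers illustrative, not measured):

```python
def weibull_hazard(t, shape, scale):
    # Weibull hazard: (shape / scale) * (t / scale) ** (shape - 1)
    return (shape / scale) * (t / scale) ** (shape - 1)

def bathtub_hazard(t):
    """Combined failure rate: infant mortality + random faults + wear-out."""
    infant = weibull_hazard(t, shape=0.5, scale=2.0)   # falling with age
    random_faults = 0.01                               # roughly constant
    wear_out = weibull_hazard(t, shape=3.0, scale=10.0)  # rising with age
    return infant + random_faults + wear_out

early, mid, late = bathtub_hazard(0.1), bathtub_hazard(3.0), bathtub_hazard(9.0)
assert early > mid and late > mid  # the rate dips in the middle of life
```

So both commenters can be right: the rate does dip mid-life, but past that trough it only climbs, which is the regime an old second-hand machine is likely already in.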