> buying a piece of shrink-wrapped software and using it for the next 18 years
I'm wondering how that works. I have written software that was still being used 25 years later, but by then it was pretty much a "Ship of Theseus."
Old hardware or emulation of old operating systems on new hardware.
Quite common on old industrial machinery and other capital equipment like lab instruments. San Francisco's BART, for example, has to scrounge eBay for old motherboards that still support DMA to parallel ports via the southbridge, because it's too expensive to validate a new controller design.
Not all software can be sufficiently insulated from external changes, but almost all software I care about can be. My normal update cadence is every 2-3 years, and that's only because of a quirk in my package manager that makes it annoying for shiny new tools to coexist with tools requiring old dependencies. The most important software I use hasn't changed in a decade (i.e., those updates were no-ops), save for me updating some configurations and user scripts once in a while. I imagine that if I were older, the 18-year effective update cycle would happen naturally as well.
My gut reaction is that the software you're describing relies heavily on external integrations. Is that correct?