
skrebbel · today at 4:59 PM

In large C++ codebases of mediocre quality (the example I'm referring to is a manufacturer of large, complex machines), yes.

People would compile their own unit locally, of course (a "unit" being a bunch of files grouped together in some hopefully-logical way). But they couldn't be 100% sure it compiled correctly when integrated with the larger codebase until the nightly build ran. So if you didn't change the .h files you were pretty sure to be in the clear, but if you did, you had to be careful, and in the worst case you'd end up in a 1-day-per-step edit-compile-test loop for a week or so. I'm not entirely sure how they managed to keep these compile failures from hurting other teams, but they didn't always (I think they had some sort of layered build server setup, not too dissimilar from how GH Actions can do nightlies of a "what if this PR were merged with main now").
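(To make the .h-versus-.cpp distinction concrete, here's a made-up sketch; the file and type names are hypothetical, not from that codebase:

    // motor_state.h -- a shared header, included by dozens of units.
    // Change anything here and every translation unit that includes it,
    // directly or through another header, has to recompile on the next
    // full build.
    #pragma once

    struct MotorState {
        double rpm;
        double temperature;
        // adding a field here changes the struct's layout: every .cpp
        // that includes this header is stale until the nightly rebuilds it
    };

    // motor_state.cpp -- editing only this file recompiles one unit,
    // so it's very unlikely to break anyone else's build overnight.

That's the whole asymmetry: a .cpp edit is contained, a .h edit fans out across the codebase.)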

Visual Studio 6 itself was pretty OK actually. The UI was very limited (but therefore also fast enough), and compiling smallish projects went fine. In fact it was known to be a pretty fast compiler; I didn't mean to suggest that VC++6 implies overnight builds, they just coincided. Better-structured big-ish C++ projects (pimpl pattern, anyone?) could probably recompile pretty quickly on the computers of the day.
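(For anyone who hasn't run into it: pimpl keeps the private guts out of the header, so implementation churn doesn't recompile the world. A minimal sketch in modern spelling with std::unique_ptr; on VC++6 you'd have used a raw owning pointer instead, but the compile-firewall effect is the same. Names are hypothetical:

    // widget.h -- the public header stays stable: no private members,
    // no implementation-only includes leak out to clients.
    #pragma once
    #include <memory>

    class Widget {
    public:
        Widget();
        ~Widget();          // defined in widget.cpp, where Impl is complete
        void draw();
    private:
        struct Impl;        // forward declaration only
        std::unique_ptr<Impl> pimpl_;
    };

    // widget.cpp -- all the heavy includes and private state live here;
    // editing this file never forces clients of widget.h to recompile.
    #include "widget.h"
    #include <vector>

    struct Widget::Impl {
        std::vector<int> buffer;    // change freely; the header doesn't move
    };

    Widget::Widget() : pimpl_(std::make_unique<Impl>()) {}
    Widget::~Widget() = default;    // Impl is complete here, so this compiles
    void Widget::draw() { /* ... */ }

The tradeoff is an extra heap allocation and a pointer hop per object, which is why it tends to show up on the slowest-to-compile classes rather than everywhere.)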