Did people really only compile once a night in the days of Visual Studio 6? There were Pentium 2s and 3s back then.
It was definitely on the order of hours for large codebases. The Microsoft Excel team handed out punishment “suckers” to whoever broke the build, since a broken build left 100+ people without a new working build to look at and test.
Linux kernel compiles in the 1990s were measured in hours, and that codebase was tiny compared to many. So, yep, builds were slow, slow enough to have an entire xkcd comic written about them.
In large C++ codebases of mediocre quality (the example I'm referring to is a manufacturer of large complex machines), yes.
People would compile their local unit locally, of course (a "unit" being a bunch of files grouped together in some hopefully-logical way). But they wouldn't be 100% sure it compiled correctly when integrated with the larger codebase until the nightly build ran. So if you didn't change the .h files you were pretty sure to be in the clear, but if you did, you had to be careful, and in the worst-case scenario you were stuck in a 1-day-per-step edit-compile-test loop for a week or so. I'm not entirely sure how they kept these compile failures from hurting other teams, and they didn't always manage it (I think they had some sort of layered build server setup, not too dissimilar from how GH Actions can do nightlies of "what if this PR were merged with main now").
Visual Studio 6 itself was pretty OK, actually. The UI was very limited (though therefore also fast enough), but compiling smallish projects went fine. In fact it was known to be a pretty fast compiler; I didn't mean to suggest that VC++6 implies overnight builds, they just coincided. Better-structured big-ish C++ projects (pimpl pattern, anyone?) could probably recompile pretty quickly on the computers of the day.
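For anyone who hasn't run into it, the pimpl trick is basically "keep the private guts out of the header so header edits stop cascading" - which is exactly the .h-file problem described above. Here's a minimal sketch, written in modern C++ for brevity (Widget/Impl are made-up names, and in the VC++6 era you'd have used a raw pointer and a manual delete rather than std::unique_ptr):

```cpp
// --- widget.h ---
// Callers include only this header. The private state sits behind a
// forward-declared Impl, so changing the internals never touches widget.h.
#pragma once
#include <memory>

class Widget {
public:
    Widget();
    ~Widget();                   // defined in the .cpp, where Impl is complete
    void draw() const;
private:
    struct Impl;                 // forward declaration only
    std::unique_ptr<Impl> impl_;
};

// --- widget.cpp ---
// Only this translation unit recompiles when Impl changes.
#include "widget.h"
#include <iostream>

struct Widget::Impl {
    int value = 42;              // private state can change freely here
};

Widget::Widget() : impl_(std::make_unique<Impl>()) {}
Widget::~Widget() = default;     // destroys Impl where its type is complete

void Widget::draw() const {
    std::cout << "Widget value: " << impl_->value << '\n';
}
```

The payoff is that translation units which include widget.h only recompile when the public interface changes, not when the private internals do - which is what kept header churn (and the resulting day-long rebuild cycles) in check.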