Hacker News

withzombies, today at 2:00 PM

Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

It's the kind of dogfooding they should be doing! It's one reason people care so much about self-hosted compilers: self-hosting is a demonstration of the maturity of the language and compiler.


Replies

cogman10, today at 2:07 PM

There's a bootstrapping process that has to happen to compile the compiler. Moving up the language-standard chain requires that the compilers used to compile the compiler also move up the chain.

So you can never be perfectly bleeding edge, since that would keep you from building your compiler with an older compiler that doesn't support those bleeding-edge features.

Imagine, for example, that you are Debian and you want to prep for the next stable release. It's reasonable that you'd bootstrap the next release with the prior release's toolset. That gives you a stable starting point.

unclad5968, today at 2:03 PM

Well, there are still some C++20 items that aren't fully supported, at least according to cppreference:

https://en.cppreference.com/w/cpp/compiler_support/20.html

andsoitis, today at 2:08 PM

> Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

C++ standards support and why C++23 and C++26 are not the default: https://gcc.gnu.org/projects/cxx-status.html

dagmx, today at 3:31 PM

This is about changing the default.

The issue with defaults is that people have projects that implicitly expect the default to be static.

So when the default changes, many projects break. This is maybe fine if it’s your own project but when it’s a few dependencies deep, it becomes more of an issue to fix.

BeetleB, today at 4:35 PM

> What is the downside of switching to the newest standard when it's properly supported?

"Properly supported" is the key here. Does GCC currently properly support C++23, for example? When I checked a few months ago, it didn't.
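Because support arrives piecemeal, code that wants a specific C++23 feature can probe for it instead of assuming the whole standard is implemented. A minimal sketch using the standard feature-test macros (`<version>` and `__cpp_lib_print` are standard; the function name is made up for illustration):

```cpp
#include <string>
#if defined(__has_include)
#  if __has_include(<version>)
#    include <version>  // defines __cpp_lib_* feature-test macros (C++20+)
#  endif
#endif

// Report whether this toolchain's standard library provides the
// C++23 std::print facility, without requiring -std=c++23 to compile.
std::string print_support() {
#ifdef __cpp_lib_print
    return "std::print available";
#else
    return "std::print not available";
#endif
}
```

This is how portable code tends to cope with the gap the comment describes: gate each feature individually rather than on the `-std=` flag.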

1718627440, today at 2:08 PM

> What is the downside of switching to the newest standard when it's properly supported?

They are discussing in this email thread whether it is already properly supported.

> It's one reason why people care so much about self-hosted compilers

For self-hosting and bootstrapping, you want the compiler to be compilable with as old a version as possible.

binary132, today at 3:28 PM

A lot of software, and thus build automation, will break due to certain features that become warnings or outright errors in new versions of C++. It may or may not be a lot of work to change that, and it may or may not even be possible in some cases. We would all like there to be unlimited developer time, but in real life software needs a maintainer.

superkuh, today at 2:43 PM

When a language changes significantly faster than release cycles (e.g., Rust being a different compiler every 3 months), distros cannot self-host if they use that language in their software. For example, with Debian's Apt now having Rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt, since nearly all Rust devs target the bleeding edge. The entire language culture is built around this rapid improvement.

I love that C++ has a long enough time between changing targets to actually be useful, and that its culture is about stability and usefulness for users trying to compile things rather than dev-side improvements uber alles.

ajross, today at 2:33 PM

> What is the downside of switching to the newest standard when it's properly supported?

Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.

[1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not. Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.

[2] Something where C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, where C really doesn't (new keywords are added in an underscored namespace and you have to use new headers to expose them with the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.

[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.

hulitu, today at 5:02 PM

> Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

Cursing, because the old program does not compile anymore. So: no.
