> to disprove a claim made by uninformed people
A KDE developer wrote recently:
> X11 isn’t able to perform up to the standards of what people expect today with respect to … 10 bits-per-color monitors, … multi-monitor setups (especially with mixed DPIs or refresh rates) … [1]
Multi-monitor setups have been working for 20+ years. 10-bit color is also supported (otherwise, how would the PRO versions of graphics cards expose this feature?).
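For reference, 10-bit output on X11 is usually requested by setting a 30-bit default depth (10 bits per channel) in an xorg.conf snippet; a sketch, with a hypothetical file name, and driver support still required for it to take effect:

```
# /etc/X11/xorg.conf.d/30-depth30.conf  (hypothetical path)
Section "Screen"
    Identifier "Screen0"
    DefaultDepth 30        # 30-bit = 10 bits per channel
    SubSection "Display"
        Depth 30
    EndSubSection
EndSection
```

Whether individual toolkits then render correctly at depth 30 is a separate question, as noted below.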
> chose to put its energy into bringing it all together in
I cannot recall: was there ever a paper analyzing why the working and almost-working X11 features do not fit together, why a few additional X11 extensions could no longer be proposed, and why another solution from scratch was inevitable? What is the significant difference between an X11 and a Wayland protocol extension?
[1] https://pointieststick.com/2025/06/21/about-plasmas-x11-sess...
Multi-monitor with mixed DPIs absolutely does not work well in X11 in 2025. I don’t know about 20+ years ago.
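For context on why mixed DPI is the sore spot: on X11 it is typically approximated with per-output scaling via xrandr, along these lines (a sketch; the output names are hypothetical, check `xrandr --query` for yours):

```
# Hypothetical output names; adjust to your setup.
# Upscale the low-DPI monitor 2x so it matches a HiDPI panel next to it:
xrandr --output DP-1   --mode 3840x2160 --pos 0x0 \
       --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 3840x0
```

This scales the entire output rather than letting applications render per-monitor, fractional factors tend to blur, and toolkits still see a single global DPI, which is largely what "does not work well" means here.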
Ironically, when I tried to set my X11 to 10bpp, KDE was the main thing that shat itself while other (i.e. OpenGL-based) programs rendered correctly.
Nate (the author of the blog post you linked), who I know personally very well, is a QA/product person focused on integration and fit and finish issues. What he means to say is that as a polished product, this is now available in the form of a Wayland-based desktop session without fiddling, while the same cannot be said of X11-based ones. It's meant as a pragmatic take, not as a history lesson.
That's quite similar to how I chose to phrase it, and comes down to where the community chose to spend the effort to solve all the integration issues to make it so.
Did the community decide that after a long soul-searching process that ended with the conclusion that things were impossible to make happen in X11, and does that paper you invoke exist? No, not really. Conversations like this certainly did take place, but I would say more in informal settings, e.g. discussions on lists and at places like the X.org conference. Plenty of "Does it make sense to do that in X11 still, or do we start over?" chatter in both back in the day.
If I recall right, the most serious effort was a couple of people taking a few weeks to entertain a "Could we fix this in an X12 and how much would that break?" scenario. Digging up the old fdo wiki pages on that one would for sure be interesting for the history books.
The closest analogue I can think of that most in the HN audience are familiar with is probably the Python 2->3 transition and the decision to clean things up at the expense of backward compat. To this day, you will of course find folks arguing emotionally on either side of the Python argument as well.
For the most part, the story of how this happened is a bit simpler: It used to be that the most used X11 display server was a huge monolith that did many things the kernel would not, all the way to crazy things like managing PCI bus access in user space.
This slowly changed over the years, with strengthening kernel infra like DRM, the appearance of Kernel Mode Setting, and the evolution of libraries like Mesa. Suddenly, implementing a display server became a much simpler affair that could mostly call into a bunch of stuff elsewhere.
This created an opening for a new, smaller project fully focused on the wire protocol and protocol semantics, throwing away a lot of old baggage and code. Someone took the time to do that and demonstrate what it looks like, other people liked what they saw, and Wayland was born.
This also means: plenty of the useful code of the X11 era actually still exists. One of the biggest myths is that Wayland somehow started over from scratch. A lot of the aforementioned stuff that migrated over the years from the X11 server to e.g. the kernel is obviously still what makes things work now, and libraries such as libinput and xkbcommon, which nearly every Wayland display server implementation uses, were likewise factored out of the X11 stack.