I like how the article describes how certificates work for both client and server. I know a little bit about this already, and what I read reinforced what I knew and taught me something new. I appreciate it when someone takes the time to explain things like this.
Why did LE make this change? It feels like a rather deliberate attack on the decentralised web.
Is there a reason why dialback isn't the answer?
I would think dialback is more secure than clientAuth certs: with dialback, an attacker who gets hold of a misissued cert would still have to actually execute a MitM attack to use it. In contrast, with a misissued clientAuth cert they can just connect to the server and present it.
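Roughly how dialback raises that bar, as a minimal Python sketch loosely following the XEP-0185 key recipe (the domains, secret, and stream ID below are made up):

    # Sketch of XMPP Server Dialback (XEP-0220) key handling, roughly
    # following the XEP-0185 recommendation. Illustrative values only.
    import hashlib
    import hmac

    def dialback_key(secret: str, receiving: str, originating: str, stream_id: str) -> str:
        """The key the originating server sends in <db:result>."""
        hmac_key = hashlib.sha256(secret.encode()).hexdigest().encode()
        text = f"{receiving} {originating} {stream_id}".encode()
        return hmac.new(hmac_key, text, hashlib.sha256).hexdigest()

    # Originating server (chat.example) opens a stream to the receiving
    # server (xmpp.example) and sends the key.
    key = dialback_key("local-secret", "xmpp.example", "chat.example", "stream-1234")

    # The receiving server does NOT take the key at face value. It looks up
    # chat.example in DNS, connects to the authoritative server it finds
    # there, and asks it to confirm the key in a <db:verify> exchange.
    def authoritative_verify(secret, receiving, originating, stream_id, presented_key) -> bool:
        expected = dialback_key(secret, receiving, originating, stream_id)
        return hmac.compare_digest(expected, presented_key)

    assert authoritative_verify("local-secret", "xmpp.example", "chat.example",
                                "stream-1234", key)

The point being: to get a forged assertion past this, the attacker has to control DNS or sit on the path of that callback connection, which is exactly the MitM requirement above.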
Another fun fact: the Mozilla root store, which I'd guess the vast majority of XMPP servers are using as their trust store, has ZERO rules governing clientAuth issuance[1]. CAs are allowed to issue clientAuth-only certificates under a technically-constrained non-TLS sub CA to anyone they want without any validation (as long as the check clears ;-). It has never been secure to accept the clientAuth EKU when using the Mozilla root store.
[1] https://www.mozilla.org/en-US/about/governance/policies/secu...
Shame LE didn't give people the option to generate client-only and client+server auth certs.
For those wondering whether ejabberd on Debian systems will be impacted: it seems there is no fix for now; the issue is being tracked here: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1127369
I wonder if issues like this couldn't be a use case for DANE.
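That's roughly what DANE would buy you: pin the s2s cert (or its key) in a TLSA record under DNSSEC and check the presented cert against it. A rough Python sketch using dnspython; the domain and record parameters are placeholders, and a real deployment would also need DNSSEC validation of the answer, which a plain resolver query doesn't give you:

    import hashlib
    import dns.resolver  # pip install dnspython

    def tlsa_matches(domain: str, der_cert: bytes, port: int = 5269) -> bool:
        # TLSA records for XMPP s2s live at _5269._tcp.<domain>.
        answers = dns.resolver.resolve(f"_{port}._tcp.{domain}", "TLSA")
        cert_sha256 = hashlib.sha256(der_cert).hexdigest()
        for rr in answers:
            # Only handle the common DANE-EE(3) / full-cert(0) / SHA-256(1) case.
            if rr.usage == 3 and rr.selector == 0 and rr.mtype == 1:
                if rr.cert.hex() == cert_sha256:
                    return True
        return False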
Client authentication with publicly-trusted certificates (i.e. ones chaining to roots in one of the 4 or 5 major trust-store programs) is bad. It doesn't actually authenticate anything at all, and never has.
No one who uses it is authenticating anything more than that the other party has an internet connection and the ability, perhaps, to read. No part of the Subject DN or SAN is checked. It's just that it's 'easy' to rely on an existing trust store rather than implement something secure with a private PKI.
Some providers who 'require' public TLS certs for mTLS even mandate specific products from specific CAs (OV or EV), not realising that both the CAs and their roots are going to rotate more frequently in future.
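To make the "nothing is checked" point concrete, here's a minimal sketch (placeholder paths and hostname, not a full server) of what a sane mTLS setup with a private CA looks like using Python's ssl module: trust only your own CA, then map the verified cert to an identity you actually meant to authorize.

    import ssl

    ALLOWED_CLIENTS = {"billing.internal.example"}  # identities we issued certs to

    def make_server_context() -> ssl.SSLContext:
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        ctx.load_cert_chain("server.pem", "server.key")
        ctx.verify_mode = ssl.CERT_REQUIRED           # demand a client cert
        ctx.load_verify_locations("private-ca.pem")   # trust ONLY our private CA,
                                                      # not a public root store
        return ctx

    def client_is_authorized(tls_sock: ssl.SSLSocket) -> bool:
        # getpeercert() is populated only after a verified handshake.
        cert = tls_sock.getpeercert()
        sans = {value for kind, value in cert.get("subjectAltName", ())
                if kind == "DNS"}
        # The step public-trust mTLS setups routinely skip: checking WHO the
        # cert identifies, not just that some CA signed it.
        return bool(sans & ALLOWED_CLIENTS)

With a public trust store the second half doesn't help much anyway, because anyone can get a cert for a name they control.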
I feel like using the Web PKI for client authentication doesn't really make sense in the first place. How do you verify that the common name / subject alt name actually matches anything when using a client cert?
Using the Web PKI for client certs seems like a recipe for disaster: servers would just verify that the cert is signed, but since anyone can get one signed, anyone can spoof.
And this isn't just hypothetical. I remember xmlsec (a library for validating XML signatures, primarily for SAML) used to use the Web PKI for signature validation in addition to the specified cert, which resulted in a lot of SAML bypasses where you could pass validation by signing the SAML response with any certificate from Let's Encrypt, including the attacker's.
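The fix, for what it's worth, is to pin the IdP's signing cert from metadata and reject everything else, rather than accepting anything that chains to a public root. A hedged sketch using the Python cryptography package (paths and names are placeholders; the pinned cert would then be the only one handed to the XML-signature library):

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes

    def is_pinned_idp_cert(presented_pem: bytes, pinned_pem: bytes) -> bool:
        """True only if the cert in the SAML response's KeyInfo is exactly
        the signing cert pinned from the IdP's metadata."""
        presented = x509.load_pem_x509_certificate(presented_pem)
        pinned = x509.load_pem_x509_certificate(pinned_pem)
        return presented.fingerprint(hashes.SHA256()) == pinned.fingerprint(hashes.SHA256())

    # Usage: read the IdP's signing cert once at startup and call this
    # before (or instead of) any chain-to-a-root check; an attacker's
    # perfectly valid Let's Encrypt cert fails here.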
From https://letsencrypt.org/2025/05/14/ending-tls-client-authent...
"This change is prompted by changes to Google Chrome’s root program requirements, which impose a June 2026 deadline to split TLS Client and Server Authentication into separate PKIs. Many uses of client authentication are better served by a private certificate authority, and so Let’s Encrypt is discontinuing support for TLS Client Authentication ahead of this deadline."
TL;DR blame Google
I really fail to understand or sympathize with Let's Encrypt limiting their certs like this. What is gained by slamming the door on applications other than servers being able to get certs?
In this case I do think it makes sense for servers to accept certs marked only for server auth, since it's an s2s use case. But this just feels like such unnecessary clamping down. To have finally made certs plentiful & available for use... then to take that away? Bother!
Is there any reason why things gravitate towards being web-centric, especially Google-centric? Google's browser policies triggered the LE change, and most CAs really just focus on what websites need rather than on non-web services. That isn't helpful, considering that browsers are now terribly inefficient (I mean come on, 1GB of RAM for 3 tabs of Firefox whilst still buffering?!), yet XMPP is significantly more lightweight and more featureful than, say, Discord.
Prosody is also the base of Snikket[1], a popular recent XMPP server. Snikket is basically just a Prosody config.[2]
[1] https://snikket.org/service/quickstart/
[2] https://github.com/snikket-im/snikket-server/blob/master/ans...