Great news. Now maybe we can go on the offensive for once: work to enable constitutional protections against this sort of thing, and develop systems that can work around it if and when it comes back.
There are places in the world today where only sneakernet communication has any semblance of privacy, so we need non-specialist tools that can provide privacy and secrecy regardless of local conditions. (I’d love to see more communication tools that don’t assume an always-on connection, low latency, or other first-world conditions.)
Many countries have such protections, for instance Germany. They could actually issue arrest warrants for everyone involved, since Chat Control amounts to an attempt at terrorism (an act of indiscriminate violence for ideological gain) against the German people, and that is illegal. The problem is widespread apathy and a lack of will to act.
What we have at the moment is the protection given by the European Convention on Human Rights. The general problem, however, is that it grants law enforcement exceptions to infringe on such rights, as long as the law is "done for a good reason – like national security or public safety." (https://www.coe.int/en/web/impact-convention-human-rights/ri...)
It is near-universally held by technology experts and legal experts that Chat Control is not effective for its stated purpose: it does not make it easier to find and stop abuse of children, nor does it meaningfully reduce the spread of CSAM. That makes the law unnecessary, and thus illegal. However, this hinges on that interpretation. Law enforcement officials and lobbyists for firms selling technology solutions claim the opposite, and politicians who want to show a strong hand against child exploitation will use/abuse those alternative views to push it through.
Removing the "done for a good reason" exception would likely be a massive undertaking. Rather than constitutional protections, I think the more promising path is stronger IT security, cybersecurity regulation, and data protection, so that governments and companies carry a larger risk when accessing private data. A scanner with a high rate of false positives should be a liability nightmare, not an opportunity for firms to sell a false promise to politicians. Cybersecurity regulation should also dictate that new legislation must not increase risk to citizens. One would assume that to be obvious, but history has sadly shown the opposite, with governments producing malware and hoarding software vulnerabilities. If there must be exceptions to privacy, "for good reason", they must not come at the cost of public safety.
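To see why a scanner's false-positive rate matters so much at mass scale, here is a minimal base-rate calculation. All the numbers (message volume, prevalence of illegal content, detection rates) are illustrative assumptions, not real figures, but the shape of the result holds for any realistic values:

```python
# Illustrative base-rate sketch: even a seemingly "low" false-positive
# rate means a mass scanner flags overwhelmingly innocent content.
# Every number below is an assumption for illustration only.

messages_per_day = 1_000_000_000   # messages scanned daily (assumed)
prevalence = 1e-6                  # fraction that is actually illegal (assumed)
true_positive_rate = 0.90          # scanner sensitivity (assumed)
false_positive_rate = 0.001        # 0.1% of innocent messages flagged (assumed)

actual_bad = messages_per_day * prevalence
flagged_bad = actual_bad * true_positive_rate
flagged_innocent = (messages_per_day - actual_bad) * false_positive_rate

# Precision: of all flags raised, what fraction is actually correct?
precision = flagged_bad / (flagged_bad + flagged_innocent)

print(f"flags per day:      {flagged_bad + flagged_innocent:,.0f}")
print(f"of which innocent:  {flagged_innocent:,.0f}")
print(f"precision:          {precision:.4%}")
```

Under these assumed numbers, roughly a million flags are raised per day and well under 0.1% of them are correct, which is exactly the kind of exposure that liability rules could attach a real cost to.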