
belorn · yesterday at 1:09 AM

What we have at the moment is the protection given by the European Convention on Human Rights. The general problem, however, is that it grants exceptions that allow law enforcement to infringe on such rights, as long as the law is "done for a good reason – like national security or public safety." (https://www.coe.int/en/web/impact-convention-human-rights/ri...)

It is almost universally held by technology experts and legal experts that Chat Control is not effective for its stated purpose. It does not make it easier to find and stop abuse of children, nor does it meaningfully reduce the spread of CSAM. That makes the law unnecessary, and thus illegal. However, everything hinges on that interpretation. Law enforcement officials and lobbyists for firms selling technology solutions claim the opposite, and politicians who want to show a strong hand against child exploitation will use/abuse those alternative views in order to push it through.

Removing the "done for a good reason" exception would likely be a massive undertaking. Rather than constitutional protections, I think the more likely successful path is stronger IT security, cybersecurity regulation, and data protection, so that governments and companies carry a larger risk when accessing private data. A scanner that carries a high rate of false positives should be a liability nightmare, not an opportunity for firms to sell a false promise to politicians. Cybersecurity regulation should also dictate that new legislation must not increase risk to citizens. One would assume that to be obvious, but history has sadly shown the opposite, with governments producing malware and hoarding software vulnerabilities. If there must be exceptions to privacy, "for good reason", they must not come at the cost of public safety.
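
To make the false-positive point concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it (prevalence, detection rate, false-positive rate) is an assumption chosen purely for illustration, not a measurement of any real scanner:

    # Illustrative base-rate arithmetic: why a scanner with a seemingly low
    # false-positive rate still produces mostly false reports when it scans
    # enormous volumes of innocent messages. All figures are assumptions.

    def precision(prevalence: float, tpr: float, fpr: float) -> float:
        """Probability that a flagged message is actually abusive (Bayes' rule)."""
        true_pos = prevalence * tpr          # abusive messages correctly flagged
        false_pos = (1 - prevalence) * fpr   # innocent messages wrongly flagged
        return true_pos / (true_pos + false_pos)

    # Assumed: 1 in 1,000,000 messages is abusive, the scanner catches 90% of
    # them, and wrongly flags 0.1% of innocent messages.
    p = precision(prevalence=1e-6, tpr=0.9, fpr=1e-3)
    print(f"Share of flags that are genuine: {p:.4%}")  # ~0.09%

Even under those fairly generous assumptions, well over 99% of flagged messages would be false reports, which is the scale of liability the paragraph above is pointing at.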


Replies

hrimfaxi · yesterday at 1:24 AM

> Rather than constitutional protections, I think the more likely successful path is stronger IT security, cybersecurity regulation, and data protection, so that governments and companies carry a larger risk when accessing private data. A scanner that carries a high rate of false positives should be a liability nightmare, not an opportunity for firms to sell a false promise to politicians.

Technological means are forever vulnerable to social means. Governments can compel what technology prohibits. Technology won't stop politicians from passing legislation to ban privacy.