I see a link to a forum where an anonymous participant says
“Since a recent version of Signal, data of all Signal users is uploaded to Signal’s servers. This includes your profile name and photo, and a list of all your Signal contacts.”
They then link to a Signal blog (2019) explaining technical measures they were testing to provide verifiably tamperproof remote storage.
https://signal.org/blog/secure-value-recovery/
I’m not equipped to assess the cryptographic integrity of their claims, but 1) it sounds like you’re saying that they deployed this technology at scale, and 2) do you have a basis to suggest it’s “not-very-secure or likely backdoored,” in response to their apparently thoughtful and transparent engineering to ensure otherwise?
> 2) do you have a basis to suggest it’s “not-very-secure or likely backdoored,” in response to their apparently thoughtful and transparent engineering to ensure otherwise?
The forum post explains this:
> This data is encrypted by a PIN only the user can know, however users are allowed to create their own very short numeric PIN (4 digits). By itself this does not protect data from being decrypted by brute force. The fact that a slow decryption algorithm must be used is not enough to mitigate this concern; the algorithm is not slow enough to make brute forcing really difficult. The promise is that Signal keeps the data secured on their servers within a secure enclave. This allows anyone to verify that no data is taken out of the server, also not by the Signal developers themselves, not even if they get a subpoena. At least that is the idea.
> It is also not clear if a subpoena can force Signal to quietly hand over information which was meant to stay within this secure enclave.
That should be very concerning for activists/journalists who use Signal to maintain privacy from their government. A subpoena plus a gag order means the data ends up in the government's hands, presuming Signal wants to keep offering its service to the population of the country in question.
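To make the brute-force concern concrete, here is a rough back-of-the-envelope sketch in Python. The figures are illustrative assumptions (one second per stretched guess, a single attacker machine), not Signal's actual KDF parameters; the point is only that a 4-digit PIN space is tiny.

    # Rough estimate of brute-forcing a 4-digit PIN behind a slow KDF.
    # All figures are illustrative assumptions, not Signal's real parameters.
    pin_space = 10 ** 4        # every possible 4-digit PIN
    seconds_per_guess = 1.0    # assumed cost of one stretched KDF evaluation
    workers = 1                # one attacker machine, no parallelism

    worst_case_hours = pin_space * seconds_per_guess / workers / 3600
    print(f"worst case: {worst_case_hours:.1f} hours")  # ~2.8 hours

Even at a full second per guess, exhausting the entire space takes a few hours on one machine, which is why essentially all of the protection rests on the enclave refusing to allow unlimited guesses.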
The communication Signal put out was extremely confusing and unclear, which caused a lot of problems. They avoided answering questions about the data being collected and instead focused everything on SVR (see https://old.reddit.com/r/signal/comments/htmzrr/psa_disablin...)
The problems with the security of Signal's new data collection scheme were discussed at the time:
https://web.archive.org/web/20210126201848mp_/https://palant...
https://www.vice.com/en/article/pkyzek/signal-new-pin-featur...
You'll have to decide for yourself how secure PINs and enclaves are, but even if you thought they could provide near-perfect security, I would argue that outright lying to highly vulnerable users by stating "Signal is designed to never collect or store any sensitive information." on line one of their privacy policy page is inexcusable and not something you should tolerate in an application that depends on trust.