Maybe via metadata? The size of the information, etc. Do you mean that they should have a caveat about that?
Or if you want to be literal, you have to say that they're storing sensitive information even if it's encrypted. But by connotation that phrase implies that someone other than the user could conceivably have access to it. So for all practical purposes, from the user's perspective, they might as well not be storing it. Do you mean that they should rephrase it so it's literally correct?
Or do you mean that it's actually bad for them to be collecting safely encrypted sensitive data? Because if so, you literally cannot accept any encrypted messenger, since third parties will always have access to the ciphertext.
Yes, I think they should rephrase it so that it's literally correct. Personally, I have very high trust in the safety of Signal's encryption and security practices. But privacy policies aren't for the Signals of the world, they're for the ad networks and sketchy providers. For example, many ad networks collect "Safely Encrypted" email addresses, but are still able to use that information to connect your Google search result ad clicks with your buying decisions on Walmart.com. Whether something is "safely" encrypted is a complicated, contextual judgment based on your threat model, the design of the secure system in question, key custody, and lots of other factors that should each be disclosed and explained, so that third parties can assess a service's data security practices. Signal is a great example of a service that does an excellent job explaining and disclosing this information, but the fact that their privacy policy contradicts their public docs lessens the value of privacy policies.
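To make the ad-network example concrete: one common form of "safe encryption" here is deterministic hashing of the email address. This is a minimal sketch of why that still enables cross-site matching; the parties, function name, and data are all hypothetical, and real ad networks may use other schemes (salted hashes, HMACs with shared keys, etc.):

```python
import hashlib

def hashed_id(email: str) -> str:
    # Deterministic hash: the same (normalized) email always yields the
    # same token, so two parties can join records on that token without
    # ever exchanging the plaintext address.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical datasets held by two different parties.
ad_clicks = {hashed_id("alice@example.com"): "clicked ad for blender"}
purchases = {hashed_id("Alice@Example.com"): "bought blender"}

# Neither party sees the other's plaintext emails, yet the records
# still link up, because the identifier is deterministic.
for token, click in ad_clicks.items():
    if token in purchases:
        print(f"{click} -> {purchases[token]}")
```

The point is that "encrypted" (or hashed) doesn't mean "unusable for tracking": as long as the transformation is deterministic and both parties apply it the same way, it works as a stable cross-site identifier.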