Signal is not built for anonymity. It's built for message privacy. It's a lot like PGP in that the government knows who emailed whom, but they cannot read the email. That's the whole point. If you are trying to hide your phone number, Signal is not going to help you, and it's not meant to.
PGP doesn't hide metadata; anonymous remailers hide metadata. Add a sufficient volume of dummy messages and all of a sudden nobody can do traffic analysis, either. Think ATM (Asynchronous Transfer Mode): there's a constant stream of "cells", but only some of them actually carry anything.
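Roughly, in code (a toy sketch, nothing like a real implementation; `transport_send`, the cell size, and the rate are all made up for illustration):

```python
import os
import queue
import time

CELL_SIZE = 512        # every cell is the same size, like an ATM cell
SEND_INTERVAL = 0.1    # cells go out at a constant rate, regardless of real traffic

outbox = queue.Queue()  # real messages waiting to be sent

def next_cell() -> bytes:
    """Return the next fixed-size cell: a queued real message if there is one,
    otherwise a dummy filled with random bytes. In a real system every cell
    would also be encrypted, so an observer couldn't tell real from dummy by
    content either; all they see is constant size and constant timing."""
    try:
        payload = outbox.get_nowait()[:CELL_SIZE - 1]
        return b"\x01" + payload.ljust(CELL_SIZE - 1, b"\x00")   # 0x01 = real
    except queue.Empty:
        return b"\x00" + os.urandom(CELL_SIZE - 1)               # 0x00 = dummy

def run(transport_send, cells: int = 50) -> None:
    """Pump cells at a fixed rate; transport_send stands in for the network."""
    for _ in range(cells):
        transport_send(next_cell())
        time.sleep(SEND_INTERVAL)
```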
That, or blast your message to a huge number of people, only one or a few of whom can actually read it, because it's encrypted and then steganographically hidden in spam. Again, use dummy messages and there's no way to learn anything by divining the ebb and flow of spam volumes.
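To make the broadcast idea concrete, here's a toy sketch using Fernet from the third-party `cryptography` package as a stand-in for real message encryption (the names and key setup are hypothetical):

```python
from cryptography.fernet import Fernet, InvalidToken

# Hypothetical setup: the sender shares a distinct key with every potential recipient.
recipient_keys = {name: Fernet.generate_key() for name in ("bob", "carol", "dave")}

def broadcast(key_of_intended_recipient: bytes, plaintext: bytes) -> bytes:
    """Encrypt for exactly one recipient; the identical blob goes to everyone."""
    return Fernet(key_of_intended_recipient).encrypt(plaintext)

blob = broadcast(recipient_keys["carol"], b"the real message")

# Everyone receives the same blob; only the intended recipient can open it.
for name, key in recipient_keys.items():
    try:
        print(name, "reads:", Fernet(key).decrypt(blob))
    except InvalidToken:
        print(name, "sees only noise")
```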
I've never understood the point of privacy without anonymity. Or of plausible deniability. Both depend on rather idealistic assumptions about adversaries.
The practical upshot of Signal's deniable authentication is that a Signal message isn't proof of anything. It carries zero weight, because anybody can fabricate Signal messages that appear to be from somebody else, to them, about anything.
If Alice tells Bob a secret via Signal, this means Alice cannot be worse off than if she'd used any other means of telling Bob. Can Bob reveal the secret? Yes. Can he claim Alice told him? Yes. Can he prove it? No.
This is a sharp contrast to something like PGP where Bob can prove Alice sent the message.
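The difference is easy to see in a toy sketch: an HMAC under a shared key stands in for Signal's symmetric authentication, and an Ed25519 signature (via the third-party `cryptography` package) stands in for a PGP signature. This isn't either protocol, just the shape of the distinction:

```python
import hashlib
import hmac
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

msg = b"meet at noon"

# Signal-style (deniable): authentication via a key both sides share.
shared_key = os.urandom(32)   # in the real protocol this falls out of the key agreement
tag = hmac.new(shared_key, msg, hashlib.sha256).digest()
# Bob can check the tag, so *he* knows Alice wrote it (he knows he didn't).
# But Bob could compute the exact same tag himself, so showing (msg, tag)
# to a third party proves nothing about who wrote it:
bobs_forgery = hmac.new(shared_key, b"meet at midnight", hashlib.sha256).digest()

# PGP-style (non-repudiable): authentication via a signature.
alice_priv = Ed25519PrivateKey.generate()
signature = alice_priv.sign(msg)
alice_priv.public_key().verify(signature, msg)   # raises if the signature is bad
# Only Alice's private key can produce this signature, so (msg, signature)
# is evidence to anyone holding her public key. Bob cannot forge it.
```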
That's nice. But choosing to believe nonsense won't make it true. The United States of America chose to believe that torturing people is an effective means of securing reliable intelligence, because that's how it works in Hollywood movies, so how could reality be different? But of course the "intelligence" obtained this way was not in fact reliable: a person being tortured doesn't magically know the truth, and you can't tell whether they're telling it, so they'll say whatever they think will make you stop hurting them. That's utterly useless.
The only way to know whether obtained intelligence is reliable is to actually test it. With systems like PGP you get proof. Did Alice send this message, as Bob alleges? Yes: the message includes proof, so he was telling us the truth.
With Signal all you have is Bob's word as I described.
Signal can't stop the Secret Police from torturing Bob, but it can ensure they have no way to know whether he told them the truth. If the Secret Police were rational, that would be enough reason not to bother torturing Bob. But we can't make them rational; for some people, inflicting pain for no reason is the goal.
Nope. Signal's messages are relayed by Signal's servers over IP like anything else; your phone has no evidence the message ever came from anybody's phone, let alone that it was Alice's phone. If you use Signal Desktop it didn't come from a phone at all. Signal doesn't keep any proof that it got these messages from an "authentic" source. Either they check out as coming from Alice or they don't, and in the latter case they clearly shouldn't be displayed at all.
The way you normally know a message is from Alice on Signal is that the message was sent using keys only you and Alice share†, and you know you didn't write the message. But a third party has no way to verify that last part. That's the entire trick (in layman's terms).
† Signal and similar systems provide a means to do out-of-band verification that the long term identity key for people you know matches. You probably don't use this with most people, but you can and it's made easy if you want to.
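The shape of that verification is just "both sides derive a short string from the long-term identity keys and compare it over a channel they trust". A toy sketch of the idea (this is not Signal's actual safety-number derivation):

```python
import hashlib

def fingerprint(my_identity_key: bytes, their_identity_key: bytes) -> str:
    """Derive a short comparison string from two long-term identity keys.
    Both parties compute it locally; if the strings match when compared over a
    trusted channel (in person, over the phone), nobody has swapped the keys."""
    a, b = sorted([my_identity_key, their_identity_key])   # same value on both sides
    digest = hashlib.sha256(a + b).hexdigest()
    return " ".join(digest[i:i + 5] for i in range(0, 30, 5))  # easy to read aloud
```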
The vast majority of communications occur between people who are publicly known to have an association and have no need to deny the association. Some common examples:
1. Friends
2. Family members
3. Members of a business
If your life or freedom is on the line because of an association with someone, then most systems out there are somewhat dangerous due to the weakness of the endpoints. You would want something like an airgapped computer with online or offline dead drops, possibly hidden with steganography.
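For the steganography part, the classic toy version is least-significant-bit embedding: hide the message bits in the low bits of something innocuous like raw pixel data. A minimal sketch (real stego tools are considerably more careful than this):

```python
def embed(cover: bytes, secret: bytes) -> bytes:
    """Hide `secret` in the least-significant bits of `cover` (e.g. raw pixels).
    Requires len(cover) >= 8 * len(secret)."""
    bits = [(byte >> i) & 1 for byte in secret for i in range(7, -1, -1)]
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return bytes(out)

def extract(stego: bytes, length: int) -> bytes:
    """Recover `length` hidden bytes from data produced by embed()."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for j in range(8):
            byte = (byte << 1) | (stego[i * 8 + j] & 1)
        out.append(byte)
    return bytes(out)
```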
> You would want something like an airgapped computer with online or offline dead drops, possibly hidden with steganography.
Well, "the best is the enemy of the good". That's the whole point of risk management. As a practical matter, I do the best that I can manage, or at least, be bothered with ongoingly. If I were as paranoid as you're advocating, I'd be cowering in a bunker. Also, for me there's the fact that I have little left to lose.