this post was submitted on 15 Apr 2024
88 points (94.9% liked)
Privacy
you are viewing a single comment's thread
Signal isn't federated. Signal has centralized servers. Signal requires a phone number to sign up. Signal stores your encryption key on their servers, relying on SGX enclaves to 'protect' it.
Signal can go down. Signal knows who you talk to, just by message timing. Signal knows how frequently you talk to someone. Signal can decrypt your traffic by attacking their own SGX enclaves and extracting your encryption key.
These are all possible threats and capabilities. You have to decide what tradeoff makes sense to you. FWIW, I still use Signal.
That would surprise me. What's your source for this?
https://signal.org/blog/secure-value-recovery/
So my takeaways from this link and other critiques have been:
1. Signal doesn't upload your messages anywhere, but things like your contacts (e.g. people whose username/identifier you know, but not their phone number) can get backed up online.
2. You can disable this backup and fully avert this issue. (You'll lose registration lock if you do this.)
3. Short PINs should be considered breakable, and if you're on this subreddit you should probably use a relatively long password like BIP39 or some similar randomly assigned mnemonic.
4. SGX should probably also be considered breakable, although this does appear to be an effort to prevent data from leaking.
One nit to pick: messages have to transit through the Signal network, and they could be recorded in transit, Carnivore-style.
True, but that's more or less out of the scope of this thread. I could go on for way longer about centralized versus federated services...
master_key is never stored or sent to the SGX enclave, only c2, the random entropy bits. The user's password is still required to reconstruct the key.
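Roughly, the derivation described in the SVR blog post looks like the sketch below (Python; the labels are paraphrased from the post and Argon2 is swapped for a plain hash to keep it short, so treat this as an illustration rather than Signal's actual code):

```python
import hashlib
import hmac
import os

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def derive_keys(pin: str):
    # The real client stretches the PIN with Argon2; SHA-256 is only a stand-in here.
    stretched_pin = hashlib.sha256(pin.encode()).digest()

    # c1 is derived purely from the (stretched) PIN and never leaves the device.
    c1 = hmac_sha256(stretched_pin, b"Master Key Encryption")

    # c2 is 32 bytes of fresh randomness; this is the piece stored in the SGX enclave.
    c2 = os.urandom(32)

    # Reconstructing master_key requires both halves: the PIN-derived c1 and the stored c2.
    master_key = hmac_sha256(c1, c2)
    return c1, c2, master_key
```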
Brute-forcing 4-6 digit PINs is trivial.
And even if the user sets an actual password, it's still very trivial
https://blog.cryptographyengineering.com/2020/07/10/a-few-thoughts-about-signals-secure-value-recovery/
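To put rough numbers on "trivial", here's the size of the guess space (Python; this only counts possibilities and ignores the per-guess Argon2 cost and SVR's attempt limiting):

```python
# Guess space for all-numeric PINs of length 4-6, versus a 6-word
# BIP39-style mnemonic (2048-word list).
for digits in (4, 5, 6):
    print(f"{digits}-digit PIN: {10**digits:,} possible values")

words = 6
print(f"{words}-word mnemonic: {2048**words:,} possible values (~{words * 11} bits)")
```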
"Very trivial" if they set a proper password? Yet the source you provide says it's robustly secure
I can't find the phrase robustly secure in the last link:
https://blog.cryptographyengineering.com/2020/07/10/a-few-thoughts-about-signals-secure-value-recovery/
Signal asks users to set a PIN/password which needs to be periodically re-entered. This discourages people from using high-entropy passphrases like a BIP39 mnemonic.
The password is literally a PIN
If you set a short PIN, perhaps. Most people set a password
A PIN is the suggested option, so I really doubt "most" people choose a password
Most people who care* I guess would be more apt
For the people who really care, they can disable the PIN. I believe the client will then generate a random BIP39-style password and use that for the data encrypted in SVR. But all the data is still uploaded to the cloud, so if there's a problem with the SVR encoding, the password generation, etc., the data can still be exploited.
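If that's what the client does, the effect would be roughly this (Python; a hypothetical sketch, just to show why a randomly generated secret isn't guessable the way a 4-6 digit PIN is):

```python
import secrets

# Hypothetical: instead of a user-chosen 4-6 digit PIN, the client picks
# 256 bits of randomness as the secret feeding the SVR key derivation.
# The encrypted data still goes to the cloud, but guessing the secret is
# infeasible -- unless the generation or encoding itself is flawed.
random_secret = secrets.token_bytes(32)
print(random_secret.hex())
```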
Not only do you have to care, everyone you talk to has to do the same thing, because if your counterparty has their key in the cloud, the conversation is at risk.
Regardless, the master key is never uploaded
All the bits needed to reconstruct the master key, minus the PIN, are uploaded. So it's equivalent to uploading the master key itself to the cloud.
Very few people use BIP39-level passwords, so the vast majority of people have their key constructively uploaded to the cloud in full.
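To make "constructively uploaded" concrete: anyone who gets c2 out of the enclave only has to walk the PIN space and test each candidate key against the uploaded data. A toy sketch (Python, reusing the simplified derivation from earlier; `decrypts_backup` is a stand-in for testing a key against the encrypted backup, not a real Signal API):

```python
import hashlib
import hmac

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def candidate_master_key(pin: str, c2: bytes) -> bytes:
    # Same simplified derivation as in the earlier sketch.
    stretched_pin = hashlib.sha256(pin.encode()).digest()
    c1 = hmac_sha256(stretched_pin, b"Master Key Encryption")
    return hmac_sha256(c1, c2)

def brute_force(c2: bytes, decrypts_backup):
    # Walk every 4-digit PIN; decrypts_backup stands in for trying the
    # derived key against the uploaded, encrypted data.
    for guess in range(10_000):
        pin = f"{guess:04d}"
        if decrypts_backup(candidate_master_key(pin, c2)):
            return pin
    return None
```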
Many assertions without any proof. Could you at least point out the sources for such statements?
https://github.com/dessalines/essays/blob/main/why_not_signal.md
Also, most of the points in the message you replied to are general and don't need citation. Like, do you want a source for Signal being centralized, or for Signal having the ability to track you?
Everything in that post makes perfect sense; the proof is in knowing how these systems work, Signal's source code, and details from Signal themselves. I can go into more detail on each point when I'm at a computer; my phone kills processes in a few seconds when I try to multitask which makes it nearly impossible to write long posts on mobile if I have to go back and forth to copy and paste. Is there any claim in particular you want details on as to why it's reasonable, or shall I just do the lot? Edit: Ah, OP got it, nevermind!
Also, I should point out that I use Signal pretty much exclusively for messaging. This isn't hate, I'm just aware of its weaknesses.
Excuse me, what? Signal can extract your encryption key how, exactly?
They have your key in an SGX enclave. You only need to look at the rich history of side-channel attacks, known critical SGX vulnerabilities, or just the fact that Intel can sign arbitrary code to run in the enclave, which means they can be compelled to do so with the cooperation of the government.
https://dl.acm.org/doi/fullHtml/10.1145/3456631
https://nvd.nist.gov/vuln/search/results?form_type=Basic&results_type=overview&query=SGX&search_type=all&isCpeNameSearch=false
I'm not saying they do, but they have the capability, which needs to be accounted for in your threat model.
At the end of the day, people are entrusting their encryption keys to the Signal Foundation to be stored in the cloud. That needs to be part of the threat model.
I read some of your other comments too. This is insane. I've always hated Signal, but this is another reason on top. No wonder the CIA funded them for 10 years.
Read the post by Signal. Note the use of the word "plaintext".
Whenever someone qualifies a statement like this without clarifying, it's clear they're trying to obfuscate something.
I don't need to dig into the technical details to know it's not as secure as they like to present themselves.
Thanks. I didn't realize they were so disingenuous. This also explains why they stopped supporting SMS - it didn't transit their servers (they'd have to add code to capture SMS, which people would notice).
They now seem like a honeypot.
They are very much not. Anyone who tells you this is a state influencer or someone who believed a state influencer.
Saying something has the capabilities of a honeypot is the correct thing to do when we're assessing our threat model.
Is it a honeypot? I don't know. It's unknowable. We have to acknowledge the actual capabilities of the software as written, the data flows, and the organizational realities.
My concern is that people will stay away from Signal in favor of unencrypted privacy nightmares. It happened with DDG a while back, where I knew people who used Google because DDG had privacy issues. It sounds dumb, but it is a true story.
Sure. I still encourage people to use Signal. Most people don't have a threat model that makes the honeypot scenario a viable threat. In this thread we are talking about its downsides, which is healthy to do from time to time. Acknowledging capabilities is a good exercise.
Signal is still secure. If it weren't, it wouldn't be used in military applications.
Secure within the context of a certain threat model.
The French government does not endorse Signal for government communication, as an example.
And I would strongly suspect the Russian government would not use Signal either.
I cite both of these as examples of threat models that can't ignore some of Signal's potential capabilities.
In the US government, organizations are trying to protect themselves from each other and from themselves. (It's messy.)
Not to say that Signal is perfect (it's not), but if the DoD recommends it and has guidance on how to harden it, then it can't be too bad.