Signal seems to be a lone island in its dedication to user privacy. Is this naive, or an accurate evaluation? Have they done anything recently to suggest that this dedication may falter? And what suggests they won't resort to rent-seeking or monetization in the future?
Very recently they added their first closed-source component to their code, supposedly a spam filter. This largely slipped past HN. I can't help but think that the timing, with all these requests, is rather coincidental.
Their server code was closed source for over a year with no explanation, turns out it was to add a shitcoin called "MobileCoin" that Signal's founder has a financial interest in.
Ever since then, I don't think you should consider Signal's server "open source" at all. They've shown they'll conceal, diverge from the published code, and cover up for their own enrichment. What will they do when they're threatened? Cave.
>If there's a single piece of closed source code running on the servers, it ceases to be open source.
Yeah, no, that's not the definition of open source. If my open-source mail server has a closed-source spam scanner, my mail server is still open source; the spam scanner just isn't.
My understanding is that this component was added to their server code. If their clients are still verifiably encrypting the content and limiting the metadata for your messages, it should not be much of an issue.
I don't really understand the calls for their server code to be constantly updated or even open sourced; you can't ever verify that's what they're running anyway.
Maybe there's some sort of cryptographic attestation out there that could fulfil such purposes, but I'm quite sure it's not that practical.
That's true and it's one reason I'm not too comfortable with Signal's access to metadata (even though they are the best nonfederated communicator in that regard).
Given the metadata issue, there are a few reasons why I would still prefer them to publish the source code they claim is running in the service:
a) if it's actually running there, people can find simple bugs in it that could allow that metadata to be stored or revealed by accident,
b) if it's not actually running there, but something very close is (i.e. that code with a small number of patches), then the advantage above still applies, and if those patches come to light, they can easily be evaluated for intent and effect,
c) if they're running something completely different (which would be very weird), it'd be noticeable and it would be an obvious lie once exposed.
Every time you send a message with Signal, you inform the server which user the message is for. Thus, the server can remember (sending IP, target phone number) pairs and, for similar reasons, (phone number, IP of the user when retrieving messages) pairs. We rely on the server to discard this information; if not discarded, it reveals the social graph and the phone number<->IPs used mapping.
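To make the concern above concrete, here is a purely hypothetical sketch (not Signal's actual code; all names and values are illustrative) of the two pair-lists a routing server *could* retain if it chose not to discard them:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MetadataLog:
    """What a message-routing server could retain if it didn't discard it."""
    sends: list = field(default_factory=list)    # (sender IP, destination number, time)
    fetches: list = field(default_factory=list)  # (phone number, client IP, time)

    def on_send(self, sender_ip, dest_number):
        self.sends.append((sender_ip, dest_number, datetime.now(timezone.utc)))

    def on_fetch(self, number, client_ip):
        self.fetches.append((number, client_ip, datetime.now(timezone.utc)))

log = MetadataLog()
log.on_send("203.0.113.7", "+352600000001")    # someone's IP sends to this number...
log.on_fetch("+352600000001", "198.51.100.4")  # ...and this IP later fetches for it.
# Joining the two lists over time yields exactly the social graph and the
# phone-number <-> IP mapping described above.
```

The privacy guarantee therefore rests entirely on the server throwing this data away, which is not something a client can verify.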
And since Snowden we know that meta-information is as important as information. So Signal being E2E encrypted goes only halfway towards ensuring the privacy of our communications.
Matrix looks the most promising to me. I believe the people behind it are principled (just like I think the people behind Signal are), but the Matrix protocol doesn't require heroes. There's no single point of technical or legal failure.
What I mean is that Signal as a messaging service depends completely on Signal as an organisation and on the few individuals running that organisation. Everything hinges on them.
It's pretty clear that governments around the world (including many western countries) have decided that online services must take full responsibility for everything their users say and do on the platform. Regulation is coming hard and fast. It's not subtle and some of it will inevitably be inconsistent with E2EE.
Anyone running a popular service of this sort is going to be under huge pressure - political pressure, expensive legal pressure and presumably the sort of pressure exerted by police and intelligence services that the rest of us are fortunate not to know very much about.
Signal's structure, the tight link between the org, the people and the centralised operations focuses all the pressure on this small group of people. They are going to be an absolute lightning rod as soon as WhatsApp has given up.
I don't want to be in their shoes. I fear they will have to make very difficult decisions pretty soon. I find it puzzling that they're making themselves the target of further powerful opponents by adding cryptocurrencies to the mix.
I think Matrix has the better structure for what's coming. It distributes the pressure. It's not either on or off. It's not just some specific app that can be banned.
Politicians are under pressure to "do something" about services like Facebook, Instagram, Twitter, TikTok, Youtube, Reddit ... services which publish what some of their users do and say, and enjoy broad immunity from lawsuits despite choosing what is communicated and by whom. None of which applies to Signal.
Your theory is that, seeing this, US politicians will decide priority #1 is to re-write the US constitution in order to go after a completely different service? Sure, they can't find a bare majority of votes for cheap, easy fixes that are broadly popular, but repeatedly getting super-majorities for the lengthy and expensive process of tearing down the fundamental principles of the country to go after an outfit few voters have even heard of will be easy? I don't buy it.
One of the most awful things about the "better structure" of Matrix is that if you make any use of this "better structure" you eliminate Don't Stand Out which is an essential characteristic of Signal. Yet, to enable that "better structure" to even exist Matrix already makes you give up some of the privacy you have on Signal anyway. So you're losing some privacy to have the option of giving up even more privacy.
Several of the sub-threads in this discussion mention metadata. Don't Stand Out is crucial because of metadata. The Secret Police can raid all sixteen users of "Jim's Black Pill Matrix Server" and if two of them are innocent bystanders too bad, all fourteen members of the Conspiracy To Do A Naughty Thing are known to use Jim's server, and so they Stand Out and were caught. In contrast, using Signal the Conspiracy To Do A Naughty Thing are disguised by the presence of Sarah's Hen Night Planning DO NOT TELL SARAH, Smith Family Group, and LOL Funny Cats!!!! among many more. Signal doesn't know, or care about any of these groups.
I don't think it's realistic to assume that there will be a very large number of tiny Matrix servers that stand out. It's far more likely that there will be a couple of big ones and some midsize ones that most people use. Each of them can obviously become a target, but they will be in various different jurisdictions and it's easy to create new ones linked to all the rest of them.
If you think that E2EE is not under heavy political fire then please read what our governments are demanding from tech companies. This is not limited to companies that are "choosing what is communicated and by who". On the contrary, governments are demanding that companies monitor what is communicated and by who:
International statement: End-to-end encryption and public safety
That "International Statement" basically says "We wish facts weren't true". But, like, too bad they are. Notice that it doesn't say they acknowledge the reality and so, fuck it, they'll just violate Human Rights and "materially weaken or limit security systems", instead they say well, surely we can avoid that and yet still get the same outcome. Nope. That's not how facts work.
Writing long letters deploring the rules might work if you're up against the resident's association, local town council, or even a court of law, but Mother Nature couldn't give a shit what you think about her rules.
I can't agree with this one. I enjoy Apple products and consider them to be of high quality, but they should not be the star example for user privacy. iCloud backups are turned on by default, which means they have access to the private key, and they considered adding a client-side detection system for "child pornography". From the standpoint of E2EE, enough is enough with the "gray area" in privacy systems. Either you don't fiddle with or introspect the content that goes through your communication medium, or you do.
>they should not be the star example for user privacy
They are not perfect, but I can't see anyone who does better overall.
>iCloud backups are turned on by default
I believe it is a good default, as 90%+ of their users care (or would care) more about accidental data loss than privacy. When that changes, Apple will change the default.
>client-side detection system for "child pornography"
Yes, that's a very controversial move. But they made it client-side because it's more private than server-side.
> I believe it is good default, as 90%+ of their users (would) care more about accidental data loss than privacy.
I think you're right about the numbers, but I don't understand why that justifies the default. Why even have a default? Why not just ask users what they want? It's not a hard question, and it even has the benefit of helping inform users about the feature.
> Yes, that's a very controversial move. But they made it client-side because it's more private than server-side.
So? Either it respects privacy or it doesn't in this kind of discussion. Lauding them for violating your privacy but not as much as they could otherwise is like lauding a mugger for only taking half of the cash in your wallet. Yeah it could have been worse, but that doesn't change the nature of their actions.
This is a usability stance. Removing the ability to use phone numbers would have made it harder for users to adopt Signal, and I find it to be a reasonable trade-off. At least right now, the only thing prying eyes can do is know that you use Signal due to the login SMS, or hijack your SMS. We all know about some terrible security practices that phone providers have been enabling with phone number swapping by unauthorized parties, but if we can pick non-awful phone providers that at least have password-protected SIM locks on the account, that attack shouldn't be as viable.
I've never seen anyone ask for Signal to remove the ability to use phone numbers. People ask Signal to add the ability to use something other than a phone number, as well.
As I've heard it, it's motivated by security rather than usability: If two people use Signal to communicate with each other, then they should use the Signal protocol rather than SMS, without having to go through any extra steps.
There are different ways to achieve that, all of which have drawbacks. They chose one.
Signal has demonstrated to me that they don't give a shit about privacy, but they care a lot about security. I'll take it, but I wouldn't confuse the two.
I also don't know how they know this relates to a request from Luxembourg, but (not being a lawyer) I guess you can at least tell it's from another government since it refers to 18 USC § 3512 which is Foreign requests for assistance in criminal investigations and prosecutions.
As an MP in Luxembourg I asked the gov’ for confirmation and they confirmed that they were at the origin of the request. Currently I’m waiting on more details as to how many requests our tiny nation made to different messenger services abroad.
I have often thought that Luxembourg is a wonderful country -- I sang for the Grand Duke on a choir tour as a student and was bowled over by how much history, excellent food and fantastic wine there was in a small, civilised place (where you can get entry to every tourist attraction in the country for €50!)
The fact that a Luxembourgish MP is on Hacker News commenting intelligently on a topic in a way that I would frankly do myself just makes me want to emigrate.
Signal has access to more data than they would like you to believe. Each account at Signal is identified by a phone number or a UUID. This account also contains a [list of devices](https://github.com/signalapp/Signal-Server/blob/c21eb6aa5098...). Each of the devices has some uninteresting metadata assigned to it, but more importantly it links to a specific Android or iOS device through the [push token](https://github.com/signalapp/Signal-Server/blob/c21eb6aa5098...). This gives the authorities a link to an account at Google or Apple which then of course contains much more data.
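A rough sketch of the account/device relationship described above (the class and field names here are illustrative, not Signal-Server's actual Java classes; the token and UUID values are made up):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Device:
    device_id: int
    gcm_token: Optional[str] = None  # push token linking the device to a Google account
    apn_token: Optional[str] = None  # push token linking the device to an Apple account

@dataclass
class Account:
    uuid: str
    phone_number: str
    devices: list = field(default_factory=list)

acct = Account(
    uuid="00000000-0000-0000-0000-000000000000",  # placeholder UUID
    phone_number="+352600000001",
    devices=[Device(device_id=1, apn_token="example-apns-token")],
)
# A records request for this account yields a push token, which Apple or
# Google can then map to a far richer user profile on their side.
```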
Yeah for those of us who remember it, what Signal is doing now is what TPB were doing many years ago.
Of course TPB were also enabling the sharing of masses of copyrighted material. But what Signal is doing, in the eyes of police and intelligence community, is almost the same.
I honestly see a bleak future where the development of secure applications is forced underground. And the next step would be total control of the internet so no anonymous networks can exist. And after that we'll just use radio, so really there is no way to stop people wanting to be private.
Because the Stasi searched for pirate radio stations with vans.
Eh, if they do they’ll eventually discover some naughty persons have made a website that looks like a foreign bank right up until you log in with the appropriate username and password, at which point it turns out to be a naughty chat forum.
And no way to break the https encryption in advance without also making all online banking vulnerable to hackers.
Perhaps they’ll try to get around that by van Eck phreaking everyone’s phone/tablet/vr/computer screens. That will explode in the faces of both law enforcement and politicians when the fantasies in table 2 of this paper have official faces attached to them [link is research paper titled “What Exactly Is an Unusual Sexual Fantasy?”]: https://oraprdnt.uqtr.uquebec.ca/pls/public/docs/FWG/GSC/Pub...
(Before anyone thinks of the obvious fantasy on that list “good, I don’t want such people in power!”, the egg will still be very much on the face).
And last I checked, unlawful drug use is sufficiently common[0] that actually trying to enforce it fairly would bankrupt the nation[0], so same applies to other types of naughty.
Let's see. In many countries Signal is now the primary communication tool for politicians. They love to communicate securely, without the risk of getting caught by law enforcement ;)
That's easy to solve though. They could easily write exceptions to any anti-encryption bill for public orgs who have access to sensitive information (police, politicians, etc). In fact, I would be shocked if they didn't make exceptions for certain parts of the government.
Sure, but they still need to get an app somewhere. And all the people they want to communicate with need that app too. And those are not just other politicians; with other politicians they can communicate face to face in meetings anyway. But politicians can't meet up with shady people, and if they meet up with CEOs it's always on some record. With Signal they can just text with them in secret.
Signal is there. And you can also install it on private devices, which is what politicians like even more: even their own intelligence agency can't read the messages.
A lot of politicians are corrupt, and they don't like laws that make it harder for them.
Don’t tell the Luxembourgish government, they still use Whatsapp (seriously) for communication (some also use Signal et al.) but not a majority. And I personally use it amongst other things to communicate with whistleblowers ;)
This was the beginning. I know from corruption cases in Austria, that a lot of politicians were using Signal. You can find it in the documents of the prosecution.
Up to now, the US has taken a softer approach (unlike Russia or China) of trying to quietly convince companies to set up backdoors. Outright banning E2EE is a losing proposition, since it's fundamentally not possible. Far better to have a root password and corrupt the whole system than to publicly take it down.
In my book this is worse. It gives users a false sense of security and allows the government to hide its intentions. They save face by supporting free speech only in principle. At least with China and my home country India the intentions are clear: you have very few rights.
In theory, yes, but in practice your right to privacy stops the moment you're on the radar of LEAs: not because of the laws, but because of companies' and vendors' over-enthusiastic and proactive efforts to collaborate.
One of my first jobs, working as a dev for what was back then the largest email provider in Germany, was building a system that would automatically extract all email correspondence from user accounts that were demanded (via fax) and fax it back to them. We received only a few per week, and it was a manual process on the LEA side. There was no law that required this to be automated and we could have done it manually, but we wanted it to scale so that we could serve the same answers not per week but per second if we had to (at least that's how I implemented it, because I was young and keen and not thinking). I have no idea how many requests they answer today, but if I had to guess it would be a lot more, because these processes are now automated at the LEAs too. It's no longer a deputy who prints it out and manually sends a fax.
Also, LEOs are just people. There are plenty of cases where they are totally happy to use the system to their advantage, and well beyond making speeding tickets disappear.
Indeed, absolutely. I’m only saying I think Signal is unlikely to be made unlawful for failing to make things easy for LEAs, not that people don’t do as you say.
Unlikely, because the 0day/offsec market ensures continued access for LEAs. OffSec vendors may have contributed more than we think to silencing the discussion about legal backdoors. Personally I prefer this over mandates for key escrow and legal backdoors.
I'm not sure how Signal functions internally. Does it act as a server that just facilitates E2EE tunnels between clients? In that case wouldn't they still have metadata of which clients were connecting through the app at a certain time, if not which other clients they were paired with?
When a message reaches Signal's servers, all Signal knows is the destination and a timestamp. It even hides the sender via a process called "sealed sender", if you want to look it up.
So all they have is your phone number, date when you've registered, and last time you've used it. If someone took over their servers, all they'd see are encrypted blobs and their destination. They have no reason to keep those blobs after they've been delivered.
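A toy model of the "sealed sender" idea mentioned above (illustrative only: the real protocol uses sender certificates and proper public-key cryptography, not this XOR stand-in, and all names here are invented): the sender's identity travels inside the encrypted blob, so the server-visible envelope carries only the destination.

```python
import json
from base64 import b64encode

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Stand-in for real public-key encryption (XOR; NOT secure).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def seal(sender_id: str, body: str, recipient_key: bytes, dest_mailbox: str) -> dict:
    # The sender's identity goes *inside* the encrypted payload...
    inner = json.dumps({"from": sender_id, "body": body}).encode()
    # ...so the envelope the server sees contains only the destination and a blob.
    return {
        "to": dest_mailbox,
        "blob": b64encode(toy_encrypt(inner, recipient_key)).decode(),
    }

envelope = seal("alice-device-1", "hi", b"shared-secret", "bob-mailbox")
# The server can route on envelope["to"] but has no field identifying the sender.
```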
Note that Signal's architecture makes this difficult. You can observe metadata going to the Signal server and going from either the Signal server or from Google's blobs-to-apps service. Assuming that my phone uses the Google service (it can also use websockets directly to Signal), you can see three things:
1. You can see that my wife sends someone a message via Signal.
2. You can see a stream of data from Signal to Google.
3. You can see that an app on my phone receives a small amount of data from Google.
You can still pick out something, but the architecture tilts the statistics and widens the error bars on your results.
I still don't understand how they can route messages to devices without having some kind of registry, and without the destination leaking outside the encrypted payload...
edit : root -> route :)) talk about a freudian slip :)))
The verb is route. Rooting a phone... you made me sit up there ;)
They do have a registry, but it's not relevant to this order. When you send me a message, your phone sends Signal my phone number and an encrypted blob. Signal routes that to my phone, and the server doesn't keep a long-term record of it. Thus, Signal-the-server does hold, at least for the next several seconds, a record containing the destination of a message. It does not have my profile name, your profile name, or any way to connect your phone number to that message.
Signal running on my phone has enough state concerning you to decrypt the encrypted blob and see that it's from you.
Signal, the organisation, is in control of all the source code and could change the code to gain control of all the information Luxembourg is asking for. But the past is immutable. No one can go back in time and retroactively store more data. Signal didn't log the destination phone number at the time, so it's not available now; Signal didn't transmit the profile names to the server, so they never arrived and could not be stored.
Nah, the treaty of Westphalia holds. Countries are sovereign in their territory and basically powerless outside it. Luxembourg doesn't compel foreign foundations in foreign jurisdictions.
Luxembourg has the legal right to forbid people in its territory from using apps like Signal, though. But I doubt it would even try to do that, because it's really difficult to write a scope that includes things like Signal messages and yet excludes things like messages that instruct and authorise your bank to transfer money.
Signal can just elect not to store that data, because they don't care. Realistically it might be possible to compel them to retain a log which phone numbers communicated with a specific device, but as I understand it after the initial handshake to register your phone number they also don't retain what it actually is - just the hash of it. So it would have to be a demand for forcible cooperation going forwards, since you can't recover data you don't have.
You're never going to make abstract metadata like volume of messages and relationships completely anonymous, but that's essentially an opsec issue which Signal is trying to solve by normalizing it anyway: if all your communications are always encrypted all the time, and you constantly use them, then there's no discontinuity if you go from "I don't like advertisers" to "I am in possession of the Panama Papers". It all looks the same.
> You're never going to make abstract metadata like volume of messages and relationships completely anonymous
Even here Signal works very hard. Signal doesn't know group memberships, all the group metadata is encrypted and acted on based on cryptography, so somebody who is apparently authorised to administrate a group kicked somebody out of it, and then they sent a message to the group. Who was that admin? What's the group's name? Who is in it? Who did they kick out? What was the message? All deliberately unknown to Signal.
Also all messages to people who agree to accept your messages in advance (so, most people's friends, and those whistleblower hotline type setups at newspapers) are anonymous to Signal. You've got proof you're authorised to send this message, so they don't need to care who you are and they don't ask your client who it is. They need to know who the message is for to deliver it, but that's all.
They do have the means to push an app update that uploads decrypted messages somewhere, and I wonder to what extent governments can force them to do so. Signal users should not enable automatic updates, that's for sure.
If I were looking at this from the perspective of the Govt., it would most likely make sense to request data from Signal. There's nothing much to lose, really.
The transparency is rather nice, too. Reminds me of Njalla, but without the children's pictures.
>Because everything in Signal is end-to-end encrypted by default, the broad set of personal information that is typically easy to retrieve in other apps simply doesn’t exist on Signal’s servers.
Does this mean that Signal (and other open-source E2EE apps) is a great hiding place for the 'lost and banned', and 'bad' actors?
EDIT: Judging from the responses, it is admittedly a "Yes" then.
In the case of Signal unfortunately for everyone else, the service is centralised and some governments can and may get the ISPs to block Signal's servers instead.
Signal enables people like my mother-in-law to do it.
PGP enables people like... well, not her, and not me either, to be frank. I tried and failed. PGP did that for a definition of "people" that excludes most people.
This one is an honest attempt at decentralizing IM: https://cwtch.im/ - worth supporting with donations, because there is no token/ICO, no blockchain, and no bullshit from the author, Sarah Jamie Lewis.
> because there is no token/ICO, no blockchain, and no bullshit
That is good to hear. Unfortunately, for now there is no iOS app, which is a serious limitation when trying to get others to move off the alternatives.