> To revive this, they would have to find an expert to attest that it is technically feasible to have security with a backdoor that government can access, but at the same time is impossible for malicious entities to access.
> Ergo, this is technically dead, which is the best form of dead.
Except it's not. Cryptographic trapdoor constructions exist that are perfectly secure, provided the government backdoor key is kept safe.
The problem is keeping the government backdoor key safe. But that's not a literally impossible technical problem. It's much more a social problem.
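For what it's worth, the "trapdoor construction" here usually means key escrow in a hybrid scheme: the message is encrypted under a session key, and that session key is wrapped once for the recipient and once for the government's escrow key. A toy sketch with textbook RSA (tiny primes, deliberately insecure, purely illustrative):

```python
# Toy key-escrow sketch: the same session key is wrapped for both
# the recipient and a government escrow key. Textbook RSA with tiny
# primes -- NOT secure, illustration only.

def rsa_keypair(p, q, e):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (e, n), (d, n)        # public key, private key

def wrap(session_key, pub):      # "encrypt" the session key
    e, n = pub
    return pow(session_key, e, n)

def unwrap(wrapped, priv):
    d, n = priv
    return pow(wrapped, d, n)

recipient_pub, recipient_priv = rsa_keypair(61, 53, 17)
escrow_pub, escrow_priv = rsa_keypair(67, 71, 17)  # government's key

session_key = 42                 # stands in for a real symmetric key

# Sender wraps the same session key for both parties:
for_recipient = wrap(session_key, recipient_pub)
for_escrow = wrap(session_key, escrow_pub)

# The recipient reads the message normally...
assert unwrap(for_recipient, recipient_priv) == session_key
# ...and anyone holding the escrow private key can too.
assert unwrap(for_escrow, escrow_priv) == session_key
```

The math is sound; the entire scheme stands or falls on the escrow private key never leaking, which is exactly the social problem.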
Don't get me wrong, I really, really wish what you said were true and we could kill this garbage forever on technical grounds alone. But it isn't, so we must keep fighting against it for the real reason: we simply don't want this.
Well, by definition, if the key is to be used, and used more than once, it cannot be kept safe. The key has to pass through multiple hands on its way from the senior government official responsible for its safekeeping to the peon assigned to unlock a specific phone at a specific point in time, and it could be copied at any one of those points. No amount of technology or cryptography can solve the master-key problem. The social problem is the technical problem; they aren't distinct.
So you just make the three companies keep the keys then. People are out here saying "a secure backdoor to encryption is impossible" and then don't even blink at the keys for root CAs, which are the basis of the world's online security. Or at the AWS-managed S3 encryption keys.
There's a lot of hopium in this thread from people who, I think, want it to be more impossible in practice than it really is.
It was never even suggested that the government would hold encryption keys. The government does not have access to SSL traffic, but companies are responsible for CSAM uploaded over SSL.
If a software signing key is compromised it can be revoked and a few weeks later the risk is only to people who don't keep their OS up to date. Further, exploited compromises are detectable, especially if exploited at scale.
If the backdoor crypto key is compromised, sure, they can revoke it (assuming they manage to design a competent system), but all the sensitive information up to that point is now available to whoever possesses the backdoor key. Unlike the software-signing case, exploitation of the compromise is likely undetectable unless the attacker reveals their knowledge somehow.
The same is true of SSL traffic to a bank, though, isn't it? If a crime group is intercepting encrypted traffic and saving it, and the keys are later stolen, they can decrypt that data.
But opponents of the OSB claim it will make communication with your bank less secure - how?
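One detail worth noting about the interception scenario above: modern TLS uses ephemeral Diffie-Hellman, so the long-term certificate key only authenticates the server and never encrypts traffic; each connection derives a fresh secret that is discarded afterwards, so a later key theft doesn't unlock recorded sessions. A toy finite-field DH exchange (tiny numbers, illustration only):

```python
# Toy ephemeral Diffie-Hellman, showing why each session gets its
# own secret (forward secrecy). Tiny group -- NOT secure.
import secrets

p, g = 23, 5                          # public group parameters (toy-sized)

def new_session():
    # Both sides pick fresh ephemeral secrets for every connection
    a = secrets.randbelow(p - 2) + 1  # client's ephemeral secret
    b = secrets.randbelow(p - 2) + 1  # server's ephemeral secret
    A, B = pow(g, a, p), pow(g, b, p)  # only A and B cross the wire
    return pow(B, a, p), pow(A, b, p)  # each side derives the secret

s1_client, s1_server = new_session()
s2_client, s2_server = new_session()

assert s1_client == s1_server         # each session agrees on a key...
assert s2_client == s2_server
# ...and the ephemeral secrets are thrown away afterwards, so
# stealing a long-term key later reveals nothing about old traffic.
```

So "save the ciphertext now, steal the key later" works against escrowed backdoor keys, but not against forward-secret TLS sessions.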
> Microsoft, Google, Apple etc are keeping the keys that allow you to push updates secret, aren't they?
From yesterday:
> the China-Based threat actor, Storm-0558, used an acquired Microsoft account (MSA) consumer key to forge tokens to access OWA and Outlook.com. Upon identifying that the threat actor had acquired the consumer key, Microsoft performed a comprehensive technical investigation into the acquisition of the Microsoft account consumer signing key, including how it was used to access enterprise email.
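The quoted incident illustrates why a leaked signing key is catastrophic: verification only checks the signature, so whoever holds the key can mint tokens with arbitrary claims, and a forgery is indistinguishable from the real thing. A minimal HMAC-signed token sketch (hypothetical format, not Microsoft's actual token scheme):

```python
# Minimal HMAC-signed token: holder of the key can forge any claims.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(claims: dict, key: bytes) -> str:
    body = b64url(json.dumps(claims, sort_keys=True).encode())
    sig = b64url(hmac.new(key, body.encode(), hashlib.sha256).digest())
    return f"{body}.{sig}"

def verify_token(token: str, key: bytes) -> bool:
    body, sig = token.split(".")
    expected = b64url(hmac.new(key, body.encode(), hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

signing_key = b"supposed-to-stay-secret"

# The service issues tokens as usual...
token = sign_token({"user": "alice", "scope": "mail"}, signing_key)
assert verify_token(token, signing_key)

# ...but an attacker who acquires the key can forge any claims,
# and the forgery verifies exactly like a legitimate token:
forged = sign_token({"user": "admin", "scope": "everything"}, signing_key)
assert verify_token(forged, signing_key)
```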
> There exist such cryptographic trapdoor constructions that are perfectly secure, if the government backdoor key is kept safe.
A big problem with this statement is the term "the government". If you give the private key to the UK, then the US, India, and China will want a copy as well.
The UK might want to spy on foreign nationals only in the country and only for CSAM, but that doesn’t mean other nations won’t use it for more traditional espionage.
The ostensible reasoning is "think of the children" horseshit, but history proves such a powerful capability will be abused for unrestrained spying.
Key escrow for the entire US and world was floated with the Clipper chip (1993-1996). That was strangled in its crib because trusting thousands of people at NSA or GCHQ to just not stalk people is sheer fantasy, just as the Snowden leaks revealed.
iMessage stores the e2ee key in iCloud by default, which effectively makes all of a user's communications decryptable by governments and Apple at any time.
Any centralized service that offers actual privacy without zero-knowledge p2p constructions falls victim to the Lavabit problem. If you want security and plausible anonymity across your own devices, including your metadata, then use a fork of Signal such as Session. (Signal is irreparably broken by being tied to a phone number, which is a universal tracking device. The only people who use Signal are drug dealers and software engineers who don't know any better.)
Why is Signal "irreparably" broken? What makes the phone number issue "irreparable"? As I understand it usernames and phone number privacy are in the pipeline.
I'm a software engineer who does know; I'm aware that Signal is currently tied to phone numbers, and I'd love for it not to be, but I still use it, because it's E2EE and easy for non-technical people to use.
When there's something that's easy to use like Signal that uses decentralized cryptographic identifiers and onion routes all traffic, I'll start trying to get people to use that. I'd be happy to hear any recommendations.
If you have a mobile phone number, the domestic intelligence agency knows exactly where you are at all times, and any LEO (without a warrant) can also find you. In addition, numerous CCC presentations have shown how insecure global carriers (and, separately, US carriers) are, guilty of promiscuous metadata trafficking ($$) and insecure SS7 setups. As a consequence, for low $, you can go to any one of several shady websites and find the last location of almost any phone number (a person-unique ID) globally. There are additional exploitable vulnerabilities, varying with the exact combination of {handset x carrier x country}, that allow impersonating someone, tapping their line, revealing their exact location, or redirecting their phone number through a third-party handset or even a PBX. These are more expensive, and some capabilities are reserved for a few selective intelligence uses.
Session (a Signal fork) doesn't use phone numbers. It's pretty well designed overall and uses an onion-routing approach. It's essentially a superset of Signal, minus the phone numbers. https://getsession.org
PS: Using regular Tor on home broadband or cloud servers is relatively risky and inefficient. Sybil attacks on it are common, and to network operators and security agencies it conveniently flags your uplink and exit-node traffic as automatically suspicious.
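To make the onion-routing claim above concrete: each relay holds one key and can peel exactly one layer, so no single relay sees both the sender and the plaintext. A toy sketch with a hash-derived stream cipher standing in for real per-hop encryption (illustration only, not Session's or Tor's actual protocol):

```python
# Toy onion routing: the sender wraps the message in one encryption
# layer per relay; each relay peels exactly one layer.
import hashlib

def toy_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with a SHA-256-derived keystream; the same operation both
    # encrypts and decrypts. NOT a real cipher -- illustration only.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

relay_keys = [b"entry", b"middle", b"exit"]   # one key per hop

# Sender wraps the message, innermost layer for the exit relay:
onion = b"hello"
for key in reversed(relay_keys):
    onion = toy_cipher(key, onion)

# Each relay in turn peels one layer with its own key:
for key in relay_keys:
    onion = toy_cipher(key, onion)

assert onion == b"hello"   # fully unwrapped at the exit
```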
And don't be naive. The UK absolutely wants it so that it can surveil both its citizens and everyone else (including within other governments) around the world.
Yeah, I find it somewhat amusing that sharing a private key with the government is called technically impossible. I guess you could get philosophical about whether it's still "private" in that case.
Anyway, I'm pleasantly surprised they seem to be backing off.