This is not the first time such a law has been proposed. In 1997, a House of Representatives committee approved a ban on domestic encryption without backdoors for .gov access. Here's an excerpt from the SAFE Act, as it was called back then:
"Whoever, after January 31, 2000, sells in interstate or foreign commerce any encryption product that does not include features or functions permitting duly authorized persons immediate access to plaintext or immediate decryption capabilities shall be imprisoned for not more than 5 years, fined under this title, or both...
After January 31, 2000, it shall be unlawful for any person to manufacture for distribution, distribute, or import encryption products intended for sale or use in the United States, unless that product [...] permits immediate decryption of the encrypted data, including communications, upon the receipt of decryption information by an authorized party in possession of a facially valid order [and] allows the decryption of encrypted data, including communications, without the knowledge or cooperation of the person being investigated..." http://thomas.loc.gov/cgi-bin/cpquery/T?&report=hr108p4&dbna...
Think of how that would have affected Linux (Android uses dm-crypt for FDE), open source, Github, etc.
That 1997 bill is remarkably similar to what the FBI and its law enforcement allies, including the district attorney quoted in the linked article, want today. And remember that bill was not theoretical. It was approved and sent to the House floor for a vote -- and was defeated only because of a hastily-assembled alliance of tech firms and privacy groups.
I disclosed in a 2012 article for CNET, before I left to found http://recent.io/, that the FBI general counsel's office had drafted related legislation mandating backdoors even before the current flap over Android and iOS FDE.
"If you create a service, product, or app that allows a user to communicate, you get the privilege of adding that extra coding," an industry representative who has reviewed the FBI's draft legislation told CNET. http://www.cnet.com/news/fbi-we-need-wiretap-ready-web-sites...
> Think of how that would have affected Linux (Android uses dm-crypt for FDE), open source, Github, etc
A concern of mine is that, unlike the others you mention, Apple's software isn't open -- we have no ability to check that there aren't already backdoors resulting from secret court orders.
Perhaps going forward, going "open" will be the only way to build trust with a client: offering an open or reproducible means to verify any claims made about the suitability of the encryption and the lack of a backdoor, on both client and server.
Apple and Microsoft have provided source code reviews to customers of a large enough size. The federal government, for example, reviewed sections of Windows source code before installing it on things like Navy ships, NSA computers, etc. Of course, this does not do us regular schmoes much good.
That said, this problem is a much better problem to manage than the one declan is talking about. Imagine if you wanted to use open source encryption to secure your information but were legally prohibited from doing so. That's what law enforcement is talking about now (and as declan points out: again).
It's unenforceable as well; ultimately you'd end up needing to police the consumer.
This sort of misguided action in today's technology culture will just result in an exodus of hardware manufacturers and technology service providers, and shrinking demand for, and trust in, US-based technology.
This is already happening to some extent, illustrated by the boom in ASPs and cloud computing occurring in Germany, elsewhere in Europe, and to a lesser extent in Switzerland.
Also consider the amount of malware that results when uninformed Android users download god knows what. If they stick to the Play Store they are safer, and Google tries to stress that fact, but everybody loves free stuff.
> Think of how that would have affected Linux (Android uses dm-crypt for FDE), open source, Github, etc.
Probably the same way as the old USA cryptography export restrictions. Back then, cryptographic components were distributed in separate "non-US" repositories, which were hosted outside the USA.
>Probably the same way as the old USA cryptography export restrictions.
Actually it would be much worse. Even in the 1990s, you could release any crypto code (or ship any crypto product) you wanted as long as you took some fig-leaf steps to limit it to domestic use. The SAFE Act would have outlawed that.
Put another way, the Feds wanted to but were unable to prosecute Phil Zimmermann for releasing PGP. But if the SAFE Act had been enacted at the time, they would have -- because it outlaws the domestic "distribution" of non-escrowed crypto.
If you wanted to work on crypto, you'd have to move overseas. And you'd also have to hope that you wouldn't be prosecuted upon your return. The SAFE Act probably wouldn't be interpreted as an extraterritorial criminal law, but "probably" is a weak hook to hang your future and your freedom on.
The entities themselves would have to be (and remain) outside the US too, and given the government's doctrine in the Ireland email disclosure case Microsoft is fighting, even that would be tricky.
Greenyoda's comment that "speaking a language the police don't understand should be a crime by this logic" is, I think, the neatest encapsulation of the issues. It gets around all the technical issues and worries and strikes at the heart of the problem: the police can and should put away criminals without needing access to the private conversations the criminals have. Fingerprints, CCTV, and loot stashed under the bed have all been good enough for a long time now.
The crimes that are committed solely online are few and mostly fraud by deception. There are clear issues with deceiving people in a language they don't understand.
Most peoples' eyes glaze over when you talk about things like Shannon's Law, cipher strength, entropy, backdoors, authentication vs encryption, trust models, metadata (which IS data, goddammit), and threat models.
This avoids all the complexity without losing the core issue.
"Federal and state governments should consider passing laws that forbid smartphones, tablets and other such devices from being 'sealed off from law enforcement,' Manhattan District Attorney Cyrus Vance said today..."
If they ever pass these laws, the next step would be to outlaw encrypting the hard drive on your laptop or using PGP to encrypt your e-mail. Also, speaking on the phone in a language that the police don't understand could be made a crime.
The language thing is unnecessary: since they record all calls, they can find someone to translate later. But maybe you're right and that is too much trouble for them.
This is the problem I see with outlawing encryption entirely. At what point does something become "encrypted" w/r/t the law?
This may seem like a silly question, but I think it's a real concern:
Is rot13 encryption? (No, of course not!, you say.)
How about a shift cipher with an unknown distance? (Well... it's just as trivially breakable as rot13..., you say.)
How about a book cipher? This one I particularly like, being quite amenable to your idea of "encrypted speech": does reading out a book cipher over the phone constitute "encryption" in some sense? And I also considered it for the lavabit case: if the gov't has a wiretap warrant for your phone, under their reasoning in lavabit, couldn't they compel you to disclose the key to the book cipher in that case? That seems to be crossing the 5th amendment line there, to me; toeing the line of the 1st and 4th as well. And so, I thought at the time, it seems pretty obvious that the "assistance" clause of the wiretap law did /not/ compel revealing the key any more than it would compel revealing the key of the spoken book cipher.
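To make the line-drawing problem concrete, here is a minimal sketch (in Python, purely as an illustration) of the first two schemes mentioned above. rot13 is just a shift cipher with a fixed distance of 13, which is why applying it twice returns the original text:

```python
def shift_cipher(text: str, distance: int) -> str:
    """Shift each letter by `distance` positions, wrapping around the 26-letter alphabet.
    Non-letter characters (spaces, digits, punctuation) pass through unchanged."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + distance) % 26 + base))
        else:
            result.append(ch)
    return ''.join(result)

def rot13(text: str) -> str:
    # rot13 is its own inverse: 13 + 13 = 26, a full trip around the alphabet.
    return shift_cipher(text, 13)

print(rot13("Hello"))         # -> "Uryyb"
print(rot13(rot13("Hello")))  # -> "Hello"
```

A shift cipher with an unknown distance is no stronger in practice: there are only 25 non-trivial shifts, so anyone can brute-force it in a loop. Yet "message transformed with a secret parameter" is exactly how a statute would likely have to describe encryption, which is the ambiguity at issue.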
This type of ambiguity is not uncommon in law. My guess would be that they would look at intent (ie, did you manipulate the message to make it difficult for a third party to understand), as well as defectiveness (ie, rot26 is so stupid that we will only charge you with attempted encryption).
This is really about law enforcement being too lazy to do their own jobs properly. Here's a story:
I run a dating website and was contacted by law enforcement to provide contact information for a user suspected of soliciting sex from a minor. The information was forwarded to police on behalf of the minor's parent. As per our privacy policy, we informed the police that we would need a court-authorized subpoena before handing over details about one of our users. They also instructed me NOT to ban the user or otherwise disrupt his account in any way until they received the evidence they needed from me.
Weeks passed, then months, and finally I had our attorney reach out and contact them again to ask what happened with the case. It turns out the subpoena was blocked on some kind of administrative issue. They didn't bother telling us, which would have let us ban the suspect from the site sooner. In addition, they could easily have gotten the information they were looking for by using the dating site to pose as a minor and get it directly from the suspect (they had the username). Our attorney told us that never once did the investigator log onto the site.
This is one small story, but it goes to show the extent of the laziness that pervades law enforcement today.
Perhaps lazy, perhaps working 20 other cases, or just lacking technical sophistication. If it were me, it would be much simpler to issue a subpoena than to engage in some sort of sting operation where things could go wrong.
“They’ve eliminated accessibility in order to market the product. Now that means we have to figure out how to solve a problem that we didn’t create.”
Ermm ... I'd argue that dragnet mass surveillance is exactly one of the causes. Good security should be the default position; that we've had poor security to date should be considered the real aberration.
This is all bull. There are skilled IT forensics people who can access these devices without the owner's permission, as long as they seize them while powered on.
As far as I know, the key needs to be kept in RAM, which means it is vulnerable while the device is on.
This only prevents dragnet surveillance; nothing more.
The Manhattan DA didn't invent dragnet surveillance. It's way beyond his capabilities, and he probably has zero access to what the NSA has captured. Besides, the NSA is not stealing data from local storage. They're getting it from any site that's storing user data.
> Earlier today Vance gave the keynote speech at the conference, hosted by the Federal Bureau of Investigation, saying he was going "rogue" by speaking out on the matter. He made an emotional plea that police might not be able to stop crimes against children or solve murders without access to the data.
They somehow solved these crimes before the advent of these devices.
Also, I personally regard stopping the NSA from spying on Americans - all Americans - without cause to be stopping a bigger, and more important, crime than stopping the number of crimes that the encryption would stop them from solving. It's stopping a crime against millions of people that corrupt government officials have not only refused to properly investigate and prosecute, but have shielded for personal gain, knowing that they were circumventing the law of the land.
Stopping rampant corruption is a good thing, and it's sad that it's fallen to public companies rather than government prosecutors.
> Also, I personally regard stopping the NSA from spying on Americans - all Americans - without cause to be stopping a bigger, and more important, crime than stopping the number of crimes that the encryption would stop them from solving.
I think it pales in comparison to spying on all unAmericans.
So they're saying companies should force consumers to forfeit their right to privacy? If a prosecutor wants access to a person's encrypted data, they should go after that person, not the company providing the encryption - they might as well ban HTTPS if they're really going in this direction.
If only there were a law whereby they could describe what they were looking for on the device, demonstrate to a sufficient standard why they needed access, and then get that access. Maybe that would solve their problem?
But there isn't. If police go to a judge and show probable cause, they can get a warrant and break into that locked shed they think holds evidence. A warrant doesn't help against an encrypted phone. Not that I agree with Vance's position, mind you.
If the police can describe what they are looking for on the encrypted volume and get a warrant for it, then as per the Fourth Amendment you provide the item or go to jail for contempt. ("We want the spreadsheet describing the fraudulent transactions.") There was a case recently that ruled this way.
If they can't do that then it's a fishing expedition and you can't be compelled to decrypt the volume. There was another case that ruled this way.
Both are completely in line with the fourth amendment and IMHO correct.
However, I think that in the case of a password or passphrase (i.e. something that you know) that would trigger decryption of such a spreadsheet, you can refuse to give it up under the Fifth Amendment, at least under current caselaw.
I think that part is up in the air and will probably be decided by the supreme court sometime in the next few years.
Encryption on a per-file basis is somewhat interesting. You've provided the item requested, but do you have to provide it in an understandable format? An interesting analogy would be a document written in a notation (say, shorthand) only you could understand: could you be compelled to provide a translation?
I believe judges have compelled people to surrender decryption keys.
It is in concept no different to having a magical safe which can't be opened by any method except the owner's key. The court does have the authority to compel you to surrender that key (or go to jail for contempt).
Of course we all know this is really about fishing expeditions and not warrants issued on probable cause.
> I believe judges have compelled people to surrender decryption keys.
Doesn't seem to be a widespread practice. As far as I can see, the only Federal Appeals court case to have confronted the issue ruled against compelling a suspect to decrypt their own laptop.
> It is in concept no different to having a magical safe which can't be opened by any method except the owner's key. The court does have the authority to compel you to surrender that key
It is different, but only because you have the explicit right to remain silent. The US government can legally compel you to do lots of things, but not talk (or otherwise communicate information that's potentially against your interest...generally).
But what about a safe? Isn't that the same? And if safes are required to have a back door, what keeps it out of the hands of thieves? And when does a bank vault differ from a safe? So you want high-profile theft to be made easy just to make investigations easier? I don't think so. Rarely will an investigation come from, or hinge on, the data on a phone alone, nor should it.
If the government weren't trying to run illegal phone searches and broadly collecting data, then Apple and Google wouldn't be under consumer pressure to encrypt. Not to mention a lot of said pressure came from people in the government.
More like lazy and unwilling to learn. You have to wonder if the guy saying this has even made a cursory effort to investigate the issue, or if he is just doing the only thing he knows how to do: run his mouth.
I was being sarcastic. Obviously he didn't. But he thinks of the children. After all, cases where a child is kidnapped, we have the suspect and his phone in custody, but we cannot see the data inside (though we do have metadata about his movements) which is critical to finding the child, are just so prevalent ...
No, it was public and hosted by Fordham University. It did have 'authoritarian' sponsorship, though, by the FBI and Deloitte. http://iccs.fordham.edu/program/iccs2015/
There is something about technology that gives law enforcement a tingle. Look at Vance's high-profile cases (http://manhattanda.org/press-release): arson, multiple cases of embezzlement, securities fraud, sexual assault... Not one case that needed to break encryption to convict.
The lesson is that as soon as they realize they can't do anything about it, it will cease to be a hot button issue. Just like the rest of the limitations on law enforcement. Encryption should be pervasive and routine. Then it will be ignored.
Is Vance really doing his job? The article quotes him: "It's developed into a sort of high-stakes game... They've eliminated accessibility in order to market the product." This implies it is something the average consumer wants -- there aren't enough criminals to make it marketable on their own. Vance should not be trying to prevent it just because blocking it might save the children. Crimes were solved before, and they will continue to be solved. People like Vance need to be fired for not doing their job.
"He made an emotional plea that police might not be able to stop crimes against children or solve murders without access to the data."
It's FOR THE CHILDREN! Of course.
Frontline has a great 2-part documentary on the NSA mass-surveillance program called "United States of Secrets" [1]. (It is available on Netflix)
One of the striking things I took away from that documentary was the rhetoric used to dissuade would-be leakers, whistleblowers, and media outlets that had learned about "the program". In almost every case they would be brought into the White House and briefed in an effort to convince them to keep their silence. The common refrain, related by all the different parties, was that the White House (both administrations) would consistently tell them: if you do this, the blood of the next terrorist attack will be on your hands.
Sadly, this sort of rhetoric really worked, and still works, on people.
I love this quote... "Now that means we have to figure out how to solve a problem that we didn’t create."
But they did create the problem, via excessive blind collection of data and illegal warrantless searches -- falsified investigation trails and a number of other issues notwithstanding...
It'd be like telling safe manufacturers that they have to design in a trivial bypass for their safes so that law enforcement can get in easier... it wouldn't fly.
Well, more strongly than that, I don't see where there's any "problem". Police agencies can already obtain warrants to force someone holding encrypted data to decrypt it. I've not seen anyone seriously claiming that there's a problem with obtaining data that's been requested in this legal manner.
They're trying to solve a problem that doesn't exist, and paying for their solution with our privacy.
If you see this coming down the road, the way to stop it is through political grassroots action.
And before the cynics chime in with all the reasons that can't work, consider that it DID work when this exact same issue came before Congress in the late 1990s.
I wonder whether there is any legal basis for the idea that technologies should be circumventable by law enforcement. I'm not talking about particular laws but rationales -- something in a judicial decision or common law.
I also wonder whether safe manufacturers ever were obligated to make safes that were accessible by law enforcement.