FBI’s Comey Concedes Mistake Was Made Over iPhone in San Bernardino Case (wsj.com)
293 points by gist on March 1, 2016 | hide | past | favorite | 204 comments


"During his testimony today, Comey dismissed the notion that Apple’s assistance in the San Bernardino case would impact other phones, reiterating his belief that any code Apple created to help in this case would only work on Farook’s phone."

And that belief is based on what exactly?

Apple has been saying the opposite. Apple doesn't know its own code, but the FBI knows it better?


It's true, but very narrow, I think. The FBI means that Apple could sign the update to work only on that phone. Apple means that once the compromised version of the OS is built, the only thing stopping it from becoming widespread is the device ID check, which could be re-pointed at other phones or taken out entirely.
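Both readings can be made concrete with a toy model. Here an HMAC stands in for Apple's real asymmetric signing key and every name is made up: the signature covers the firmware together with the target device ID, so the same signed build is rejected elsewhere, but re-targeting it is only a re-sign away.

```python
import hashlib
import hmac

# Stand-in for Apple's private signing key (illustrative only).
APPLE_SIGNING_KEY = b"stand-in for Apple's private key"

def sign_build(firmware: bytes, device_id: str) -> bytes:
    # The signature covers both the firmware image and the target device ID,
    # so the same image re-targeted at another device fails verification.
    msg = hashlib.sha256(firmware).digest() + device_id.encode()
    return hmac.new(APPLE_SIGNING_KEY, msg, hashlib.sha256).digest()

def device_accepts(firmware: bytes, device_id: str, sig: bytes) -> bool:
    return hmac.compare_digest(sig, sign_build(firmware, device_id))

fw = b"compromised iOS build"
sig = sign_build(fw, "TARGET-PHONE")
assert device_accepts(fw, "TARGET-PHONE", sig)   # the one phone: accepted
assert not device_accepts(fw, "ANY-OTHER", sig)  # re-targeted: rejected
```

Both sides are right in this model: the signed artifact only runs on one phone, yet producing a copy for the next phone takes one call to `sign_build`.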


Right. If you look at it from a security point of view, once the compromised OS is created you've created a much more valuable and vulnerable target for hacking.

Let's say that some attacker wants to create a compromised OS and install it on a certain device.

If apple never creates the compromised OS, they would need to hack into apple, get all of the source code necessary to build iOS, figure out how to build it, figure out how to modify it in the desired ways, how to get it installed on a phone, steal the crypto keys necessary to do the signing, and sign the bad build.

If apple has created the compromised OS, they would just need to hack into apple and get the compromised OS build, steal the crypto keys, and sign it.

The first scenario is a large-scale software engineering project. Anyone that's been given a large source dump will tell you that it's horrible and takes forever to do anything, and iOS is going to be absolutely huge and tricky. You'd need a large, highly trained team of security/OS devs, which is hard to come by and would be extremely expensive.

The second scenario could conceivably be done by a single hacker, if they can find vulnerabilities in apple's security.


Apple also has a huge firewall (in the figurative sense) right now in that it is a large amount of effort for them to create this new security-relaxed version of the OS, and the government can't be compelled to force them to produce it.

Now, let's say that they have written it for some reason, but it is restricted to a single device id. Well, it's now a lot easier for the government to compel Apple to hack another phone, because they can creditably argue that all Apple has to do is change some string constant and re-sign the package. The burden of work is now much, much less than if the tool itself doesn't already exist.

Apple doesn't want to ever create the tool. If they have to create it for any reason, even if it starts out being locked to a single device id, they've lost the war.


While it makes things easier, you don't need the source code to break protections.

Crackers broke copy protections for decades without having access to source code of protected games.

The only thing you would need is access to the private key needed to sign the new code so that the phone will accept it, but even that could be worked around by a hardware engineer.

Anyway, the whole thing does not make much sense. Those shooters are already dead, they destroyed their personal phones, this was a work phone, and the FBI can already get metadata (outgoing/incoming calls etc.) from the cell provider. The FBI went public with this even though it would be in their best interest to do it secretly. What does the FBI expect from doing this publicly? Did they expect us to cheer for them and complain about evil Apple not helping to break the evil terrorists' phone?

It doesn't make much sense... unless the real goal was to make people trust Apple more after Snowden's disclosures. Isn't it interesting that Google, Facebook, Microsoft... every company previously involved in PRISM is supporting Apple? Trusting them benefits both the agencies and those corporations.


> Did they expect us to cheer for them and complain about evil Apple not helping to break evil terrorists' phone?

I think that is exactly what they expected. Terrorists and pedophiles are the best way for federal TLAs to expand their powers.


Except digital signing makes the compromised OS totally and utterly useless for other phones. Changing the OS would cause the signature check to fail.

And if you can get around the digital signing, you don't need the compromised OS.

Comey's technical interpretation of the Apple deliverables is right. There's a legal-precedent question which could lead to reuse (and is rightly a matter for debate/utter refusal of the FBI position), but if you debate just the technical merits, Apple has been very misleading about the consequences.


For this one case you're right because the FBI will allow Apple to lock the special OS build to this device's ID. The problem is if the FBI can force Apple to create a special build to order, with features specified by the FBI, they can also be ordered by the FBI to create an OS build that isn't locked to one device. And if the FBI can make them do this, so can any law enforcement or government agency capable of finding an amenable judge, such as say the CIA, the DEA, the NSA, or any random public prosecutor. THAT is the problem.


Not to mention that once this is done in the US, what is to prevent other governments in countries where Apple does business to compel Apple to do the same?

China (or Russia or Germany or whoever) could force Apple to backdoor phones used by CIA informants in that country.


And who is to assure that Apple doesn't leave one or more bugs around that make the compromised OS not as tied to a single device as they meant it to be?

It's a ticking bomb, man.


The FBI can't legally do any of that.


"Except digital signing makes the compromised OS totally and utterly useless for other phones."

This carries with it the assumption that the digital signing and verification mechanisms are infallible and impervious to attack. That is an unwise assumption. Even if a software system appears to be perfectly secure at a given time, it is reasonable to assume that at some point a vulnerability will be discovered.


> And if you can get around the digital signing, you don't need the compromised OS.

Not necessarily. Someone could get their hands on the signing keys or find a vulnerability in the signature verification without having the knowledge or resources to create something worth signing. Or figure out a way to bypass the check by changing something that isn't covered by the signature, or use something like rowhammer or hardware hacking to flip the bit from saying the check failed to saying the check passed, etc.


It is useful on other phones if someone figures out how to hack whatever mechanism is used to do the phone ID check. If that happens, suddenly this patch works vs all phones


> And if you can get around the digital signing, you don't need the compromised OS.

signing <> encrypting


Find?

Wait three weeks or three months for the FBI to request n copies of the evil thing, tailored to each of the n phones it wants to open. Better still, wait for a few others to make similar requests. Now penetrate or impersonate a law enforcement agency of your choice and send Apple a routine request for the n+1th copy, tailored to the phone of your choice.


> steal the crypto keys

Once you have done that, the other steps are easy.


That is an interesting assertion that you do not back up in any way. I don't know about you, but I don't see the other steps as anything like easy. And I've been doing this software thing for a while now, so I think I have some benefit of experience to draw on.


Every single version of the iOS kernel has been dumped. That gives you most [0] of what you need to craft a modified version. The largest barrier to running these modified versions is getting the target hardware to accept them as authentic. All public bootrom/iBoot exploits on the iPhone 3GS/4 patch the bootloaders' RSA authentication out in some form or another. There are no public bootrom exploits out for iPhone 4S+ devices.

Thus, having the signing key (or the power to compel signing at will) is an incredible ability privy only to Apple.

[0] Some Mach-O information is lost. Decryption of the imgX formatted kernel is preferable.



How hard do you think it is to sign software with keys you already have? Cracking software to avoid erasing the device and to support talking to the security hardware without a timeout... I don't even know what one would think is hard there.


Why do you say that the first scenario requires a large-scale software engineering effort? They don't want new features, or major changes to the crypto systems - just disable the function that causes the phone to lock for longer and longer times when wrong PIN codes are entered.

Comment out the function call. Change the number of allowed guesses to MAX_INT. Change the time increment to zero. Click build.

This is not a hard task!


That's a good point that I was assuming it would be difficult. I've since done a little bit of reading up on what we know about how difficult it would be.

This is the best writeup I found: http://blog.trailofbits.com/2016/02/17/apple-can-comply-with...

So, you'd need an update to iOS/the phone firmware, and for newer devices you'd also need an update to the secure enclave firmware. You can't do anything about the 80ms delay, because that's baked into the hashing function (and changing the hashing function would generate invalid results). The FBI is also asking for the ability to enter passcodes electronically rather than via the touchscreen, which would be new code.

If iOS and the SE firmware are really nicely factored to disable security, and it's not hard to add the new functionality, then this might not be too much work. However I doubt that that is going to be the case. The whole point of the security system is to make it difficult to crack, so there might be other countermeasures involved, tricky dependencies, and low-level hardware hackery. If it were simple to do, why wouldn't it have already been done by others reverse-engineering the compiled code? There is certainly financial motivation to do so.
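The Trail of Bits numbers make the stakes easy to quantify: with the retry limit and escalating delays removed, only the ~80 ms per guess from the key-derivation hardware remains. A quick back-of-the-envelope check (figures rounded):

```python
# Rough worst-case brute-force times once the retry limit and escalating
# delays are gone, leaving only the ~80 ms per guess baked into the
# key-derivation function.
GUESS_TIME_S = 0.080

four_digit = 10**4 * GUESS_TIME_S   # try every 4-digit PIN
six_digit  = 10**6 * GUESS_TIME_S   # try every 6-digit PIN

print(f"4-digit worst case: {four_digit / 60:.0f} minutes")
print(f"6-digit worst case: {six_digit / 3600:.0f} hours")
```

So once the countermeasures are gone, a short numeric passcode offers essentially no protection, which is exactly why the delay/wipe logic exists.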


> Click Build

I'm willing to bet iOS has a huge build infrastructure, many different components, and about a snowball's chance in hell of having a single nice clean Makefile for you to type one command to get a build without access to that infrastructure.


But the point is that Apple has that infrastructure. If the FBI asked the right person or people, I would bet they could get this done on their lunch break.

People are making it out like the FBI is asking Apple to rewrite a big part of iOS. That's not the part of the request that's the problem.


It depends on the source code you find: if you find source code without comments, or even worse obfuscated code, it would be a really difficult task to modify it the right way.


But what it's really about is the legal precedent it would set, allowing the government to force companies to unlock devices for them.


It's not the unlocking that's the precedent. It's the door into "modify your source code in such and such a way."

Come to think of it, why aren't free market standard bearers rallying against this as government intrusion into market features?


The answer is they are, and this is somewhat a litmus test for who actually stands behind free markets and who doesn't, despite what they often claim.

Example from an organization supporting free markets: http://fee.org/articles/apple-defies-fbi/

And then there is no shortage of examples from the presidential candidates who claim to support free markets, yet none are standing behind Apple.


Because it isn't politically expedient for them to do so. Apple is very, very successful, and as such most people (voters, in this case) who are uninformed love the theater and cheap political pot shots lobbed at Apple. It's clear that there is no discussion at a national or international level of the actual implications of what this means, not just for Apple but for the American economy and for software built in the US.

For instance, do you really think Microsoft will be able to sell Office software to the Netherlands government if the DOJ/NSA/whoever can use the All Writs Act to force Microsoft to implement a backdoor into their software? Would the NSA be able to use the AWA in conjunction with an NSL and a secret court to force the hand of companies? Politicians and the public don't really grasp what's at stake here. What we're really talking about is creating a complete and very real artificial handicap for all software companies located or based in the US. Already there is pushback in China, Europe and Australia to ditch American-made software after the Snowden revelations. A ruling in favor of the FBI will only compound and accelerate this issue and will have a marked and measurable effect on the revenues of software and hardware companies located in the US.

While Google/Apple/Facebook/Microsoft/Cisco et al may not be able to relocate, this will definitely cause small and medium sized firms to relocate or possibly to never incorporate within the US to begin with. This may be effectively and preemptively scaring away the next Google. Law of unintended consequences and all that.


Cause most of them never were actually for the free market.

Also: Terrorism.


Correct. That's what I should have written.


"why aren't free market standard bearers rallying against this"

because it has nothing to do with the "free market".


Even worse than that, the precedent will be set that government agencies can ask any tech company to subvert their own security. So TVs, phones, echoes, webcams, computers, analytics.js could all legally be modified to become surveillance devices, and if doing one, why not do them all?


My question is: why hasn't the FBI gone after a smaller player to set this kind of precedent? One that wouldn't have had the huge legal resources to oppose the request that Apple has.

I think what's coming out of this is that the FBI is riddled with incompetence and an inability to face modern threats, plus a silly hubris that is the foundation for silly strategic mistakes.


Presumably because smaller players don't have such elaborate security. Those can always argue that the government should use one of the well-known exploits.

I also could imagine that Apple would aid such a case anyways.


There is nothing in the digital signature check that allows it to be locked to a device, so that's logic that has to kick in AFTER the trusted layer has validated the code. At that point, it is a simple matter of altering the device ID check in unsecured RAM and you've now got another cracked phone.


Actually, I would argue that this is not true. Before installing, the device wants a ticket to be signed by apple that contains a hash of the firmware to be installed, the phone's identifier and a nonce it has just generated. See here: https://www.theiphonewiki.com/wiki/SHSH

So by not signing any requests for that particular firmware hash, Apple can effectively neuter that firmware and make sure it's never installed anywhere but on the target phone.

The problem is though: If apple can be compelled to do this once, they can also be compelled to do this any other time.


That's not part of the boot chain. That's for OTA updates.


So would a signature check in the trusted layer, against a signature generated with the device ID (you'd need to distribute a different binary for every device ID), permit the generation of an OS image that could only run on a single device?


It would, in theory. Though if there weren't any catches with this approach, Apple could have avoided being in this position in the first place.


Once you've changed the device id check code, you'd still need to sign it again if you wanted to distribute it widely


Sure, but once the FBI has forced Apple to write the code, forcing them to update the device ID and sign the new build is trivial by comparison, which means that breaking into any random iPhone will become routine.


Is it possible to change a phone's device id to match the initial target, though?


I believe the FBI is suggesting that Apple tie the update to the phone's IMEI, which I believe phone thieves routinely change by desoldering and replacing a chip.


Apple firmware updates are signed on a per-install basis.


Not sure why this got downvoted. I'm not too familiar with iOS, but AFAIK this is exactly how the SHSH system works on modern iPhones.

Quick googling seems to support this.


I didn't downvote you, but I think you're being downvoted because the information content isn't much more than "but cryptography something something!"

I mentioned that the most common method for uniquely identifying a handset (the IMEI) can be changed by switching a chip on the iPhone's main board. (At least this was true 6 years ago.)

So, unless Apple uses an interactive signature scheme or prevents the FBI/intelligence agencies from ever seeing the signature (using TLS with hard-coded certs), then the signature can be replayed.

If the signature can be replayed, then in order to prevent FBiOS being used on multiple phones, it must be tied to one or more unique identifiers, probably excluding the IMEI.

Many people understood my post as shorthand for the above. Responding to this with "[But] Apple firmware updates are signed on a per-install basis." doesn't add to the conversation unless you provide further details. At least, that's my best guess as to why you've been downvoted.


>I mentioned that the most common method for uniquely identifying a handset (the IMEI) can be changed by switching a chip on the iPhone's main board. (At least this was true 6 years ago.)

https://www.theiphonewiki.com/wiki/ECID Firmware updates use this, not IMEIs. And I think the IMEI is more commonly used to identify the radio, not the device itself. But I could be wrong about that.

>So, unless Apple uses an interactive signature scheme or prevents the FBI/intelligence agencies from ever seeing the signature (using TLS with hard-coded certs), then the signature can be replayed.

Every time you update an iPhone it generates a nonce, called APTicket. Apple signs that, your ECID and the firmware. The nonce essentially makes replay attacks impossible, even if you managed to swap a device's ECID.
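That flow is easy to model. In this toy sketch an HMAC stands in for Apple's real signature and all names are made up; the point is that the ticket binds the firmware hash, the ECID, and a per-install nonce, so a captured ticket can't be replayed:

```python
import hashlib
import hmac
import os

# Stand-in for Apple's private signing key (illustrative only).
APPLE_SIGNING_KEY = b"stand-in for Apple's private key"

def sign_ticket(fw_hash: bytes, ecid: bytes, nonce: bytes) -> bytes:
    # Apple signs (firmware hash, device ECID, device-generated nonce).
    return hmac.new(APPLE_SIGNING_KEY, fw_hash + ecid + nonce,
                    hashlib.sha256).digest()

def device_verifies(fw_hash: bytes, ecid: bytes, nonce: bytes,
                    ticket: bytes) -> bool:
    return hmac.compare_digest(ticket, sign_ticket(fw_hash, ecid, nonce))

fw_hash = hashlib.sha256(b"iOS image").digest()
nonce1 = os.urandom(16)
ticket1 = sign_ticket(fw_hash, b"ECID-A", nonce1)
assert device_verifies(fw_hash, b"ECID-A", nonce1, ticket1)

# Replay fails: on the next install attempt the device draws a fresh nonce,
# so the old ticket no longer verifies, even with a cloned ECID.
nonce2 = os.urandom(16)
assert not device_verifies(fw_hash, b"ECID-A", nonce2, ticket1)
```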


Thanks for the research! If they're signing the ECID using an interactive signature algorithm, then it sounds like they've thought it through pretty well.

> And I think the IMEI is more commonly used to identify the radio, not the device itself.

Across manufacturers, I'm not sure there's another quasi-unique identifier in common use.

> Every time you update an iPhone it generates a nonce, called APTicket. Apple signs that, your ECID and the firmware.

This is one variant of interactive signature scheme.


Yes, but that is not nearly the same burden as actually writing the compromised OS. It's probably as easy to compel them to sign the update as it is to compel them to turn over iCloud data.


They could easily request the device signing keys via a different case, or again using the All Writs Act, stating that it's necessary for whatever.

Not turning over the encryption/signing keys would be followed up with jail time / contempt of court charges for any officers/developers/etc refusing to remand the keys into federal custody.


Yes, someone can modify the OS image to take out the check for the phone ID, and then it can work on any phone.

Also the FBI would want to use it on other phones as well. They might modify it themselves to work with other phones.


Moreover, once the compromised OS is created, Apple will be compelled to unlock iPhones in every country where it does business, including countries like China and Russia.


Apple releases updates for their phones often. They could engineer a hack that can then be patched by new versions of iOS.


> Comey dismissed the notion

Seems just for a moment, because, via Reuters:

http://www.reuters.com/article/us-apple-encryption-congress-...

"FBI Director James Comey told a congressional panel on Tuesday that a final court ruling forcing Apple Inc (AAPL.O) to give the FBI data from an iPhone used by one of the San Bernardino shooters would be “potentially precedential” in other cases where the agency might request similar cooperation from technology companies."

"Manhattan District Attorney Cyrus Vance testified in support of the FBI on Tuesday, arguing that default device encryption "severely harms" criminal prosecutions at the state level, including in cases in his district involving at least 175 iPhones."


Apple is talking about source code, the FBI is talking about a signed binary. I'm fairly certain Apple has the technical ability to create a signed binary that only executes on a single phone.


More importantly, once "GovtOS" (as Apple's filing calls it) is developed -- even if the government is billed $800K for the privilege -- each subsequent writ will be much less expensive to fulfill, creating a tidal wave of LEO requests to unlock phones. So Apple wants to head this off right now, because otherwise the floodgates will open.


Not necessarily. Apple could simply delete all code modified to make that change, necessitating a similar amount of work for each phone unlocked.


Apple has said that for legal reasons, it may be forced to keep the code permanently and will have to secure it permanently out of concern for future legal/court obligations specific to this case.


In support of this, someone forwarded me a very interesting article written by someone who creates forensic software for a living. The legal requirements surrounding the creation of a software tool for forensic purposes, which this proposed effort requested by the government might fall under, are nothing less than herculean in scope.

http://www.zdziarski.com/blog/?p=5645


Zdziarski's arguments are very illuminating on how this is not a simple or one-off request. Excellent read.


I'm sure defense counsel would want to be able to verify that it isn't modifying file access times, or deleting data, or planting data, or otherwise disturbing evidence when the update is put in.


If Apple had designed the iPhone to require user authentication before updating the software/firmware, they wouldn't be in this mess; they would not be able to comply with the court order short of hacking/jailbreaking the phone. If the PIN had to be entered before installing new software, the FBI would first need to know the PIN to load GovtOS, so they could not crack the PIN using this method. And once Apple patches its software to require user authentication before installing updates, it will no longer be able to comply with any similar type of request.


It's a pretty good bet that China, Russia, the NSA, and other state security agencies have access to Apple's source code (not by Apple providing it to them, but by having pwned an employee's laptop). If Apple creates the source code to do this, these state agencies will be a digital signature away from being able to crack any iPhone that ends up in their physical possession. This applies even if Apple deletes the source code soon after providing the binary to the FBI, since it will have been siphoned off the corp network while under development.

Still a good idea?


> "It's a pretty good bet that China, Russia, the NSA, and other state security agencies have access to Apple's source code (not by Apple providing it to them, but by having pwned an employee's laptop). If Apple creates the source code to do this, these state agencies will be a digital signature away from being able to crack any iPhone"

In the scenario you lay out, these security agencies are incapable of writing their own modifications to iOS, even though they possess the source to iOS.

Absolutely ridiculous. If they can steal the source and signing key, they certainly have access to the technical expertise to do it themselves.

I mean christ, exactly how complicated do you think this pin timeout logic is? If they can hire sufficiently skilled hackers, they can certainly hire sufficiently skilled developers.

The security of the system lies in the secrecy of the signing key. If they can meet that bar, they can surmount any other obstacle.


> These state agencies will be a digital signature away from being able to crack any iPhone that ends up in their physical possession.

Which in the current world, is about as far from having an exploit as one can be. Digital signing works pretty well.


Can you explain how that would be implemented cryptographically? Doesn't seem like an obvious feature to have included to me.


My understanding is that when you install iOS on an iPhone, an Apple server signs the OS as part of a challenge-response protocol. The challenge includes a unique device ID, and I believe the signed iOS is only installable on a device with that ID. http://www.saurik.com/id/12 has more details.

Think about this in the context of jailbreaking to understand why such a facility exists. Apple doesn't want users to install their own modifications to iOS, and they also don't want users to install old versions of iOS that have vulnerabilities that would allow people to modify the OS.

One way you could implement something like this is to have a public/private keypair within the device and have updates encrypted with the public key; then design the device to only run an OS that it could decrypt with its private key. To do this well, you would need a TPM that did not allow the private key to leave the device, nor to be reset.
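One way to sketch that idea, with a hash-based stream cipher and a shared secret standing in for a real asymmetric keypair plus TPM (everything here is illustrative, not Apple's actual mechanism): an update "encrypted to" device A is garbage on every other device.

```python
import hashlib
from itertools import count

def keystream(key: bytes, n: int) -> bytes:
    # Derive a keystream of n bytes by hashing key || counter.
    out = b""
    for i in count():
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        if len(out) >= n:
            return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

# Per-device secrets that, in the real design, would never leave the chip.
device_a_key = b"device A private key (toy stand-in)"
device_b_key = b"device B private key (toy stand-in)"

update = b"firmware image for device A only"
ciphertext = xor(update, keystream(device_a_key, len(update)))

# Only device A recovers the plaintext; device B gets noise.
assert xor(ciphertext, keystream(device_a_key, len(ciphertext))) == update
assert xor(ciphertext, keystream(device_b_key, len(ciphertext))) != update
```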


All iOS software updates, even the normal ones, bear a digital signature that incorporates the device's UDID. The bulk of the software update is the same for all devices, but Apple must generate a new signature for each device using Apple's private signing key.


I don't know if Apple has any specific capability as part of the firmware verification, but even if they didn't they could just put something like this early in the boot process:

    if (unique_device_id != SAN_BERNARDINO_DEVICE_ID) {
        halt();
    }
If this code must be signed to execute then it can't be modified to work on another device without Apple signing it again.

This assumes there's a unique device ID that is known to the FBI and can't be tampered with. Maybe the serial number or IMEI?


Fixed that for you:

    if (unique_device_id != SAN_BERNARDINO_DEVICE_ID) {
        goto fail;
    }


My understanding is that phone thieves routinely change the IMEI by desoldering and replacing a chip. If this weren't the case, I think it would be fairly easy for detectives to call up the person currently in possession of any given stolen iPhone.



It looks like there's something called a UDID which is a SHA-1 hash of a bunch of identifying information. So, difficult to fake even if you can twiddle the source values or swap in new chips.

https://www.theiphonewiki.com/wiki/UDID
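The construction can be sketched as follows; the exact field set and concatenation order are assumptions, and the input values are made up. Note that the hash is deterministic: it's hard to forge only if you can't control its inputs, since anyone who can set all the inputs reproduces a given UDID exactly.

```python
import hashlib

def udid(serial: str, ecid: str, wifi_mac: str, bt_mac: str) -> str:
    # SHA-1 over several hardware identifiers, per the wiki's description;
    # which fields and in what order is an assumption for illustration.
    return hashlib.sha1((serial + ecid + wifi_mac + bt_mac).encode()).hexdigest()

a = udid("SERIAL1", "ECID1", "aa:bb:cc", "dd:ee:ff")
assert a == udid("SERIAL1", "ECID1", "aa:bb:cc", "dd:ee:ff")  # deterministic
assert a != udid("SERIAL1", "ECID2", "aa:bb:cc", "dd:ee:ff")  # any change shifts it
assert len(a) == 40  # SHA-1 hex digest length
```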


Except they have the shooter's phone, which has the identifying information which results in the correct UDID. To get the same UDID on another phone they just need to change the source values to the same values as in the shooter's phone. The fact that it's a cryptographic hash doesn't really help here, assuming they can change all the source values at will.


I'm not sure how Apple could develop GovtOS without at least testing it on other iPhones.


Clearly it's based on his deep and extensive 30-year work experience as an information security and cryptology developer at Apple. Additionally, I believe Comey was the first person to jailbreak the original iPhone. (sarcasm)


Comey in testimony today:

"Whatever the judge's decision is in California ... will be instructive for other courts, and there may well be other cases that involve the same kind of phone and the same operating system"

It's a little strange for him to dismiss the notion that this will set a precedent because it will just be for one phone and then imply that this case will set a precedent for other phones.


> It's a little strange for him to dismiss the notion that this will set a precedent

Here's the report in which he confirmed the precedent:

"Comey told a congressional panel" "that a final court ruling" "would be “potentially precedential” in other cases where the agency might request similar cooperation from technology companies."

http://www.reuters.com/article/us-apple-encryption-congress-...


I think Apple signs every single OS installation using a mechanism jailbreakers refer to as SHSH (https://www.theiphonewiki.com/wiki/SHSH), so it could be argued that, yes, Apple is in full control over which phone the firmware gets installed on.

However, if they can be compelled to do this once for one phone, they can be compelled to do this many more times for as many phones as the FBI or everyone else wants.

I would say: Both are right in this case.


Actually he didn't. I don't remember the exact words, but he made it clear that they were interested in setting a precedent with this case, and that's what this whole case is about.


No, he's right.

Apple would have to change some config files to unlock the other phones, resulting in new code.


Comey's magical thinking knows no bounds.


I sat through the YouTube presentation (which is available at https://www.youtube.com/watch?v=g1GgnbN9oNw if you have four hours to spare). In general the discussions were quite well presented, apart from a couple of rabid questioners suggesting that public security trumped any one individual's right to personal security.

I've written up a summary of the hearings at InfoQ here: http://www.infoq.com/news/2016/03/apple-fbi-congress

The general responses from Apple were that it wouldn't be possible to write this for just one phone; once FBiOS was available for one, it would be available for any phone. Dr Landau also highlighted the fact that most people now use their phones as part of two-factor authentication, and that the easiest way to break into a system (such as in the IRS leak) is to compromise login credentials. The fear is, therefore, that once Pandora's box is opened, the operating system would be installable on any device and thus potentially give any state actor access to any account simply through device compromise.

The entire hearing was very well laid out, though in a few cases Apple's witnesses weren't quite as fluid as the FBI's.

The congress members have five days to ask additional questions, after which presumably a report will be made. More information is available at the House page here:

http://judiciary.house.gov/index.cfm/2016/3/the-encryption-t...


It’s fascinating to me how many laypeople are saying “public security is important, therefore we should unlock the phone” while many technologists are saying “public security is important, therefore we should absolutely not unlock the phone”.


>We’ve arranged a society based on science and technology, in which nobody understands anything about science and technology. And this combustible mixture of ignorance and power, sooner or later, is going to blow up in our faces. Who is running the science and technology in a democracy if the people don’t know anything about it?

-Carl Sagan


> “If I didn’t do that, I ought to be fired, honestly,”

There is a serious problem here. If you ask me, any government official who forces a backdoor down the throat of a private company ought to be fired, yet he believes he should be fired if he DOESN'T do it.

There's a very big divide happening right now in the US. This is going to be a rough time.


I disagree. The FBI's job should be to pursue any and all opportunities at gathering evidence for its cases, within the bounds of the law. It is the job of the courts to sketch out those boundaries by interpretation and precedent. If everything is working properly, the courts will decide that the All Writs Act does not grant the Government this power. Then this matter will be settled and the FBI will have to find some other avenue.

Looking at it from the other angle, I think it's a bad precedent if the FBI were to refrain from pursuing an avenue which it believes it may have legal standing to pursue. At that point, it runs the risk of a downward spiral of second-guessing and apprehension.


I don't have a problem with the FBI making a request that is in accordance with their jobs. Let the courts sort it out.

What I do have a problem with is this little fucking PR parade Comey's going on to try to lobby the public against Apple in particular, and privacy and cryptography in general. He's been completely rhetorical this whole time, ignoring the fact that what he's asked for is a huge fucking deal, and forcefeeding "terrorism", "public safety" and other trigger words to get the public to rally behind something they couldn't possibly understand.

Make the request and shut your stupid mouth, Comey.


Consider the Geneva Convention. It's every combatants' obligation to see that the Convention's requirements are upheld. Disobeying direct orders to the contrary is a requirement.

This ... creates certain elements of friction and pressure.


In no world can the FBI be viewed as 'combatants.' They're a civilian law enforcement agency, and the Geneva convention definitely _does not_ apply to them. Ethically, yes, they shouldn't be doing this, but we can't pretend rules of war apply to a civilian agency with 0 role in combat.


My point is that high-order directives -- to obey the terms of the Geneva Convention, or to uphold the rights enumerated and limits of government stated in the US Constitution -- can and do in fact devolve upon all whom they govern.

Not merely courts.

Yes, I expect that courts would be tasked with drawing boundaries where extraordinary discretion is required. But throwing up your arms and saying "hey, we're the executive branch, not my problem" doesn't cut it. In fact that can very well land your ass in gaol or at the receiving end of a civil rights lawsuit (modulo limitations on liability for government officials, but that's another worm of cans).

Incidentally, I strongly recommend developing at least one or two abstract thinking skills if you're going to play around here. Helps smooth things along.


I strongly recommend you learn to debate without resorting to ad hominem attacks. Helps keep things thoughtful and productive, and not just here either. :)


That's not an ad hom. It is pointed personal commentary.


You're wrong.


An ad hominem isn't a random insult. It is dismissing an argument because of an irrelevant personal trait. E.g., you're double-jointed, therefore you know nothing of erlang lathe forming.

Relevant personal attributes, especially those which address credibility, are generally in bounds (see also "the asymmetry of bullshit", a/k/a Brandolini's law).

And observations of personal character not used as basis for argument aren't ad homs. Full stop.

http://www.nizkor.org/features/fallacies/ad-hominem.html

If you're going to use logical fallacies, try to use them correctly.

And while the initial comment whose tone you disagreed with, and the paragraph above, may not be seen as acceptable on HN (I tend to feel they skate within bounds), both point out a category of error and steps a person might take to correct it: insufficient abstract interpretation in the first case, incorrect use of specific terminology in this one.

Of course, you could simply insist on being wrong. But that's your call.

Cheers.


Yes, I agree it wasn't ad hominem, but you should really back up your insults. You just threw that out of left field, and provided no evidence that I lacked 'abstract thinking skills' or how they related to the rest of your comment at all. If the 'abstract thinking skills' remark was about inferring the context of your previous comment, I really hope you never accidentally miss some context in your life.

You can prove your point without being a dick, especially when being a dick does nothing to help prove your point. A little politeness goes a long way.


The parent comment was not making that comparison. They were not trying to imply that the FBI is a combatant on a battlefield. It was meant to be an example of friction caused by multi-directional pressure (in the FBI's case, to strictly obey the law and yet to be aggressive in solving crimes by any means within the law).


I guess I'm confused then, because that's precisely what the FBI is doing, and that's exactly what the grandparent comment is doing. The FBI didn't try to side-step the courts, it brought it to them directly and is indeed strictly following the law.


> it brought it to them directly and is indeed strictly following the law

If this is your definition of "strictly following the law", imagine yourself in a scenario where the FBI brings a lawsuit against you: you can risk going to court, losing, and going to jail, or hire a lawyer and fight something with dubious constitutional grounding.

There are so many different ways to lose when the FBI plays this game against private companies or individuals. They sue you because they want to put you in jail, not to clarify the law.

If you're a strongly opinionated CEO of a company with a war chest you fight it and at the very least stay out of jail. If you're the CEO of a small ISP/web hosting company you suck it up, do what they say, and follow the gag order.


What would you propose, then? Obviously someone needs to fight out these questions with law enforcement; we can't just tell the FBI "you just have to follow the law exactly as it's written, you can't challenge anything", because that would severely tie its hands.

I do agree it's not quite fair to put defense costs on the company/individual that needs to fight it, but the alternative I can think of is a public defender type system that I don't think many people would be happy with for corporations.


> "you just have to follow the law exactly as it's written, you can't challenge anything"

Isn't that what we expect from all people and institutions? Moreover, we expect the people to be in control of "challenges" to the laws, and of the laws themselves. Much of what's wrong with government comes from government creating self-serving laws.


Laws are rigid and don't keep up with the changing times. You might argue that that's the government's fault for not passing new laws, and that's a fair point. However, the judicial has set a precedent over the past 200 years or so of interpreting laws to fit with changing circumstances. Given that's the case, it's hard to blame the FBI for taking advantage of that.

The laws are what the courts interpret them to be; no more and no less. Like it or not, strict interpretation is a fantasy and has been since the late 1700s.


Substantively correct, yes (as said commenter).


The FBI is part of the executive branch; as such we expect it to do the bidding of the government, which in turn we expect to do the bidding of the populace. As such, no, of course the FBI should not pursue every legal opportunity when doing so would run counter to the government's intentions or beliefs.


> FBI's job should be to pursue any and all opportunities at gathering evidence

That's only true in a vacuum. In reality, every tinpot tyrant on the planet would like to have the US set a low bar so they can insist that US tech companies agree to their similar demands.


And because we have a three branches of government, it is up to the courts to make sure that the bar is set very high...


True but irrelevant. It isn't inherently law enforcement's job to probe constitutionality. That's like saying law enforcement should probe just how much force they should use against you before they get sued by the DoJ.

It's just as valid an argument to say they should be cautious about constitutionality and still get their job done.


It's hardly irrelevant. The parent post said "every tinpot tyrant on the planet would like to have the US set a low bar". In this case, it's important to remember that the executive isn't the only branch of government to have an opinion here.

No one said anything about probing constitutionality. There is a legitimate question about the limits of the All Writs Act. So they asked a court to issue the request.

Law enforcement must work within the confines of the law, but when there is a question of what those confines are -- and in the legal sense, that is most definitely an unanswered question here -- they ask the courts to decide.

Once the courts rule, I expect for the FBI/etc... to act in accordance with the ruling. And if you don't like the ruling, talk to Congress.


Honestly, intentionally doing anything that you knowingly believe to be against the Constitution should be a fire-able offense.

Upholding the Constitution is basically your primary job responsibility in Law Enforcement and the Military. No other duty should supersede that duty.


But, it's not the role of law enforcement to interpret the Constitution. So in cases such as this, where there isn't really any precedent or the law is ambiguous, the proper location for the decision is the courts.


> But, it's not the role of law enforcement to interpret the Constitution

You are correct that law enforcement does not provide the authoritative interpretation of the Constitution, but you seem to imply that they should operate under ignorance of the Constitution. This is wrong.

Law enforcement must have an understanding of the Constitution, and they must uphold it. For example, it's a dereliction of duty to always argue that any search is reasonable.


I implied nothing about the ignorance of the Constitution.

Law enforcement must have an understanding of all laws (for their jurisdiction), including the Constitution. However, when there is a legitimate question, it's not up to law enforcement to make the final judgement calls. It's for the judiciary to make that call.

It's fair to say that this is an argument that has been building for a while.

As such, it's very appropriate for the courts to make this call. And once that decision is made, it's up to law enforcement to act accordingly.

Note: I think the California court will reach the same decision as the Brooklyn court. But I don't have a problem with the DoJ/FBI raising the issue.


> However, when there is a legitimate question, it's not up to law enforcement to make the final judgement calls. It's for the judiciary to make that call.

That's not actually true. The executive and the legislature all have to swear to uphold/defend the constitution and it takes all three branches of government to violate it. If Congress says that something is unconstitutional and refuses to pass a law permitting it then the executive can't do it. (The FBI is nowhere in the constitution, its very existence is at the will of the legislature.) If the executive branch says that something is unconstitutional then they can refuse to do it. Nobody can force the executive to prosecute someone under a law they think is unconstitutional.

People only see the courts as the arbiters of the constitution because they're last. You only get there if the legislature is willing to pass the law and the executive is willing to enforce it. But that is a piss poor excuse for the other branches of government to neglect their oaths.


Actually, it is the role of law enforcement: https://www2.fbi.gov/publications/leb/2009/september2009/oat...

I [name] do solemnly swear (or affirm) that I will support and defend the Constitution of the United States against all enemies, foreign and domestic; that I will bear true faith and allegiance to the same; that I take this obligation freely, without any mental reservation or purpose of evasion; and that I will well and faithfully discharge the duties of the office on which I am about to enter. So help me God.


Where does interpretation enter into that oath?


> Where does interpretation enter into that oath?

In order to support and defend the Constitution, you must have an understanding of what it means (unless you interpret the oath -- itself part of the Constitution -- to mean physical support and defense of the physical document.) To have an understanding, you must interpret. The oath, therefore, requires interpretation.

(Now, the Constitution itself gives the judiciary the role of resolving controversies arising under the Constitution, which includes disputes arising from differing interpretations. But people -- including executive officers -- have to have interpretations before they can get to the point where such a dispute arises for the courts to settle.)


You can't read words without interpreting them.

Technology will always be moving faster than courts can define how the law applies, and thus interpretation is a part of law enforcement's daily job.


> Technology will always be moving faster than courts can define how the law applies, and thus interpretation is a part of law enforcement's daily job.

Imagine using the same logic in gun control. Imagine there were no laws against certain kinds of ammunition and law enforcement just went around declaring certain guns illegal based on what was in the officer's coffee that morning. People always throw around the phrase "we're a constitutional republic, not a democracy" when I say the Connecticut Compromise ought to be scrapped, yet the same people are for a massive overreach by law enforcement.

Imagine someone shot and killed you in the street for no reason and said they thought it was their right. Would that absolve them? No. Neither should this kind of overreach by the FBI be legal.


It doesn't. If anything it implies "back off and behave." "Defend" doesn't generally mean "probe it for loopholes."


There is the situation where defending the Constitutional order against a clear threat most effectively demands actions that questionably impinge on some rule of the Constitution; in that case, it's at least arguable that working right up to the limit -- and perhaps even testing the limit -- is demanded.

Clearly, gratuitously testing the limits of the Constitution is not demanded by the oath, but I don't think that's the claim Comey was making. (Note, I'm not saying I agree that this is a case where the expected value of breaking into the phone in question, for protecting the U.S. and its Constitutional order, justifies testing the limits of law enforcement authority; I'm just pointing out that it's not unreasonable to think there are situations in which upholding the oath might reasonably be seen to require testing the boundaries of Constitutional authority.)


> There is the situation where defending the Constitutional order against a clear threat most effectively demands actions that questionably impinge on some rule of the Constitution; in that case, it's at least arguable that working right up to the limit -- and perhaps even testing the limit -- is demanded.

Which, while technically correct (The Best Kind Of Correct), isn't the sort of thing the FBI runs into very often. Because the FBI is not constitutionally obligated to succeed in apprehending every criminal. If the FBI wanted to embody the principle of letting 10 guilty men go free before convicting one innocent then they would in practice run into a lot fewer prickly constitutional edge cases.


I think it's a good thing my doctor doesn't stab me in random places to defend my health. There might be some case where that's good medicine but I'm having a hard time with concrete examples.


It's the role of all three branches of government to interpret and obey the Constitution. The courts are just the ones with the final say on which interpretation gets accepted.


> Honestly, intentionally doing anything that you knowingly believe to be against the Constitution should be a fire-able offense.

It's already not merely a firing offense but a federal felony for law enforcement officers and other public officials. But the FBI is the lead agency enforcing that law, so it's probably unlikely to be effectively enforced when the Director of the FBI is involved. [0]

[0] https://www.fbi.gov/about-us/investigate/civilrights/color_o...


Not taking sides here, but this view is fine as long as everything is explicitly detailed in unambiguous language.

For example, the 4th Amendment protects against "unreasonable" search and seizure. There is a lot of leeway in what is or isn't reasonable.

If you believe in the system you should be glad it is going to the courts - it is the job of the judicial branch to interpret the law. And however it comes out will be a precedent for the future.


It more or less is: depriving someone of rights while acting under color of law is criminal (1). The trick is getting this law actually enforced.

(1) https://www.law.cornell.edu/uscode/text/18/242


>“If I didn’t do that, I ought to be fired, honestly,”

Let's take this on face value. He's suggesting POTUS would fire him for not fighting for backdoors, when POTUS has publicly come out against backdoors. So what is going on here?

I suspect Comey is just a fucking idiot. He's one of these old-school guys who thinks LE should wage an endless war on the citizenry. I suspect we're destined to have this fight every so often, and from a more practical perspective, not every company is Apple or has the financial and political power Apple has. How many others have agreed to the terms Comey demands? Perhaps many.


> Let's take this on face value. He's suggesting POTUS would fire him for not fighting for backdoors, when POTUS has publicly come out against backdoors. So what is going on here?

Some people (Comey) believe that what is said in public has no value? Or that those same people might instead believe that the POTUS is not their boss...


> I suspect Comey is just a fucking idiot.

I'd argue that he's being intentionally deceptive. For example, he testified* that "he's not a good lawyer", when in fact he served as the Deputy Attorney General during the Bush administration.

* http://www.motherjones.com/mojo/2016/02/james-comey-ducks-mo...


I know this sounds very tinfoil hat ish but I wonder whether something like this was the reason why Truecrypt shut down.


> During his testimony today, Comey dismissed the notion that Apple’s assistance in the San Bernardino case would impact other phones, reiterating his belief that any code Apple created to help in this case would only work on Farook’s phone.

I hate this kind of willful ignorance. That update, if properly signed, will work on similar phones. The software is distinct from the signing and it's the software that Apple doesn't want to create.

Or maybe Comey believes that every time his phone is updated, an engineer in Cupertino lovingly arranged the bits for him.


Artisanal bit wrangling. As a service.


Even if he's right, it sets precedent, which has the same effect. Doesn't really matter if Apple has to rewrite the code each time or just re-upload it each time.


It's outright deception, not willful ignorance.


Willful ignorance, huh? You're charitable.


Manhattan District Attorney Cyrus Vance sums up the DOJ's position quite succinctly. Emphasis added

> "Apple has created a technology which is default disk encryption. It didn't exist before. It exists now. Apple is now claiming a right of privacy about a technology that it just created. That right of privacy didn't exist before Apple created the technology." [1]

Wrong. The first and fourth amendments grant rights to privacy. The exact transcripts of what we say in the privacy of our homes, prior to a warranted wiretap or without witness testimony, are not the subject of law enforcement's investigation. Our entire history of digital communications should not be open for government surveillance. It would be overreach to try to implement, and anyway it is impossible to guarantee without destroying the US tech industry and turning us into a big brother state.

[1] https://youtu.be/g1GgnbN9oNw?t=4h46m22s


The fourth amendment specifically requires a warrant to violate someone's privacy. In all of the cases at hand a warrant has been issued.

There really isn't a fourth amendment issue.


You're right, and thanks for correcting me.

I do still disagree with the District Attorney's statement. We can write something on a piece of paper and burn it, or think something in our heads, and keep it private. That this is now possible with digital communications via encryption is not a new right, it is simply a new means to protect that right. Further, Apple didn't invent encryption, and this statement further shows how little the DOJ understands about technology.


Vance seems to be confusing the right to privacy with the ability to actually enforce it. Apparently, a "right" to privacy is all well and good, as long it can be casually violated and a warrant served later. As soon as the mechanism protecting your privacy has real teeth, then it's a problem.


> That right of privacy didn't exist before Apple created the technology.

Wait, is he actually arguing that the right of privacy didn't exist until a few years ago? There's literally decades of precedent, including digital precedent, for a right to privacy.


The FBI should offer to unlock the phone only if they arrest Hillary;)


The right to privacy stops when you put lead in a bunch of people. There isn't a textual right to privacy in the Constitution in the first place; it's a right that arises from the interpretation of other rights.

What's at stake here is the power of government to compel testimony. They already have the phone, but they lack the power to compel its unlocking.


> The right to privacy stops when you put lead in a bunch of people.

That's not how rights work.


Uh, in the case of search and seizure, it explicitly is. The Fourth Amendment provides a standard for searching the property of criminals. Such searches are carried out routinely with no issues. There are other standards for "violating" other rights to privacy also.

I'm not saying criminals don't deserve rights. In fact, what's at stake here is that the Fifth Amendment does protect a mass murderer from revealing his passcode, so I'm not sure what you and all these downvoters are going on about. If criminals didn't have rights there wouldn't be an issue here.

If the passcode on the phone were crackable, it would be absolutely 100% legal to search it. And that kind of thing happens routinely. The issue at stake is not searching the phone, but compelling Apple to help with the search.


"Oh wait, our attempt create a backdoor to every private citizen's iPhone, using a single terrorism case as a trojan horse, has been discovered? Oops sorry but we still need those backdoors."

It also amazes me that he thought it was a helpful metaphor to say that the FBI doesn't want a backdoor, just for Apple to take the “vicious guard dog away” and “let us pick the lock.” I think every American would rather have a guard dog protecting their personal property and data than let these liars pick the lock.


The vicious guard dog metaphor is very poor, most likely planned to intentionally confuse non-technical people.

The reality is that he is asking Apple to weaken the front-door to the point where an average attacker can open it with a paper clip.


Comey also said,

They [Apple] sell phones, they don't sell civil liberties, they don't sell public safety, that's our business to worry about. [1]

He thinks public safety is entirely his domain and private companies cannot help people keep themselves safe. That's as bad as "self-made" business owners who cannot see that roads and infrastructure helped make their businesses successful and cry socialism whenever taxes are levied.

[1] https://youtu.be/g1GgnbN9oNw?t=3h16m18s


As technology has become more and more pervasive in society the role of security, at least in the preventative sense, has fallen more and more on the people who make the technology.


Does anyone know what happens if Apple loses engineers over this? Not that it's likely, but I could see someone saying "I came here to build secure systems, not break them". Does the government bear costs to replace or retrain? If enough quit, can the government compel them to stay to complete the task?


This is an excellent question. Congressman Darrell Issa had some great questions about this during today's hearing. Basically he hinted at the question of how far the government can go to ensure the job gets done correctly.


"We made a mistake, but you must still help us!"

You know, the sad part about this (paraphrased) quote is that it'll probably be used as justification by the California court system to rule on this case specifically rather than on the circumstance more broadly. I wonder if that's Comey's strategy in admitting the error.



Is there some reason this doesn't work for me?


Works for me. The "article" is super short anyway. The walled part barely adds anything


They updated recently so only those with Google bot user agents can break the glass


That link doesn't work for me either.


The Congressional hearing is happening now: http://www.c-span.org/networks/?channel=c-span-3


Lots of interesting stuff in this session.

"We are a rule-of-law country. The FBI is not cracking into your phone or listening to your communications except under the rule of law," says Comey (1:37). I suppose that's the NSA's job...

https://www.youtube.com/watch?v=g1GgnbN9oNw


TL;DR: "Yadda yadda terrorists something something guns kill people blah blah blah think of the children yadda AMERICA."

Truly, I'm getting weary of all the nearly-identical news of Uncle Sam's assault on digital privacy. If the strategy is to desensitize and wear down the US populace, it's working.


This seems a bit more complicated than simply saying they made a mistake, though. If they didn't do this, someone who knew the password could have wiped the backups or changed other info, and then everyone would be yelling at the FBI for not having the employer change the password. So it's a choice between potentially leaving the device/backups open to tampering, and whatever the hell the situation is now.


I'm putting on my pragmatism hat.

I can't understand why Apple are fighting this. They're essentially forcing the government into resorting to legal mechanisms that will have longer term impacts.

It's pretty clear to nearly everyone that this phone belonged to someone who committed a terrorist attack. The data on it may prove useful in preventing future attacks. If Apple were being cooperative, they could create the mechanism to get the data off the phone, have the FBI hand over the phone so they could do that for them, (optionally) delete that version of the OS if there's a problem with it even existing, and no legal precedent would be set. The question of whether the FBI should be able to compel them wouldn't arise, because they would have been cooperating of their own accord.

If the FBI asks them to do this to the iPhone of someone who is more likely to be innocent, perhaps then this debate should be had. For now, they have the means, can do it without setting a precedent, have a pretty good reason to do it, and can do it in isolation. They should just do it, in my opinion.


It isn't as simple as creating the tool; read this blog post, linked elsewhere in this thread, which explains why they can't just create and then delete it.

http://www.zdziarski.com/blog/?p=5645


> The data on it may prove useful in preventing future attacks.

The data on it may also not prove useful in preventing future attacks. Especially as it was a person's business (not primary) phone.

The other issue here is that this will only grant safety insofar as terrorists don't do math or encryption themselves - and that could be through an app that they write, or a foreign-made Android phone where they lock the bootloader with their own code signature.


I don't think the FBI can or will hand over the phone, that would compromise their entire investigation.

Besides, once the software exists and is signed by Apple, it's very hard to prove that all copies have been deleted.


It's difficult to imagine this recovery operation happening if Apple never gets access to the phone. Perhaps they could have representatives from both Apple & FBI present.

Proving all copies are gone isn't possible; you can't prove non-existence. Ensuring extreme unlikelihood seems relatively straightforward, though. Make fewer copies, trusted personnel only, isolate the development to a particular location with no network access, etc. Naturally there'll be a risk that a copy survives, just as there's a risk that a contingent of rogue Apple employees are already working on it for the FBI.


The phone belongs to San Bernardino County, and is administered by them. This entire effort is PR intended to set precedent or encourage new legislation.


On Tuesday at the RSA Conference, Adi Shamir said that this is not the right case for this issue and that Apple should help the FBI in this particular case, and I completely agree. Apple has screwed up as well and should admit it. With the proper security architecture, Apple would not even be able to comply with the FBI's request. The iPhone should require user authentication before updating the software/firmware. If it required authentication first, then the FBI would need to enter the PIN before installing any software, preventing them from installing this vulnerable version of the software to crack the PIN. Then Apple would not be capable of helping the FBI in this way, short of hacking the iPhone. I hope that Apple fixes this problem by requiring user authentication before software updates so they can put this issue to rest. And it looks like they probably are ( http://www.nytimes.com/2016/02/25/technology/apple-is-said-t... ). They can then avoid using this technique in the future and avoid putting their signing key at greater risk.

From a security-risk standpoint, Apple's arguments are fairly weak. They argue that this binary could potentially be leaked and used on other phones, but that would not be possible if they implemented the solution I have already stated. Maybe an adversary could infect an OS with this code using a security vulnerability, but if they can do that, there are worse things they could already do. The best argument I have seen is that complying here will lead to Apple complying with government requests in bulk, signing many copies of this firmware on a much more frequent basis, an environment which would put their signing key at greater risk. Again, they would not need to do this if they fixed the authentication problem.

Apple keeps trying to construe this as a backdoor but it isn't really a backdoor. It is a security vulnerability in Apple's security architecture and it is fixable. If Apple keeps pushing the backdoor issue then Congress may legislate on backdoors or the courts may make a decision on them, and likely not in our favor in this case.

This is not the right battlefield for Crypto Wars 2.0. This is a high profile and emotionally driven case with a mass shooting in which much of the public will side with the FBI simply on this fact alone. The phone is the property of the government, the gunman is dead and all of these facts work in the FBI's favor. We should be fighting this on our terms and not the FBI's.

Rep. Sensenbrenner was correct when he told Apple's lawyer during the hearing that "you are not going to like what will come out of Congress on this". If any legislation is passed we will most likely be in a worse situation than we are in now. The best thing Congress could do on this issue is nothing at all. And Apple is pushing them to take action.


Edit: Even if Apple were to require authentication before updating the software, it doesn't matter, because of the third immutable law of security: if a bad guy has unrestricted physical access to your computer, it's not your computer anymore. The FBI can still load FBiOS directly into RAM since they have physical access. But this is a much more cumbersome procedure, since they would need to pull parts off the board and probably spin a new PCB, and they can't automate it the same way as if they were doing it purely through code.
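The proposed mitigation above can be sketched abstractly. This is a toy model with made-up names (`apply_update`, plain dicts standing in for device state and a firmware bundle); the real iOS update path is far more involved, but the change in the trust model is the point:

```python
def apply_update(device, image, entered_pin=None):
    """Toy update flow that demands user authentication.

    `device` and `image` are plain dicts standing in for real
    hardware state and a real signed firmware bundle.
    """
    # Current model: a valid vendor signature alone authorizes the update.
    if not image.get("signature_valid"):
        return "rejected: bad signature"
    # Proposed extra gate: the user's passcode must also be entered,
    # so even vendor-signed code can't be pushed onto a locked phone.
    if entered_pin != device["pin"]:
        return "rejected: user authentication required"
    device["firmware"] = image["version"]
    return "installed"

phone = {"pin": "1234", "firmware": "9.0"}
signed = {"signature_valid": True, "version": "9.1-custom"}

print(apply_update(phone, signed))          # locked phone, no PIN: rejected
print(apply_update(phone, signed, "1234"))  # PIN supplied: installed
```

Under this model, a court order to Apple alone is not enough: the passcode holder becomes a required party to any update, which is exactly the property the commenter wants (and, per the edit above, it still doesn't stop an attacker with unrestricted physical access).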


I feel like all of this talk about a "hard drive" that transports the tool around is a little specious—aren't Apple's firmware updates signed? And wouldn't the update itself be IMEI-locked?

The issue here isn't the reusability of the technology—the issue is the reusability of the legal precedent. And this is certainly a very scary precedent to set!


I think it's sort of about the reusability of the technology, because that's where the major barrier is. Forcing Apple to create that technology which undermines their own systems' security is at the center. If that technology already existed, the debate would only be about if the government should be allowed to use it, not whether it should exist.


Apple's updates are signed, but they are not locked to a particular IMEI.


Right, but the FBI has indicated that they would like this particular one to be, to add some amount of code that would render it non-functional when put on another device. (and certainly from a security perspective that would be the correct choice to make)
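As an aside, here is a minimal sketch of what per-device locking could look like. Everything here is a hypothetical stand-in (Apple's actual personalization uses asymmetric signatures over the device's ECID, not a shared HMAC key), but the point carries over: the signature binds the image to one chip ID, and removing that binding means getting the image re-signed.

```python
import hashlib
import hmac

# Purely illustrative: a symmetric-key stand-in for Apple's signing process.
SIGNING_KEY = b"stand-in-for-apple-private-key"

def sign_update(firmware: bytes, device_ecid: str) -> bytes:
    # Fold the target device's unique chip ID (ECID) into the signed
    # payload, so the signature only verifies on that one device.
    return hmac.new(SIGNING_KEY, firmware + device_ecid.encode(),
                    hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes, my_ecid: str) -> bool:
    # Each device recomputes the MAC over the image plus *its own* ECID.
    expected = hmac.new(SIGNING_KEY, firmware + my_ecid.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

fw = b"compromised OS image"
sig = sign_update(fw, device_ecid="TARGET-ECID")

print(device_accepts(fw, sig, "TARGET-ECID"))  # the targeted phone: True
print(device_accepts(fw, sig, "OTHER-ECID"))   # any other phone: False
```

Under this scheme the check isn't a removable `if` in the image; the signature itself only validates on one device, which is presumably what the FBI means by "only work on Farook's phone".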


"Comey told a congressional panel on Tuesday that a final court ruling forcing Apple Inc (AAPL.O) to give the FBI data from an iPhone used by one of the San Bernardino shooters would be “potentially precedential” in other cases where the agency might request similar cooperation from technology companies."

http://www.reuters.com/article/us-apple-encryption-congress-...

"Manhattan District Attorney Cyrus Vance testified in support of the FBI on Tuesday, arguing that default device encryption "severely harms" criminal prosecutions at the state level, including in cases in his district involving at least 175 iPhones."


Exactly—this completely agrees with my top-level comment. It's not the technology we should be worrying about (although of course Apple should be closing the vulnerability that allows the FBI to compel them to help decrypt this phone); the more important—and scary—thing we should be worrying about here is the legal precedent that this case sets.


One important fact (1) I haven't heard in the entire hearing:

(2)The FBI wants 3 things: disable 'erase after 10', disable delay between attempts, create possibility to enter PINs programmatically so they can be brute forced.

(1) If this were created, criminals/terrorists could easily defeat it by enabling alphanumeric passwords (instead of 6 digits) and creating strong passwords that cannot be brute-forced (which isn't even a usability problem, since you can easily unlock your iPhone with your fingerprint most of the time).

So the end result of (2): security is lowered for the 99% of people who are innocent, while criminals/terrorists (1%) can still easily avoid being caught by this measure.
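To put numbers on that: even with the erase and delay protections removed, the hardware key derivation still costs time per guess. Assuming roughly 80 ms per attempt (an assumed figure, in the range Apple has described for its key-derivation hardware), the difference between a 6-digit PIN and a 10-character alphanumeric password is the difference between under a day and geological time:

```python
# Rough worst-case brute-force times, assuming ~80 ms per attempt enforced
# by the phone's key-derivation hardware (the per-guess delay is an
# assumed figure).
SECONDS_PER_GUESS = 0.08

def worst_case_days(keyspace: int) -> float:
    # Time to try every candidate in the keyspace, in days.
    return keyspace * SECONDS_PER_GUESS / 86400

six_digit_pin = 10 ** 6      # 6-digit numeric PIN
alnum_password = 62 ** 10    # 10 characters over [a-zA-Z0-9]

print(f"6-digit PIN:      {worst_case_days(six_digit_pin):.2f} days")
print(f"10-char alphanum: {worst_case_days(alnum_password):.2e} days")
```

The PIN falls in under a day; the alphanumeric password takes on the order of 10^11 days, so the tool the FBI is asking for is only useful against the weakest passcodes.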


For anyone who watched the entire hearing today: personally, I was astounded by the misunderstandings of both the director and everyone asking the questions. It seemed that only one or two of them understood what Apple was trying to say in a technical sense.


Exactly, I was surprised by this as well. Even at the end of the hearing the same 'stupid' questions were asked over and over, which made it clear they still didn't understand the basics.


He seems to be suggesting that the FBI (and maybe law enforcement in general) is in charge of deciding what our civil liberties are. They've simply decided there's no right to create a lock that can't be opened, when there's clearly no law or part of the constitution that says so. But since they think they're in charge of civil liberties, they've decided by fiat.


I appreciate that Professor Landau emphasized the "arms race" between companies securing their systems, and adversaries breaking them. Companies push software with bugs, adversaries exploit the bugs (and hopefully responsibly disclose them), then the company patches the bug and pushes a new update.

Any iPhone <5 running iOS <8 is comically exploitable. This should drive home the point that as time progresses, older vulnerabilities become easier to exploit, so that leaving them unpatched becomes irresponsible.

If the FBI asks Apple to create new software to grant the FBI the ability to unlock the phone, they are effectively asking Apple to exploit a vulnerability in their software. By definition, Apple will know that vulnerability exists. In the "arms race," when Apple identifies a vulnerability, they fix it. In this case, when Apple identifies the vulnerability, will the FBI allow them to fix it? Or would the FBI prefer that Apple have a responsibility to "maintain" the vulnerability and ensure it remains exploitable?


I'd love to see a transcript [2], or list of participants [4] and find out which representatives present lean towards Apple or towards the FBI.

I skipped around the video. I mostly saw support for Apple, with a few exceptions, one being Mr. Sensenbrenner.

Mr. Sensenbrenner, a House republican of Wisconsin, asked Apple what legislation they would support, and Mr. Sewell, Apple's general counsel, said they support debate on the subject [1]

The congressman was clearly bullying Mr. Sewell here and using his position as congressman to make it appear as though Apple is not being agreeable. Well, encouraging debate before writing one-sided legislation sounds like great teamwork to me.

The congressman should have been reminded that not supporting new legislation is a valid position, and that every problem we face need not be solved with new laws.

Perhaps Mr. Sewell is trying to be respectful to the congressman. But I think he should not shy from responding in kind. Treat others as they treat you. No one will judge you for it. I think Apple missed an opportunity here to say they do not feel new legislation is needed, since guaranteeing back doors into devices would be an abridgement of consumers' right to privacy, and Apple's right to create safe products for consumers as it sees fit.

Overall though this seemed like a productive session.

Mr. Gowdy, a House republican of South Carolina, also bullies Sewell in the same manner [3], literally asking Apple to lobby for legislation.

In other words he is saying, get some lobbyists, do my work for me, pay for my next election campaign and solve your own problem. I don't know how you could be more blatantly obvious about being a corrupt, useless politician.

[1] https://youtu.be/g1GgnbN9oNw?t=3h59m30s

EDIT transcript here, though no speaker names are given and quotes are cut short

[2] http://www.c-span.org/video/?405442-1/hearing-encryption-fed...

[3] https://youtu.be/g1GgnbN9oNw?t=4h36m35s

EDIT list of participants:

[4] http://pastebin.com/raw/rHqYpv3g


Read that as FBI's Comedy



There is another choice, which is to click the 'web' link under the title above, which does a Google search for the title. The article will nearly always appear at the top of the search results, and through that route you can get the entire article from many paywalled news sites, including the WSJ.


WSJ closed this[1]; web link here finds the article in google which still hits the paywall.

[1]http://digiday.com/publishers/wall-street-journal-paywall-go...


This is fixed in Google Chrome by opening the console in Web Inspector, enabling the Network conditions tab from the three-dot menu, and then in that new tab selecting Googlebot as the custom user agent. Note that you will still have to click on a Google search link.
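The same user-agent trick works outside the browser too; a small Python sketch using only the standard library (the URL is a placeholder, and sites may additionally verify the requester's IP against Google's published crawler ranges):

```python
import urllib.request

# The crawler user-agent string Google documents for Googlebot.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def googlebot_request(url: str) -> urllib.request.Request:
    # Build a request that presents itself as Google's crawler
    # instead of a regular browser.
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = googlebot_request("https://example.com/some-article")  # placeholder URL
print(req.get_header("User-agent"))  # prints the Googlebot UA string
```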


I wouldn't exactly call that a "fix", but still, thanks for the tip.


Once you've figured it out and set it up, it's easy to just toggle it the next time. It's definitely more work, but I prefer this to installing a random extension that can read all my data on all websites.


Interesting that this practice doesn't seem to affect the WSJ's rank in Google search results. Cloaking content based on user-agent is definitely a violation of Google's webmaster guidelines (https://support.google.com/webmasters/answer/66355?hl=en)


I remember they had some rule for google news that allowed paywalls if a user can read at least N articles per month from google search results. This is probably the same case.

https://support.google.com/news/publisher/answer/40543?hl=en...


works for me in an incognito window


This doesn't work anymore, give it a try and see.


Just worked for me


Also there is a choice not to visit sites that do not want you to read them.


Rather, they would prefer you pay for a subscription to read their articles than not read them at all. Not everyone can afford $200 a year for a WSJ subscription. They are still using a newspaper business model based on subscribers rather than an Internet website model of free content with advertising.

Some sites refuse to serve content if an adblocker is installed.

So there are a lot of subscription models, with some websites trying to force people to pay for articles. It is not that they don't want you to read; they want you to pay to read.


Thank you


Anyone have a non-paywalled version? Going through Google isn't bypassing the paywall.


You now have to open an Incognito window and then paste into Google search to bypass the WSJ paywall - I suspect they look at some login cookie to detect humans.


Already tried that, it doesn't work. I think WSJ is A/B testing, closing the loophole for some people and not others.


Strange, I used the "Wait! Google Sent Me" plugin and it worked without incog.


It's because you also need to turn off AdBlock or whitelist WSJ when using Chrome Incognito.


Interesting, the link I have below is getting through the paywall for me.


I'm as strong a supporter of a citizen's right to encryption as anyone, but I actually think that Mr. Comey's testimony was accurate and forthright, and his framing of the issue of encryption as it relates to the capabilities of law enforcement was appropriate and well-reasoned.



