Congressman: Court Order To Decrypt iPhone Has Far-Reaching Implications (dailydot.com) 400
Patrick O'Neill writes: Hours after Apple was ordered to help the FBI access the San Bernardino shooters' iPhone, Rep. Ted Lieu (D-Calif.), a Stanford University computer-science graduate, wondered where the use of the All Writs Act—on which the magistrate judge based her ruling—might lead. "Can courts compel Facebook to provide analytics of who might be a criminal?" Lieu said in an email to the Daily Dot. "Or Google to give a list of names of people who searched for the term ISIS? At what point does this stop?"
Apple, so far, has vowed in no uncertain terms to fight the order that it decrypt the phone of San Bernardino shooter Syed Rizwan Farook.
Don't see the problem (Score:2, Insightful)
If you go through the legal process and get a court order that is the system working as intended. It's when they want backdoors and unregulated access to your information that it's a problem.
Re:Don't see the problem (Score:5, Insightful)
Re: (Score:2, Interesting)
The correct action would be to cooperate fully right now, and then patch the back door. That way the current case proceeds, and future similar situations are not feasible because the backdoor no longer exists.
Otherwise they'll have to open up a very public case "forcing" Apple to put in a back door, where Apple would have a much firmer leg to stand on than it has by refusing to cooperate with this investigation.
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
Re: (Score:2)
>> Government to [Company]: "Also, develop it on your dime"
This is why businesses hate regulations in general.
Re: (Score:3)
Re: (Score:3, Informative)
Technically they have. They've asked for a system to remove the passcode limitations to allow brute force attacks.
This screams backdoor...
Re: (Score:2)
This is the point where someone usually inserts the slippery-slope argument, but it's one of those rare cases where the slope really is slippery. Say they release a patch; the patch finds its way into the wild, the patch is reverse engineered, and new patches can be built from it against future versions of iOS.
Re: (Score:2, Insightful)
There's no need to talk about slippery slopes. This is already it. This is the government asking for a fully back doored version of iOS to exist. Once that happens, it's only a matter of time before some set of bad guys (be that overzealous gvmnt employees without a court order, or criminals wanting to break into any iPhone they pick up off the street) gets hold of the back doored version, and exploits it.
Re: (Score:3, Informative)
No such thing as a one time back door (Score:3)
But for this specific case only.
There is no such thing as a single case back door. Either the software is secure for all or it isn't secure at all. There is no middle ground here.
I don't understand why this can't be in a temporary fashion, specific to this particular iphone, and only for this specific case.
Because once you develop the software you can't un-develop it, and it WILL be used again. The government is ordering Apple to develop what amounts to a backdoor. Apple is (very sensibly) fighting against this because it is a terrible idea with far-reaching consequences. Once they develop the software then you can be as certain as the sun rising tomorrow that the
All locks can be picked eventually (Score:3)
So by your very logic the software is already not secure - if it were, Apple wouldn't be able to retrofit a backdoor.
If the software does not currently exist to backdoor the device then it IS secure - for now. The fact that it might be possible to change that is a separate issue.
It's inherently insecure already and Apple are merely being asked to hack it.
This presumes that Apple can hack the device. It has not been conclusively established that this is possible. But let's presume that it is possible for argument's sake since if Apple can't do it then it isn't worthy of discussion. If Apple is able to hack into the device to retrieve the data then in theory the device is insecure in the same se
Re: (Score:2)
No, he has asked them to create a trojan firmware that can be loaded onto the phone as if it were a normal one. A trojan which could easily be reused on any phone, once the FBI get their grubby hands on it, and then there will be jack all that anyone can do about it.
They are asking Apple to do something which will effectively give them this same access in the future without a warrant, because the process will be repeatable. Even if Apple loads the rom themselves, the FBI will make a copy. They will expend g
Re: (Score:2)
The one thing that might stop them is if Apple uses their firmware signing stuff, and then revoke the signing afterward. But that would only be a temporary stop until someone at FBI / NSA figures out how to man-in-the-middle the signing mechanism within their lab.
Re: (Score:3)
It is not clear to me that this is going to be done with trojan firmware merely uploaded. It's conceivable that Apple might have to fab a custom IC for this, scrape epoxy off of the existing IC and board to replace the IC, etc. If the FBI's problem can be "fixed" with mere software, then Apple's product wasn't any good anyway, and it was already waiting for any and all attackers to come "fix" it. Somehow I suspect that's not the case.
One of the reasons they might be complaining about the order, is that it's
Re:Don't see the problem (Score:4, Insightful)
The correct action would be to cooperate fully right now, and then patch the back door. That way the current case proceeds, and future similar situations are not feasible because the backdoor no longer exists.
Otherwise they'll have to open up a very public case "forcing" Apple to put in a back door, where Apple would have a much firmer leg to stand on than it has by refusing to cooperate with this investigation.
The problem is the FBI then have this version of iOS with stripped out security that they can then theoretically install on any iphone they want to grab all the data. They say it will only be used this one time for this one thing but if you believe that there's a lovely bridge I have for sale.
Re: (Score:3)
This is an iPhone 5c. The secure enclave wasn't added until the iPhone 5s. Even if it existed, as I understand it, it is possible to read both the secure enclave's NAND flash and its hardware ID using electron microscopy.
Note, however, that obtaining that data would likely involve completely destroying the secure enclave (uncapping the chip and removing layer
Re: (Score:2)
The court has ordered Apple to create a new OS for the iphone that can be placed on this phone (and any other iphone) to bypass the security features. Would you be OK with the court ordering a local mechanic to repair a totaled car so that the police could recreate what happened in a car crash?
This is not just a matter of ordering Apple to turn over something they have in their possession. It is an order requiring Apple to devote howe
Re: (Score:3)
The last bastion of liberty is the defiant citizen saying "no" to government overreach. However, when sufficient numbers of people become compliant to the whims of the ruling elite, tyranny is at hand.
Regardless of the circumstances of the case, I admire Tim Cook's stance. My only hope is that he remains standing firm on those convictions.
Re: (Score:2)
You realize this is nothing new, right? Telephone companies for ages have been required to install mechanisms that allow law enforcement the ability to tap your line. The same has existed for many ISPs & online services as well.
The whole Microsoft battling the US government over data stored in Ireland is just one notable example of this.
But "reasonable" is a nebulous term. (Score:3)
What the court (and the idiot sitting in the big chair) see as "reasonable" may not be reasonable to Apple, or anyone else.
Additionally, such a nebulous term is horribly susceptible to "moving goalposts". Apple decides to cave, gets so far, and doesn't think it reasonable to go any further. But now that they've caved, the government and the idiot in the big chair come back with "Well, you've gotten THIS far, you may as well see it through!"
The appropriate answer to this is "There is no technical way to do
Backdoors for everyone (Score:2, Insightful)
If you go through the legal process and get a court order that is the system working as intended.
Not when the court doesn't really understand the full ramifications of what they are ordering. You can have due process and end up with a terrible ruling if the court is clueless. Hopefully it will be sorted out in due course. Apple is clearly correct in their position as far as I can tell.
It's when they want backdoors and unregulated access to your information that it's a problem.
In this case the court is apparently ordering Apple to CREATE a backdoor since one supposedly does not currently exist. This is a terrible idea for reasons too numerous for me to mention here. You cannot create a bac
Re: (Score:2)
I think the argument that Apple is making is that they are being ordered to build something that doesn't exist and/or offer a service that is outside their normal business practices. It's like forcing conscientious objectors to kill people; or forcing citizens to give room and board to soldiers; or forcing a typewriter manufacturer to make firearms; or forcing a chemical manufacturer to make "death gel".
I could certainly see this going to the supreme court.
Re: (Score:2)
The court order is to create a version of iOS that has a back door, which could then be loaded onto the device. Do you really think the US Government would use this one time, and then destroy all copies of it, or do you think that a copy might get "accidentally" transmitted to the CIA / NSA and used whenever the fuck they want?
Re: (Score:3)
It's a phone. State actors already have that capability. (although, in this case, the US apparently doesn't, or they're trying the easy way first). It all hinges on that 4-digit PIN. If this is some sort of smartcard, then Apple can't help you either. If this is software based, you can help yourself: just copy the contents of the phone to another medium, have an implementation of the crypto, and run your brute-force from there. With 10000 combinations, it should take about a minute.
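The off-device brute force proposed here could be sketched as follows, under the (big) assumption that the storage contents and key-derivation scheme can be copied off the phone; the function names and KDF parameters below are invented for illustration and are not Apple's actual scheme (which, as the replies note, entangles the key with hardware):

```python
import hashlib
import os

# Sketch of an off-device brute force against a 4-digit PIN, with no
# per-attempt delay and no wipe counter. derive_key() is a stand-in KDF,
# not Apple's real derivation.

def derive_key(pin: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 10_000)

salt = os.urandom(16)
target_key = derive_key("7294", salt)  # the "unknown" PIN we pretend to crack

# All 10,000 four-digit PINs:
for candidate in (f"{n:04d}" for n in range(10_000)):
    if derive_key(candidate, salt) == target_key:
        print("PIN found:", candidate)  # prints "PIN found: 7294"
        break
```

With only 10,000 candidates and a cheap KDF, this finishes in seconds on a laptop; the entire security of a short PIN rests on the attempt limits and delays the device enforces.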
Re:Don't see the problem (Score:5, Funny)
Gee, and the FBI didn't think of this? You should tell them, I'm sure they'd be pleased to get this information.
Re: (Score:2)
The key that needs decrypting can't be copied from the hardware, at least without several (hundred) million in specialized equipment.
Re: (Score:3, Informative)
You can't do the latter. The encryption key is split into fragments, baked into silicon in multiple chips, and the exponential timeout and wipe features are enforced by hardware.
iOS may be a walled garden, but damn, the walls go down to the bedrock.
See Apple's own iOS 9 security whitepaper [apple.com] or this fellow's succinct summary [techdirt.com] buried deep in the comments on Techdirt.
Re: (Score:3)
That analysis would be correct if this were an iPhone 5s or later. It isn't. It's an iPhone 5c, which doesn't have a secure enclave, making the entire analysis incorrect.
Re: (Score:3, Informative)
It's hardware based encryption, where half of the key comes out of a value burned into the CPU during manufacturing (and not recorded anywhere) combined with a value burned into the "Secure Enclave" during manufacturing (and not recorded anywhere). You take the storage image off the device, you lose half the key and you're fucked. You attempt to crack the PIN on device, you get 10 tries before the secure enclave overwrites the key with a new one, and you're fucked. If the auto-wipe was disabled by the us
Re: (Score:3)
It's a precedent they're trying to set. Of course they have the tools to do this themselves with a brute force attack, that's not the point. If they can force companies to provide the backdoor then they no longer need to use the appropriate channels to get the information. They'll have the ease of use that anyone at the FBI who wants access to the data can easily get that access, without drawing any unwanted scrutiny. They're trying to backdoor the process as much as the hardware/software.
Re: (Score:3)
That's true on the iPhone 5s and later. This is an iPhone 5c, which lacks a secure enclave. AFAIK, the key is stored on the external flash part, and I think the passcode is entangled with a hardware ID, but that can be obtained by the OS (by definition) without knowing the passcode. So all that is necessary to crack one of these is to remove the code that wipes the key upon a certain number of failed attempts, along with the code that progressively increases the time between failed attempts.
And really,
Re: (Score:2)
Or a vulnerability.
Same effect, more precise terminology.
Re: (Score:3)
However, the law (as well as the judge) compels Apple to assist with law enforcement investigations.
While Apple has designed things so far in a way that makes it rather hard for the government to unlock a phone after it's been seized, one has to wonder at what point they attempt to compel Apple to deploy a modified version of the OS (without all of the safeguards) to subjects who are under investigation... and with a warrant in hand.
Re: (Score:2)
What if a non-US government orders it on a US citizen's phone ?
Then it's a state of war between good old US and that country.
Only the mighty US can impose its will upon other countries and their citizens. Didn't you get the memo?
Re: (Score:3)
Being pedantic, but there is no 'sufficiently complex' about a one-time pad. Either the pad is actually random or it is not, and either it is used properly (i.e., never reused) or it is not.
If it is actually random and never reused, the ciphertext is not merely virtually unbreakable -- it is absolutely unbreakable. The ciphertext can be decoded with equal probability into any message of the same length.
seed from 8 bytes to 24 bytes to be reasonably secure.
This is not an OTP; it is something else. Your seed is the input to a pseudorandom number generator which creates the encryption key. Si
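The information-theoretic property of a true one-time pad is easy to demonstrate; in this minimal sketch (function names are my own), a given ciphertext can be "decrypted" to any equal-length plaintext under some pad, so the ciphertext alone proves nothing:

```python
import os

# Minimal one-time pad: XOR the message with a truly random, never-reused
# pad at least as long as the message.

def otp(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must cover the whole message"
    return bytes(a ^ b for a, b in zip(data, pad))

message = b"attack at dawn"
pad = os.urandom(len(message))     # genuinely random, used exactly once
ciphertext = otp(message, pad)

print(otp(ciphertext, pad))        # prints b'attack at dawn'

# A crafted "pad" decrypts the same ciphertext to a different message:
fake_pad = otp(ciphertext, b"defend at dusk")
print(otp(ciphertext, fake_pad))   # prints b'defend at dusk'
```

This is exactly why a PRNG seeded with a short key is not an OTP: the keystream is then a deterministic function of a small seed, and only the seed's entropy protects the message.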
Re:Don't see the problem (Score:5, Informative)
Knowing that a former Secretary of State operated their own email server in a manner that a fairly knowledgeable system administrator would recognize as vulnerable to the known capabilities of state-sponsored attempts to compromise it and extract the contents, it's almost disingenuous for the government to claim security is both essential and working at the highest levels when they knew, or should have known, that a Cabinet officer was subverting that security. Reasonable people and those skilled in the art cannot avoid coming to that conclusion based on the publicly known evidence.
Our government isn't very good at protecting our rights, nor at its own operations. Good enough reason to limit our government to essential activities only.
And I pray Apple actually tries to break their own encryption and fails. Security shouldn't be reserved to the few. In a nominally free society we will not have perfect security, but we will have, hopefully, more freedom than not.
Don't overlook the Office of Personnel Management data breach, in which the OPM had such bad security that it effectively released to hackers the entire collection of background-check information for all government personnel and contractors who needed access to gov't facilities and filled out the forms from about 2000 to 2015. It wasn't just the form data (name, SSN, lists of associates to use for references, foreign travel history) - it was all the follow-up data, too, including responses from references and clearance interview details. It even included images of fingerprints if you went through the process after the PIV-II cards came into use. All of that information is now basically free on the internet. Forever. It's a phisher's (and foreign extortionist's) wet dream: a complete set of collated, validated data, including associations and relationships, as well as potential dirt, on everyone who has worked for the US gov't (including many, many contractors) for the past 15 years.
Re: (Score:3)
That aside, you're still missing the underlying distinction. While I agree
They aren't ordering Apple to decrypt it (Score:5, Informative)
Re:They aren't ordering Apple to decrypt it (Score:5, Insightful)
This is the only good explanation I've seen of what the order is about:
https://www.techdirt.com/artic... [techdirt.com]
As long as Apple can install a signed update on the device without decrypting it first, this will be possible. They need to remedy that quickly.
Yes, Apple has all along insisted that they can't break the encryption on the phone. But the FBI apparently knows they can and wants them to do it. That means there is already effectively a back door and they just need Apple to sign the software update. So Apple has been lying.
Re: (Score:3)
I think Apple *could* crack one of their own devices given enough time and motivation. No system is 100% secure. Apple has all the design details and source code, plenty of examples to test against, and the people or means to hire the people who could come up w
Re: (Score:3)
It'll have to go to the Supreme Court for that verdict, though.
I have no faith in either the (R) (Trump/Cruz/Rubio/Bush ..) or Obama to appoint someone that would rule in favor of privacy/security over government intrusion.
Quite frankly, people who say there is a difference between the two parties, need to wake up and smell the coffee.
Re:They aren't ordering Apple to decrypt it (Score:5, Informative)
It's worth noting that this wouldn't work on any device that has a Secure Enclave and TouchID. The Secure Enclave can't be updated or tampered with without it erasing its keys, leaving the phone permanently encrypted. If the SE isn't modified, it imposes delays on responding to passcode requests so that after the 9th request, there's a delay of 1hr before you can try again.
This article goes into more detail: http://blog.trailofbits.com/20... [trailofbits.com]
The phone in question is a 5c, so yes, Apple could theoretically do what is being asked of them. But the phones after that--no.
Re: (Score:3)
For Apple to do this would open up a whole new can of worms. Similar orders would rise up f
Re: (Score:3)
The owner of the phone is dead. Reading TFA does, indeed, sometimes reveal relevant information.
Re:They aren't ordering Apple to decrypt it (Score:5, Funny)
Then they should try waterboarding the body.
Re: (Score:3)
Re: (Score:2)
Don't they have to force the installation? It's not like they can open the phone and accept a download...
Re: (Score:2)
Last I checked, the State isn't Apple and Brute Force isn't a Back Door.
Re: (Score:2)
They should have built the wipe-after-10-failed-attempts functionality into the hardware. Most secure storage subsystems have some kind of dedicated processor and ROM (i.e. can't be updated, it's burned into the silicon) that enforces access control to the secure memory. Often the functionality to wipe when a certain number of incorrect attempts is made is built in to that sub system, and thus can't be overridden or bypassed by a software update, or even turned off.
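The wipe-after-10-failed-attempts behavior described here could be modeled roughly like this; the class and its constants are invented for illustration and stand in for dedicated silicon, not any real firmware:

```python
# Rough model of a wipe-after-N-failures policy enforced by a dedicated
# secure element: the failure counter and the erase step live inside the
# component, so host software can't skip or reset them.

class SecureElement:
    MAX_ATTEMPTS = 10

    def __init__(self, pin: str, key: bytes):
        self._pin = pin
        self._key = key       # erased forever after too many failures
        self._failures = 0

    def try_unlock(self, guess: str):
        if self._key is None:
            raise RuntimeError("key erased: device permanently locked")
        if guess == self._pin:
            self._failures = 0
            return self._key
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._key = None  # irreversible erase inside the element
        return None

se = SecureElement("4821", key=b"\x01" * 32)
for _ in range(10):
    se.try_unlock("0000")     # ten wrong guesses burn the key

try:
    se.try_unlock("4821")     # even the right PIN no longer helps
except RuntimeError as err:
    print(err)                # prints "key erased: device permanently locked"
```

The point of putting this in un-updatable hardware is that no signed OS image, however official, can replace the counter logic with a no-op.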
Re: (Score:2)
One imagines that the iPhone 7, or more likely the iPhone 8 (as one suspects the iPhone 7 design is too far advanced to add this), will come with such a feature.
Quite what the FBI are thinking, I don't know.
Re: (Score:3)
It is built in, now. The iPhone 5C is the last model without the "Secure Enclave" which stores the key, arbitrates all access to it, and wipes it after 10 attempts unless told not to.
Some of your questions may already have answers. (Score:4, Informative)
>> "Can courts compel Facebook to provide analytics of who might be a criminal?...Or Google to give a list of names of people who searched for the term ISIS?
Facebook already publishes a guide for law enforcement: https://www.facebook.com/safet... [facebook.com]
Google does too: https://www.google.com/transpa... [google.com]
All Writs Act of 1789 (Score:5, Informative)
'The All Writs Act [wikipedia.org] is a United States federal statute, codified at 28 U.S.C. 1651, which authorizes the United States federal courts to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law."'
"On October 31, 2014, the act was used by the U.S. Attorney's Office in New York to compel an unnamed smartphone manufacturer to bypass the lock screen of a smartphone allegedly involved in a credit card fraud."
Looks like there is a precedent. Mind you Apple has lots of money for lawyers to make sure this doesn't happen.
Thanks Apple (Score:5, Insightful)
I'm not an iPhone user, but I appreciate you standing up for people's privacy. I have a better chance of winning the lottery than dying at the hands of a terrorist. Why would I want to lose my privacy over those odds?
Re: (Score:3)
I have a better chance of winning the lottery than dying at the hands of a terrorist. Why would I want to lose my privacy over those odds?
One of those odds is an individual risk. The other is a systemic risk that can also end up in the death of many of your family, friends, coworkers and bystanders. So relating the two odds sounds statistically intelligent, but is really a form of systemic risk blindness.
Re: (Score:3)
In that case, the risk of you or your family dying in a motor vehicle accident--the same kind of systemic risk that you're talking about--is orders of magnitude higher than the risk of dying in a terrorist attack. Similarly with dying from domestic gun violence. That's with laws in place to mitigate those risks. The reality is that terrorism is a really marginal cause of death, and to spend too much time and money worrying about it (certainly any time and money beyond the enormous amounts we're already spen
No uncertain terms? (Score:3, Insightful)
If I read Apple's "customer letter" correctly, they very well have the ability to create the software that is demanded of them, and decrypt that phone. Whether that software already exists or not is immaterial. If it is possible to create the software and use it on existing devices, then for all intents and purposes the backdoor is already there. Apple just doesn't want to open it, because they rightly fear losing the trust of their customers - trust which, following this interpretation, is unfounded.
Re:No uncertain terms? (Score:5, Insightful)
Basic question (Score:2)
I have a basic question. The phone in question is encrypted. How is Apple supposed to decrypt it without the key, which they state they do not have? Apple goes on to say that, for the future, they would have to create a back door program to do that, which is an entirely different discussion. So, unless the backdoor already exists (and Apple wants to keep it secret), how can this "request" be fulfilled?
Re: (Score:2)
Re: (Score:2)
Answer: the court order is for Apple to write such a backdoor program.
Yes, I know, but such a program won't work retroactively to decrypt the already encrypted data because Apple doesn't have the key. It would only work going forward. As such, the government's appeal to the courts is bogus. The back door still won't get them the unencrypted data.
Put differently, the government already has the data from the phone in question. To access the data in unencrypted form, there is nothing that Apple can do that they, themselves cannot do. If Apple acquiesced, then for future case
a lot of news (Score:2)
A lot of news and commentary on this one.
https://www.washingtonpost.com... [washingtonpost.com]
http://www.usatoday.com/story/... [usatoday.com]
http://www.nbcnews.com/storyli... [nbcnews.com]
Is what the FBI ask Apple feasible, or not ? (Score:2, Insightful)
There is something that does not add up in Apple's discourse at http://www.apple.com/customer-... [apple.com]
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor.
I read what the FBI asks as: install a piece of code that allows the phone's content to be examined. I see no middle ground between
1) running such piece of code (probably: after getting it signed by Apple) is possible without the owner's passcode; the iPhone is in fact already backdoored, with Apple holding the key, the FBI wants Apple to exploit the vulnerability/open the backdoor, and Apple does not want to b
Tim Cook's wrong you know (Score:2)
2. Seems to me the SDK would include a way to dump the contents of a phone and a simulator. So it's just a simple script to spin up a virtual image, enter a passcode and repeat until the phone unlocks. Are you telling me the Apple developer's kit doesn't have that functionality? I bet the NSA developer's kit does.
Because politicians believe in magic... (Score:5, Insightful)
There Is No Such Thing as Magic. If there is a known backdoor, it will be found and exploited. This can't be prevented, and honestly (take note, politicians)...
That means that the content on anyone's phone can be stolen. Not just anyone's phone, but the phone of every politician in the world.
Be careful what you wish for.
Re: (Score:3)
That means that the content on anyone's phone can be stolen. Not just anyone's phone, but the phone of every politician in the world.
If politicians want to put a back door on our phones, those politicians need to use those same phones. [csmonitor.com]
Which brings up another point. The US Federal Government can hack the German Chancellor's iPhone, [bbc.com] but not the iPhone of some nut in California?
Re: (Score:3)
Which brings up another point. The US Federal Government can hack the German Chancellor's iPhone, but not the iPhone of some nut in California?
Presumably the Chancellor's phone was hacked while it was running. Just like using full disk encryption doesn't protect you from getting a trojan while surfing, but if your laptop is turned off and gets stolen it will. It's only a strong lock on the door, it doesn't prevent nasty things from slipping in when it's open.
"Bad cases make good law" (sometimes) (Score:5, Informative)
Re: (Score:3)
They can't do it to my iPhone, or any iPhone Apple is currently selling. Mine is the version after the one in the case, and is not vulnerable to that attack. It's probably vulnerable to something else, being a complex piece of technology, but that's how life works.
Apple post makes it clear it is possible? (Score:2)
The Apple post seems to make it clear that this attack is doable. This implies that the San Bernardino phone is set to accept an update signed by Apple or something. This sounds like Apple is fully capable of creating a signed malicious operating system and loading it onto the phone, and is now refusing to do so. Nothing in the article implies that this is *impossible*, merely *really unwise*.
Of course it is unwise. But the fact that it is POSSIBLE belies a second security flaw- that installing a new OS is possible
Find myself in a moment of Apple Admiration (Score:3)
Well, one Apple-Hater hates a little less.
What this (probably) means to you! (Score:5, Insightful)
The order implies that Apple is capable of delivering a remote update, or that forcing an update locally is possible if you have physical access. It also implies that portions of the security models are enforced by software that is vulnerable to "update", such as the wipe-after-ten-tries (presumably that code will be replaced with a no-op) and the code entry delay in excess of that which is enforced by hardware.
Whether Apple is compelled to do this or not, the natural concern is "well how much of my data is shielded by math, how much by hardware, and how much by software"?
You can't bargain with math, you have a devil of a time working out hardware, and software alone is meaningless as a defense.
It appears that your best bet for security is either:
1)- A multi-character password that is easy to enter (and you'll remember it if it's your phone password, lol), but reasonably short. This is if you trust that the 80ms hardware delay can't be broken. This precludes the use of 4 and 6 digit PINs, as a 4 digit PIN will usually fall after a few minutes of this treatment, and a 6 digit PIN after around half a day. An 8 digit password consisting of a completely random set of just the visible lowercase letters (aka, no actual English words) at this rate is hundreds of years, and adding stuff that's harder to enter quickly (capitals, numbers, special characters) makes it much more secure, as does lengthening the password slightly. The challenge here is that passwords are usually chosen to be words, greatly reducing the entropy. And again, this assumes that the 80ms hardware delay is not defeatable.
2)- A fully secure crypto passphrase. This is the level of drama you would go through to password-protect a drive or something you take very seriously, and as such it would be a lot more than 8 characters. Your passphrase is long, contains several unpredictable parts, and makes use of more than just a statistically predictable subset of words and characters. You can set this on the iPhone, of course, but this kind of protection is not trivial to type in. In this case, you are trusting the math only, and assuming that the software will be compelled by the government and the hardware will be owned by a team skilled in this matter.
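The timings quoted above can be sanity-checked with some quick arithmetic, assuming the claimed 80ms-per-attempt hardware floor holds and that, on average, half the keyspace must be searched (all figures are rough):

```python
# Back-of-the-envelope brute-force time estimates at 80 ms per attempt.

ATTEMPT_SECONDS = 0.08

def avg_crack_seconds(alphabet_size: int, length: int) -> float:
    # Average case: half the keyspace is searched before the hit.
    return (alphabet_size ** length / 2) * ATTEMPT_SECONDS

print(avg_crack_seconds(10, 4) / 60)                 # 4-digit PIN: ~6.7 min
print(avg_crack_seconds(10, 6) / 3600)               # 6-digit PIN: ~11 hours
print(avg_crack_seconds(26, 8) / (3600 * 24 * 365))  # 8 random lowercase: ~265 years
```

These averages match the poster's "few minutes", "around half a day", and "hundreds of years" estimates, and they collapse entirely if the per-attempt delay can be bypassed in software.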
Going forward, Apple should probably move the "erase after 10 tries" into the secure portion of the phone, such that it has a protected portion that can't be overwritten without access to the PIN. This will also make them immune to this sort of order in the future.
Re: (Score:3)
or
3) A two-level system. A simple PIN that unlocks the phone enough to make calls, send texts, take pictures, etc., and provide access to the last X hours of call/text/pic/etc. history. Going back any further in time would require a more complex passphrase. Apps could be grouped into ones usable with PIN, and ones only usable with the passphrase. A panic button on the lock screen would flush all data to the more protected area, and require the more complex passphrase to unlock anything.
Everyone should have Apple's back on this! (Score:3)
Apple is in the right and Google and Microsoft should be backing Apple on this one.
And I say this as a long time hater of all things Apple. I own nothing from Apple. Never have. I fucking detest Apple. Look at any post of mine here on /. for almost 20 years and you will see that this is the very first post I have ever made here saying ANYTHING positive about Apple.
But they are so in the right on this one. Our devices are our own personal data archives on a level way beyond our houses or safes. It is an extension of our brains. While we may not have the technology just yet, imagine if a court ordered you to have your personal memories residing in your physical brain extracted. This is the same thing. People put info on their phones with a full expectation of privacy, regardless of whether that expectation is realistic. If you think it's unacceptable for the government to demand access to your most personal inner sanctum of being (your brain), then a brain extension like a personal phone or computer should also be inviolate.
It is completely, disgustingly wrong for anyone in government to expect such a thing, for any reason. Even for this one. It's extra ludicrous because this is an after-the-fact demand; no matter what happens, this will not bring one person back from the dead.
If you care about freedom in any way, then you should be completely against this horrid precedent ever being set. Your thoughts are your own, always.
It's really nice they're letting Apple fight it (Score:5, Insightful)
I presume that some congressman pushed the FBI to make this request out in the open just for the purpose of fighting it in court. All in all it's a good thing. Defending civil rights and all that.
But if the FBI ACTUALLY wanted this information they would have simply given Apple a gag order along with it. Or asked the NSA to do it for them. It's even their purpose, fighting terrorism, right? This falls SQUARELY under the domain of shit they've strong-armed and gagged companies into helping them with. The fact that we're even hearing about it has to be some sort of process manipulation.
Re: (Score:2)
If they stuck to specific warrants like they were supposed to, people would have less problem.
As for your idea, there are billions of people around the world, in places like Russia, China, and the Middle East, who do not share your joy at government officials being able to read their stuff, limited only by the officials' own self-decided standards of appropriateness.
We must forbid building the 1984-like tools of tyranny.
How Many Combinations? (Score:2)
Nine quadrillion, nine hundred trillion combinations...if I worked that web calculator correctly.
Perhaps they should try 1,2,3,4.
Re: (Score:2)
Nine quadrillion, nine hundred trillion combinations...if I worked that web calculator correctly.
Perhaps they should try 1,2,3,4.
As others have pointed out, there are only 10,000 combinations using a four-digit numeric PIN. Of course, with an iPhone, you only get to try ten of them incorrectly before it wipes the phone.
Re: (Score:2)
Nine quadrillion, nine hundred trillion combinations...if I worked that web calculator correctly.
Perhaps they should try 1,2,3,4.
As others have pointed out, there are only 10,000 combinations using a four-digit numeric PIN. Of course, with an iPhone, you only get to try ten of them incorrectly before it wipes the phone.
I have a 7 digit PIN on my iPhone. The default is 4, but you can have up to 50 AFAIK.
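The counts being tossed around in this subthread are easy to sanity-check (pure arithmetic, nothing device-specific):

```python
# Keyspace sizes for purely numeric passcodes.
four_digit = 10 ** 4    # 10,000 combinations for a 4-digit PIN
seven_digit = 10 ** 7   # 10,000,000 for the parent's 7-digit PIN
print(four_digit, seven_digit)
```

Neither is a large keyspace by cryptographic standards; the security comes entirely from the attempt limits, not the PIN itself.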
Re: (Score:2)
Re: (Score:2)
If Apple was telling the truth, the court order should not matter. Apple has already claimed that they cannot decrypt the phone.
That was my thought -- unless Apple isn't telling the truth.
Re:Unless Apple Lied (Score:4, Informative)
The court isn't asking for Apple to decrypt the phone, but for them to provide a special signed firmware that disables certain features meant to protect the encrypted data against brute-forcing.
If the crypto is up to snuff and a strong key was used then brute force will fail anyway, so I don't understand why this is such a big deal to Apple.
Re:Unless Apple Lied (Score:5, Insightful)
It's a big deal because complying with *any* request to modify software for use of LEA now will mean that they (and other manufacturers) will have to comply with *all* requests to modify software in the future. In the eyes of the law there is no difference in what technical capability is being implemented, only that some sort of technical capability can be implemented at the direction of LEA. Once open, this door cannot be closed.
Re:Unless Apple Lied (Score:5, Insightful)
And just to pound the point home, both are true:
Once the legal door has been opened (it becomes OK to require companies build back doors)...
Once the technical door has been opened (backdoor to firmware)...
Open either door and there's no closing them. What's truly ironic is there was a huge uproar a year or so ago about backdoors in network gear coming out of China ... and now the US is literally asking for the same thing to be created for them.
Re:Unless Apple Lied (Score:4, Insightful)
And, the US (and US made products) will irrevocably cease to be trustworthy.
Once the US does this, everyone in the world MUST assume these companies have built this in, that the US government can access it, and that Apple will be forced to roll over for any other government.
I'm not sure people understand just how much of a global clusterfuck of undermining rights and freedoms the US is doing here -- it's time to stop pretending to be champions of freedom and liberty when you have actively decided to do the opposite.
If Apple caves on this, every piss-pot dictator will insist on the same access.
What the FBI is demanding is full Big Brother status.
Re: (Score:3)
The technical door has been closed. This trick won't work on an iPhone 5S or later. There are doubtless other vulnerabilities, but the hardware security improvements on the more recent iPhones are impressive.
Now, my privacy technique is partly to be so boring no FBI agent will look through all my private stuff anyway, but that's not for everyone.
Re: (Score:3)
The law says "reasonable", and asking for perfect code that's actually useful is unreasonable. Moreover, since Apple doesn't have to do it on their own dime, and getting near-perfect code requires NASA levels of expense, even the FBI is likely to get discouraged.
In this case, it's easy to see whether the code works: if the FBI runs through the possible PINs, and can do so in reasonable time without the encryption key getting wiped, the software worked.
In general, the court system doesn't produce stup
Re: (Score:2)
What I heard was that the key is required to be a 4-6 digit PIN. So not only do we know that it is either 4, 5, or 6 characters long, but all characters are 0-9, which is ridiculously weak.
That is only slightly over 1 million possible passwords. It would probably take under a second to brute force this password offline.
Compare this to the generally accepted wisdom of 8 alphanumeric characters with punctuation and other special characters, which comes out to 95^8, or about 6 million billion passwords, or 6*10^9 times bett
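The arithmetic above checks out (rough numbers only; the "under a second" claim assumes an offline attack with no rate limiting):

```python
# Keyspace comparison from the comment above.
numeric = sum(10 ** n for n in range(4, 7))  # all 4-, 5-, and 6-digit PINs
full = 95 ** 8                               # 8 chars drawn from 95 printable ASCII
print(numeric)          # 1,110,000 -- "slightly over 1 million"
print(full)             # 6,634,204,312,890,625 -- the "6 million billion"
print(full // numeric)  # roughly 6 * 10**9 times larger
```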
Re: (Score:2, Informative)
You obviously didn't bother to read Apple's response to this. They are not being asked to decrypt the phone; they have been asked to (1) remove the firmware protection that wipes the device after 10 unsuccessful access attempts, if enabled, and (2) provide firmware that circumvents the built-in progressive delays so that brute-forcing the passcode becomes possible by hooking some device up to the phone.
Basically, they are asking Apple for custom firmware/OS that renders security features useless that would make at
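To make the two protections concrete, here is a toy model (the delay schedule is illustrative, not Apple's actual timing) of why online brute-forcing is infeasible with them in place:

```python
# Toy model of escalating-delay plus wipe-after-10 passcode protection.
# Delay schedule (in minutes) is illustrative only.
DELAY_MINUTES = {5: 1, 6: 5, 7: 15, 8: 15, 9: 60}

def next_wrong_guess(attempts_so_far: int) -> str:
    """Return what happens on the next incorrect passcode attempt."""
    n = attempts_so_far + 1
    if n >= 10:
        return "wipe"  # key material destroyed, data unrecoverable
    return f"wait {DELAY_MINUTES.get(n, 0)} min"

print(next_wrong_guess(3))   # wait 0 min
print(next_wrong_guess(8))   # wait 60 min
print(next_wrong_guess(9))   # wipe
```

Removing both checks is exactly what turns a 10,000-guess PIN from "effectively uncrackable" into "crackable in minutes."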
Re: (Score:2)
Re:Unless Apple Lied (Score:5, Interesting)
No, they claimed that they cannot decrypt their current phones (i.e. ones with TouchID + Secure Enclave running iOS 8 or 9). This model is an iPhone 5c (i.e. three generations old) that lacks the protections of their current phones and thus is susceptible to the sort of attack being proposed by the FBI. Apple has pointed out that complying with this order would jeopardize their business by making it seem as if they're lying about the security of their current phones, since the public won't understand the distinction. Your comment is evidence in support of their concern.
Re:Shielding murderers and the accomplices (Score:5, Insightful)
This isn't just about two terrorists.
Once Apple complies and builds the necessary tool, that tool can and will be used elsewhere.
And what the LEOs don't understand, or willfully ignore, is that if a backdoor exists, pretty much everybody can use it. If Apple creates this modified firmware for the US government, other governments around the world will demand access too. And sooner or later, this firmware will end up in the hands of non-government actors with criminal intent.
Re: (Score:2)
Re:Old southern saying (Score:4, Funny)
Can't put the shit back in the horse
Sounds like you haven't done much surfing on the net.
Re: (Score:2, Insightful)
Why would Apple want to shield the communications of mass murderers and their accomplices whom the FBI is trying to track down?
Mass murderers? Has someone been convicted?
What we have here are people being accused of murder. To my knowledge no one has been convicted. So let's not go throwing out the presumption of innocence just because you saw something on TV.
Re: (Score:2)
If the "Secure Enclave" is done correctly, it's implemented in hardware and can never, ever be changed. If it is currently not done like this, one imagines that the iPhone 7 or possibly iPhone 8 damn well will have it implemented in hardware and close the loophole the FBI is trying to use.
Of course, given Apple is going to "fight it," one presumes that it is not currently implemented in hardware, because otherwise they could just turn up to court with some of their engineers who would under oath explain
Re: (Score:3)
Re:Preaching to the choir (Score:4, Insightful)
You joke, but many people there are actually saying things like this. I see comments calling for Tim Cook to be charged with treason, saying Apple shouldn't be able to do business in the U.S., etc.
To be fair, you see these same exact comments on Slashdot; just for different reasons.