
Congressman: Court Order To Decrypt iPhone Has Far-Reaching Implications (dailydot.com) 400

Patrick O'Neill writes: Hours after Apple was ordered to help the FBI access the San Bernardino shooter's iPhone, Rep. Ted Lieu (D-Calif.), a Stanford University computer-science graduate, wondered where the use of the All Writs Act—on which the magistrate judge based her ruling—might lead. "Can courts compel Facebook to provide analytics of who might be a criminal?" Lieu said in an email to the Daily Dot. "Or Google to give a list of names of people who searched for the term ISIS? At what point does this stop?"
Apple, so far, has vowed in no uncertain terms to fight the order that it decrypt the phone of San Bernardino shooter Syed Rizwan Farook.


  • by Anonymous Coward

    If you go through the legal process and get a court order, that is the system working as intended. It's when they want backdoors and unregulated access to your information that it's a problem.

    • by Errol backfiring ( 1280012 ) on Wednesday February 17, 2016 @09:54AM (#51526859) Journal
      And if you read the article, you see that Apple states that this is a backdoor.
      • Re: (Score:2, Interesting)

        by Anonymous Coward

        the correct action would be to cooperate fully right now, and patch the back door. That way current case proceeds, and future similar situations are not feasible because the backdoor doesn't exist.

        they'll have to open up a very public case "forcing" Apple to put in a back door, where apple would have a lot firmer leg to stand on as opposed to not cooperating with this investigation.

        • Re: (Score:3, Insightful)

          by Holi ( 250190 )
          Cooperating fully would mean implementing a backdoor, not patching one that already exists. Kind of kills your line of reasoning.
          • Another analogy (Score:3, Insightful)

            by bangular ( 736791 )
            Government to Apple: "Develop the atom bomb. It will only be used just this once and then you can throw away the technology. Also, develop it on your dime."
            • >> Government to [Company]: "Also, develop it on your dime"

              This is why businesses hate regulations in general.

            • Also, develop it on your dime

              Bullshit. Apple is allowed to charge reasonable rates to comply with the Writ. And reasonable rates can be determined by something as simple as cost of the employees' time and any materials required, plus overhead. They'll even recoup their electricity for the computers. They'll probably make a nominal profit on the work.

        • by stealth_finger ( 1809752 ) on Wednesday February 17, 2016 @11:43AM (#51527559)

          the correct action would be to cooperate fully right now, and patch the back door. That way current case proceeds, and future similar situations are not feasible because the backdoor doesn't exist.

          they'll have to open up a very public case "forcing" Apple to put in a back door, where apple would have a lot firmer leg to stand on as opposed to not cooperating with this investigation.

          The problem is that the FBI then has this version of iOS with the security stripped out, which they can theoretically install on any iPhone they want to grab all the data. They say it will only be used this one time for this one thing, but if you believe that, there's a lovely bridge I have for sale.

    • You do not see the problem with the government co-opting your labor?

      The court has ordered Apple to create a new OS for the iphone that can be placed on this phone (and any other iphone) to bypass the security features. Would you be OK with the court ordering a local mechanic to repair a totaled car so that the police could recreate what happened in a car crash?

      This is not just a matter of ordering Apple to turn over something they have in their possession. It is an order requiring Apple to devote howe
      • The last bastion of liberty is the defiant citizen saying "no" to government overreach. However, when sufficient numbers of people become compliant to the whims of the ruling elite, tyranny is at hand.

        Regardless of the circumstances of the case, I admire Tim Cook's stance. My only hope is that he remains standing firm on those convictions.

    • by sjbe ( 173966 )

      If you go through the legal process and get a court order that is the system working as intended.

      Not when the court doesn't really understand the full ramifications of what they are ordering. You can have due process and end up with a terrible ruling if the court is clueless. Hopefully it will be sorted out in due course. Apple is clearly correct in their position as far as I can tell.

      It's when they want backdoors and unregulated access to your information that it's a problem.

      In this case the court is apparently ordering Apple to CREATE a backdoor since one supposedly does not currently exist. This is a terrible idea for reasons too numerous for me to mention here. You cannot create a bac

    • by wcrowe ( 94389 )

      I think the argument that Apple is making is that they are being ordered to build something that doesn't exist and/or offer a service that is outside their normal business practices. It's like forcing conscientious objectors to kill people; or forcing citizens to give room and board to soldiers; or forcing a typewriter manufacturer to make firearms; or forcing a chemical manufacturer to make "death gel".

      I could certainly see this going to the supreme court.

    • The court order is to create a version of iOS that has a back door, which could then be loaded onto the device. Do you really think the US Government would use this one time, and then destroy all copies of it, or do you think that a copy might get "accidentally" transmitted to the CIA / NSA and used whenever the fuck they want?

  • by hawkeyeMI ( 412577 ) <brock@nOSpaM.brocktice.com> on Wednesday February 17, 2016 @09:58AM (#51526885) Homepage
    This is the only good explanation I've seen of what the order is about: https://www.techdirt.com/artic... [techdirt.com] As long as Apple can install a signed update on the device without decrypting it first, this will be possible. They need to remedy that quickly.
    • by bigpat ( 158134 ) on Wednesday February 17, 2016 @10:19AM (#51526999)

      This is the only good explanation I've seen of what the order is about:

      https://www.techdirt.com/artic... [techdirt.com]

      As long as Apple can install a signed update on the device without decrypting it first, this will be possible. They need to remedy that quickly.

      Yes, Apple has all along insisted that they can't break the encryption on the phone. But the FBI apparently knows they can and wants them to do it. That means there is already effectively a back door and they just need Apple to sign the software update. So Apple has been lying.

      • by c ( 8461 )

        But the FBI apparently knows they can and wants them to do it.

        ... and the FBI has provided proof of this to the court? I'm fairly certain that if the FBI had proven that Apple actually had a known backdoor the news spin would be very, very different.

        I think Apple *could* crack one of their own devices given enough time and motivation. No system is 100% secure. Apple has all the design details and source code, plenty of examples to test against, and the people or means to hire the people who could come up w

        • It'll have to go to the Supreme Court for that verdict, though.

          I have no faith in either the (R) (Trump/Cruz/Rubio/Bush ..) or Obama to appoint someone that would rule in favor of privacy/security over government intrusion.

          Quite frankly, people who say there is a difference between the two parties need to wake up and smell the coffee.

      • It's worth noting that this wouldn't work on any device that has a Secure Enclave and TouchID. The Secure Enclave can't be updated or tampered with without it erasing its keys, leaving the phone permanently encrypted. If the SE isn't modified, it imposes delays on responding to passcode requests so that after the 9th request, there's a delay of 1hr before you can try again.

        This article goes into more detail: http://blog.trailofbits.com/20... [trailofbits.com]

        The phone in question is a 5c, so yes, Apple could theoretically do what is being asked of them. But the phones after that--no.
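
        For reference, the escalating delays described above can be written down as a simple schedule. This is a sketch based on public descriptions of iOS security behavior; the exact values are an assumption and may vary by iOS version:

        ```python
        # Approximate Secure Enclave passcode-retry delay schedule,
        # based on public descriptions of iOS security behavior.
        # Exact values are an assumption and may vary by version.

        def retry_delay_minutes(failed_attempts: int) -> int:
            """Delay imposed before the next passcode attempt is allowed."""
            if failed_attempts < 5:
                return 0          # first few attempts: no delay
            if failed_attempts == 5:
                return 1
            if failed_attempts == 6:
                return 5
            if failed_attempts in (7, 8):
                return 15
            return 60             # 9th attempt and beyond: one hour
        ```

        Because this logic lives in the Secure Enclave rather than in iOS itself, a software update of the kind being demanded here would not bypass it.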

    • My first thought is that the owner should be compelled to unlock their phone. They have reasonable evidence that these folks are guilty of something, so I don't think the 5th Amendment would fly here... Anything they find not directly correlated to the crimes they are charged with is inadmissible, after all. The owner's refusal to cooperate should just levy additional charges, like obstruction of justice or whatever.

      For Apple to do this would open up a whole new can of worms. Similar orders would rise up f
    • Don't they have to force the installation? It's not like they can open the phone and accept a download...

    • Exactly.


      Last I checked, the State isn't Apple and Brute Force isn't a Back Door.
    • by AmiMoJo ( 196126 )

      They should have built the wipe-after-10-failed-attempts functionality into the hardware. Most secure storage subsystems have some kind of dedicated processor and ROM (i.e. can't be updated, it's burned into the silicon) that enforces access control to the secure memory. Often the functionality to wipe when a certain number of incorrect attempts is made is built in to that sub system, and thus can't be overridden or bypassed by a software update, or even turned off.
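
      A toy model of what such a hardware-enforced policy looks like (purely illustrative; the names and structure are hypothetical, not Apple's actual design):

      ```python
      # Toy model of a wipe-after-N-failures policy enforced by a
      # dedicated secure processor. Hypothetical design, for
      # illustration only; not Apple's actual implementation.

      MAX_ATTEMPTS = 10

      class SecureStore:
          def __init__(self, passcode: str, key_material: str):
              self._passcode = passcode
              self._key = key_material
              self._failures = 0

          def unlock(self, attempt: str):
              """Return the key on a correct passcode; wipe after too many failures."""
              if self._key is None:
                  raise RuntimeError("key material has been wiped")
              if attempt == self._passcode:
                  self._failures = 0
                  return self._key
              self._failures += 1
              if self._failures >= MAX_ATTEMPTS:
                  self._key = None  # irreversible: the key is gone for good
              return None
      ```

      If the counter and the wipe are burned into ROM on the secure processor, a software update cannot replace the check with a no-op, which is exactly the property the phone in this case lacks.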

      • by jabuzz ( 182671 )

        One imagines that the iPhone 7, or more likely the iPhone 8 (as one suspects the iPhone 7 design is too far advanced to add this), will come with such a feature.

        Quite what the FBI are thinking I don't know.

      • It is built in, now. The iPhone 5C is the last model without the "Secure Enclave" which stores the key, arbitrates all access to it, and wipes it after 10 attempts unless told not to.

  • by xxxJonBoyxxx ( 565205 ) on Wednesday February 17, 2016 @10:03AM (#51526909)

    >> "Can courts compel Facebook to provide analytics of who might be a criminal?...Or Google to give a list of names of people who searched for the term ISIS?

    Facebook already publishes a guide for law enforcement: https://www.facebook.com/safet... [facebook.com]
    Google does too: https://www.google.com/transpa... [google.com]

  • by lazarus ( 2879 ) on Wednesday February 17, 2016 @10:06AM (#51526923) Journal

    'The All Writs Act [wikipedia.org] is a United States federal statute, codified at 28 U.S.C. § 1651, which authorizes the United States federal courts to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law."'

    "On October 31, 2014, the act was used by the U.S. Attorney's Office in New York to compel an unnamed smartphone manufacturer to bypass the lock screen of a smartphone allegedly involved in a credit card fraud."

    Looks like there is a precedent. Mind you, Apple has lots of money for lawyers to make sure this doesn't happen.

  • Thanks Apple (Score:5, Insightful)

    by Anonymous Coward on Wednesday February 17, 2016 @10:07AM (#51526929)

    I'm not an iPhone user but I appreciate you standing up for people's privacy. I have a better chance of winning the lottery than dying at the hands of a terrorist. Why would I want to lose my privacy over those odds?

    • by firewood ( 41230 )

      I have a better chance of winning the lottery than dying at the hands of a terrorist. Why would I want to lose my privacy over those odds?

      One of those odds is an individual risk. The other is a systemic risk that can also end up in the death of many of your family, friends, coworkers and bystanders. So relating the two odds sounds statistically intelligent, but is really a form of systemic risk blindness.

      • In that case, the risk of you or your family dying in a motor vehicle accident--the same kind of systemic risk that you're talking about--is orders of magnitude higher than the risk of dying in a terrorist attack. Similarly with dying from domestic gun violence. That's with laws in place to mitigate those risks. The reality is that terrorism is a really marginal cause of death, and to spend too much time and money worrying about it (certainly any time and money beyond the enormous amounts we're already spen

  • by Anonymous Coward on Wednesday February 17, 2016 @10:20AM (#51527003)

    If I read Apple's "customer letter" correctly, they may very well have the ability to create the software that is demanded of them, and decrypt that phone. Whether that software already exists or not is immaterial. If it is possible to create the software and use it on existing devices, then for all intents and purposes the backdoor is already there. Apple just doesn't want to open it, because they rightly fear losing the trust of their customers - trust which, following this interpretation, is unfounded.

  • I have a basic question. The phone in question is encrypted. How is Apple supposed to decrypt it without the key, which they state they do not have? Apple goes on to say that, for the future, they would have to create a back door program to do that, which is an entirely different discussion. So, unless the backdoor already exists (and Apple wants to keep it secret), how can this "request" be fulfilled?

    • Answer: the court order is for Apple to write such a backdoor program.
      • Answer: the court order is for Apple to write such a backdoor program.

        Yes, I know, but such a program won't work retroactively to decrypt the already-encrypted data because Apple doesn't have the key. It would only work going forward. As such, the government's appeal to the courts is bogus. The back door still won't get them the unencrypted data.

        Put differently, the government already has the data from the phone in question. To access the data in unencrypted form, there is nothing that Apple can do that they, themselves cannot do. If Apple acquiesced, then for future case

  • A lot of news and commentary on this one.
    https://www.washingtonpost.com... [washingtonpost.com]
    http://www.usatoday.com/story/... [usatoday.com]
    http://www.nbcnews.com/storyli... [nbcnews.com]

  • There is something that does not add up in Apple's discourse at http://www.apple.com/customer-... [apple.com]

    Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

    The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor.

    I read what the FBI asks as: install a piece of code that allows the phone's content to be examined. I see no middle ground between

    1) running such piece of code (probably: after getting it signed by Apple) is possible without the owner's passcode; the iPhone is in fact already backdoored, with Apple holding the key, the FBI wants Apple to exploit the vulnerability/open the backdoor, and Apple does not want to b

  • 1. In this case, we have a lawful order from a court. Legally and morally, Apple should comply.
    2. Seems to me the SDK would include a way to dump the contents of a phone and a simulator. So it's just a simple script to spin up a virtual image, enter a passcode and repeat until the phone unlocks. Are you telling me the Apple developer's kit doesn't have that functionality? I bet the NSA developer's kit does.
  • by gestalt_n_pepper ( 991155 ) on Wednesday February 17, 2016 @10:58AM (#51527273)

    There Is No Such Thing as Magic. If there is a known backdoor, it will be found and exploited. This can't be prevented, and honestly (take note, politicians)...

    That means that the content on anyone's phone can be stolen. Not just anyone's phone, but the phone of every politician in the world.

    Be careful what you wish for.

    • That means that the content on anyone's phone can be stolen. Not just anyone's phone, but the phone of every politician in the world.

      If politicians want to put a back door on our phones, those politicians need to use those same phones. [csmonitor.com]

      Which brings up another point. The US Federal Government can hack the German Chancellor's iPhone, [bbc.com] but not the iPhone of some nut in California?

      • by Kjella ( 173770 )

        Which brings up another point. The US Federal Government can hack the German Chancellor's Iphone, but not the Iphone of some nut in California?

        Presumably the Chancellor's phone was hacked while it was running. Just like using full disk encryption doesn't protect you from getting a trojan while surfing, but if your laptop is turned off and gets stolen it will. It's only a strong lock on the door, it doesn't prevent nasty things from slipping in when it's open.

  • by DutchUncle ( 826473 ) on Wednesday February 17, 2016 @11:32AM (#51527491)
    This particular phone's owner deserves no mercy. But that's not the point, or at least not the whole point. If Apple can do this to one phone, they can do it to any phone; and if the government can make Apple do it to the phone of a dead murderer who doesn't deserve legal protection, then the government can make Apple do it to the phone of a live whistleblower who DOES deserve legal protection. My title comes from an era of free speech rights debates inspired by porn cases; the fact that a particular image is disgusting, like the fact that a particular case involves a murderer, does not justify changing our checks and balances for "just this case", because the precedent will be used to justify many more cases.
    • They can't do it to my iPhone, or any iPhone Apple is currently selling. Mine is the version after the one in the case, and is not vulnerable to that attack. It's probably vulnerable to something else, being a complex piece of technology, but that's how life works.

  • The Apple post seems to make it clear that this attack is doable. This implies that the San Bernardino phone is set to gobble up an update signed by Apple or something. This sounds like Apple is fully capable of creating a signed malicious operating system and installing it on the phone, and is now refusing to do so. Nothing in the article implies that this is *impossible*, merely *really unwise*.

    Of course it is unwise. But the fact that it is POSSIBLE belies a second security flaw- that installing a new OS is possible

  • by BrendaEM ( 871664 ) on Wednesday February 17, 2016 @12:25PM (#51527837) Homepage

    Well, one Apple-Hater hates a little less.

  • by cfalcon ( 779563 ) on Wednesday February 17, 2016 @12:28PM (#51527855)

    The order implies that Apple is capable of delivering a remote update, or that forcing an update locally is possible if you have physical access. It also implies that portions of the security models are enforced by software that is vulnerable to "update", such as the wipe-after-ten-tries (presumably that code will be replaced with a no-op) and the code entry delay in excess of that which is enforced by hardware.

    Whether Apple is compelled to do this or not, the natural concern is "well how much of my data is shielded by math, how much by hardware, and how much by software"?

    You can't bargain with math, you have a devil of a time working out hardware, and software alone is meaningless as a defense.

    It appears that your best bet for security is either:

    1)- A multi-character password that is easy to enter (and you'll remember it if it's your phone password, lol), but reasonably short. This is if you trust that the 80ms hardware delay can't be broken. This precludes the use of 4 and 6 digit PINs, as a 4 digit PIN will usually fall after a few minutes of this treatment, and a 6 digit PIN after around half a day. An 8 digit password consisting of a completely random set of just the visible lowercase letters (aka, no actual English words) at this rate is hundreds of years, and adding stuff that's harder to enter quickly (capitals, numbers, special characters) makes it much more secure, as does lengthening the password slightly. The challenge here is that passwords are usually chosen to be words, greatly reducing the entropy. And again, this assumes that the 80ms hardware delay is not defeatable.

    2)- A fully secure crypto passphrase. This is the level of drama you would go through to password protect a drive or something you take very seriously, and as such it would be a lot more than 8 characters. Your passphrase is long, contains several unpredictable parts, and makes use of more than just a statistically predictable subset of words and characters. You can set this on the iPhone, of course, but this kind of protection is not trivial to type in. In this case, you are trusting the math only, however, and assuming that the software will be compelled by the government, and the hardware will be owned by a team skilled in this matter.

    Going forward, Apple should probably move the "erase after 10 tries" into the secure portion of the phone, such that it has a protected portion that can't be overwritten without access to the PIN. This will also make them immune to this sort of order in the future.
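
    The arithmetic in option 1 is easy to check. A quick sketch, taking the parent post's fixed 80 ms per attempt at face value (that figure is the post's assumption, not a verified hardware spec):

    ```python
    # Worst-case brute-force times assuming a fixed 80 ms
    # hardware-enforced delay per attempt (the 80 ms figure is
    # the parent post's assumption, not a verified spec).

    DELAY_S = 0.08

    def worst_case_seconds(alphabet_size: int, length: int) -> float:
        """Time to exhaust every possible passcode at DELAY_S per try."""
        return (alphabet_size ** length) * DELAY_S

    # 4-digit PIN: about 13 minutes worst case
    pin4 = worst_case_seconds(10, 4) / 60
    # 6-digit PIN: about 22 hours worst case (~11 h on average)
    pin6 = worst_case_seconds(10, 6) / 3600
    # 8 random lowercase letters: hundreds of years
    alpha8 = worst_case_seconds(26, 8) / (365.25 * 24 * 3600)

    print(f"{pin4:.1f} min, {pin6:.1f} h, {alpha8:.0f} years")
    ```

    This matches the parent's figures: minutes for a 4-digit PIN, roughly half a day for a 6-digit PIN, and centuries for 8 random lowercase letters.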

    • by eth1 ( 94901 )

      or

      3) A two-level system. A simple PIN that unlocks the phone enough to make calls, send texts, take pictures, etc., and provide access to the last X hours of call/text/pic/etc. history. Going back any further in time would require a more complex passphrase. Apps could be grouped into ones usable with PIN, and ones only usable with the passphrase. A panic button on the lock screen would flush all data to the more protected area, and require the more complex passphrase to unlock anything.
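
      A sketch of that idea (entirely hypothetical: no such iOS feature exists, and all the names here are made up for illustration):

      ```python
      import time

      # Sketch of the proposed two-tier lock: a short PIN unlocks
      # recent data, a long passphrase unlocks the full archive.
      # Hypothetical design; not an existing iOS feature.

      RECENT_WINDOW_S = 24 * 3600  # the "last X hours" tier, here X = 24

      class TwoTierStore:
          def __init__(self, pin: str, passphrase: str):
              self._pin = pin
              self._passphrase = passphrase
              self._records = []  # list of (timestamp, data)

          def add(self, data, now=None):
              self._records.append((now if now is not None else time.time(), data))

          def unlock_with_pin(self, pin: str, now=None):
              """PIN tier: only records from the recent window are visible."""
              if pin != self._pin:
                  return None
              now = now if now is not None else time.time()
              return [d for t, d in self._records if now - t < RECENT_WINDOW_S]

          def unlock_with_passphrase(self, phrase: str):
              """Passphrase tier: the full history."""
              if phrase != self._passphrase:
                  return None
              return [d for _, d in self._records]

          def panic(self):
              """Panic button: age everything out of the PIN-visible window."""
              self._records = [(t - RECENT_WINDOW_S, d) for t, d in self._records]
      ```

      After `panic()`, even the correct PIN returns nothing recent; only the passphrase tier can see the data, which is the behavior the parent describes.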

  • by KlomDark ( 6370 ) on Wednesday February 17, 2016 @02:34PM (#51528935) Homepage Journal

    Apple is in the right and Google and Microsoft should be backing Apple on this one.

    And I say this as a long time hater of all things Apple. I own nothing from Apple. Never have. I fucking detest Apple. Look at any post of mine here on /. for almost 20 years and you will see that this is the very first post I have ever made here saying ANYTHING positive about Apple.

    But they are so in the right on this one. Our devices are our own personal data archives on a level way beyond our houses or safes. It is an extension of our brains. While we may not have the technology just yet, imagine if a court ordered you to have your personal memories residing in your physical brain extracted. This is the same thing. People put info on their phones with a full expectation of privacy, regardless of if that expectation is realistic. If you think it's acceptable for the government to demand access to your most personal inner sanctum of being (your brain), then a brain extension like a personal phone or computer should also be inviolate.

    This is so completely, disgustingly wrong of anyone in government to expect such a thing, for any reason. Even for this reason. Extra ludicrosity because this is an after-the-fact demand; no matter what happens, this will not bring one person back from the dead.

    If you are about freedom in any way, then you should be completely against this horrid precedent ever being set. Your thoughts are your own, always.

  • by HeckRuler ( 1369601 ) on Wednesday February 17, 2016 @02:42PM (#51529019)

    I presume that some congressman pushed the FBI to make this request out in the open just for the purpose of fighting it in court. All in all it's a good thing. Defending civil rights and all that.

    But if the FBI ACTUALLY wanted this information they would have simply given Apple a gag order along with it. Or asked the NSA to do that for them. It's even their purpose, fighting terrorism, right? This falls SQUARELY under the domain of shit they've strong-armed and gagged companies into helping them with. The fact that we're even hearing about it has to be some sort of process manipulation.
