Encryption Communications Privacy Your Rights Online Apple

Apple Will No Longer Unlock Most iPhones, iPads For Police 504

A reader writes with this selection from a story at the Washington Post: Apple said Wednesday night that it is making it impossible for the company to turn over data from most iPhones or iPads to police — even when they have a search warrant — taking a hard new line as tech companies attempt to blunt allegations that they have too readily participated in government efforts to collect user data. The move, announced with the publication of a new privacy policy tied to the release of Apple's latest mobile operating system, iOS 8, amounts to an engineering solution to a legal dilemma: Rather than comply with binding court orders, Apple has reworked its latest encryption in a way that makes it almost impossible for the company – or anyone else but the device's owner – to gain access to the vast troves of user data typically stored on smartphones or tablet computers. The key is the encryption that Apple mobile devices automatically put in place when a user selects a passcode, making it difficult for anyone who lacks that passcode to access the information within, including photos, e-mails, recordings or other documents. Apple once kept possession of encryption keys that unlocked devices for legally binding police requests, but will no longer do so for iOS8, it said in a new guide for law enforcement. "Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data," Apple said on its Web site. "So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Elad Alon ( 835764 ) on Thursday September 18, 2014 @12:02PM (#47937629)

    So everything is protected by a 4 digit passcode?
    Wow... Impregnable.

    • by Spy Handler ( 822350 ) on Thursday September 18, 2014 @12:04PM (#47937657) Homepage Journal

      My luggage only has a 3-digit passcode; the iPhone is 10 times more strongly encrypted!

    • by binarylarry ( 1338699 ) on Thursday September 18, 2014 @12:05PM (#47937671)

      It could be a 4096-bit private key with uberultra fugu-based quantum encryption:

      http://xkcd.com/538/ [xkcd.com]

    • Re: (Score:2, Informative)

      by Anonymous Coward

      No, you can, and should, use a much longer passcode (with more varied characters) than that on iOS. The device actively suggests that you should if you set up Touch ID.

    • Yes, but you can easily set your device to wipe after 10 incorrect passcode entries. So, what this really means (assuming that Apple's statements are true) is that, in the event the police want access to your iDevice, their only option (unless they're willing to play 1000:1 odds) is to get the passcode from you.

      • by alen ( 225700 ) on Thursday September 18, 2014 @12:15PM (#47937789)

        I'm sure the cops can image your encrypted phone and try to break the encryption offline without risking loss of data. If they can't break it now, they will simply store the data for the next 10 years until they can, and go back to it then. Sort of like fingerprints, DNA or any other crime scene evidence.

        • by Anonymous Coward on Thursday September 18, 2014 @12:22PM (#47937883)

          No, because the encryption key is derived from the passcode and a device key baked into the crypto chip's silicon. You have to brute-force those things 'online', on the device itself, as anyone who has done iOS forensics will tell you. Now, if you want to break that full key offline, out of the blue... see you in a million years.
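
          A rough sketch of that idea in Python (purely illustrative; the names and numbers here are made up, not Apple's implementation): the data key is derived from both the passcode and a device-unique secret that software cannot read out, so every guess has to run on the phone itself.

          import hashlib, os

          # Hypothetical device-unique secret fused into the crypto hardware at manufacture.
          # On a real device it is not readable by software, which is the whole point.
          DEVICE_UID = os.urandom(32)

          def derive_data_key(passcode: str, iterations: int = 100_000) -> bytes:
              # Entangle the passcode with the device secret via a slow key-derivation function.
              return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, iterations)

          # An attacker holding only a copy of the flash lacks DEVICE_UID, so an offline search
          # over the 10,000 four-digit codes gets nowhere; guesses must run on the device,
          # where rate limits and the 10-try wipe apply.
          print(derive_data_key("1234").hex())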

          • by alen ( 225700 ) on Thursday September 18, 2014 @01:04PM (#47938387)

            Some of us are old enough to remember when 128-bit keys were considered unbreakable.

        • Comment removed (Score:5, Insightful)

          by account_deleted ( 4530225 ) on Thursday September 18, 2014 @12:32PM (#47937991)
          Comment removed based on user account deletion
          • Or, they simply use a $5 wrench [xkcd.com].

            If they simply want the information, the $5 wrench works. If they want it to be admissible in court, then it doesn't work so well.

          • by chispito ( 1870390 ) on Thursday September 18, 2014 @02:29PM (#47939231)

            Or, they simply use a $5 wrench [xkcd.com].

            Don't be ridiculous, we're talking about the US government and not some thugs.

            It would be a $5,000 wrench.

        • I'm sure the cops can image your encrypted phone and try to break the encryption offline without risking loss of data. If they can't break it now, they will simply store the data for the next 10 years until they can, and go back to it then. Sort of like fingerprints, DNA or any other crime scene evidence.

          For that they would not need to crack a password, but to break 256-bit encryption, with a different encryption key for every single file in the file system. I think brute-forcing 256-bit encryption, unless it is severely flawed, is at the "physically impossible" level.
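
          Some back-of-the-envelope arithmetic on why brute-forcing a 256-bit key is treated as physically impossible (the guess rate below is already absurdly generous):

          # Time to exhaust a 256-bit keyspace at a wildly optimistic guess rate.
          keys = 2 ** 256
          guesses_per_second = 10 ** 18          # far beyond any hardware that exists
          seconds_per_year = 60 * 60 * 24 * 365

          years = keys / (guesses_per_second * seconds_per_year)
          print(f"{years:.1e} years")            # on the order of 10^51 years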

      • Standard data forensics procedure is to write-protect any storage device which contains evidence, copy it bit-for-bit, and do all the decrypting and data analysis from the copy. The 10-try limit may protect your data from a random thief who lifts your phone, but the only way it's going to protect you from the government or any other technically-capable hacker is if Apple baked the limit into the flash memory-reading hardware.

        And there's always this [xkcd.com].
        • One would think so, but they may also just want to activate the wipe to intentionally delete the data on the phone that could exonerate you!

          http://justiceforbradcooper.wo... [wordpress.com]

        • by Savage-Rabbit ( 308260 ) on Thursday September 18, 2014 @12:50PM (#47938201)

          Standard data forensics procedure is to write-protect any storage device which contains evidence, copy it bit-for-bit, and do all the decrypting and data analysis from the copy. The 10-try limit may protect your data from a random thief who lifts your phone, but the only way it's going to protect you from the government or any other technically-capable hacker is if Apple baked the limit into the flash memory-reading hardware.

          And there's always this [xkcd.com].

          You can put a complex password on your iPhone:

          1) Settings->Passcode, enter your 4 digit passcode.
          2) Flip the "Simple Passcode" switch.
          3) Set your new arbitrary length complex password.
          4) Enable the "Erase Data" setting which wipes the device after 10 incorrect password inputs.
          5) Enjoy entering your complex password every time you want to access the phone.

          The encryption on these iDevices and the Macs is non-trivial to crack. Combine this encryption with a properly strong password and that wipe feature, and even the police would be shit out of luck. I know of a case where a guy resolutely refused to provide police with the password and crypto-key for his MacBook. The cops shipped the laptop to Cupertino, who sent it back after a few weeks having failed to crack the drive encryption. The cracking would take longer than the expected lifespan of the universe. Your only hope of getting into a properly password-protected and encrypted device, be it an iDevice, an Android device or a Windows phone, is if there happens to be some software vulnerability that enables you to bypass the login screen.
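
          To put rough numbers on the difference a longer passcode makes, here is a small calculation assuming each on-device guess costs about 80 ms of key derivation (the commonly cited figure; the exact value doesn't change the conclusion):

          # Worst-case time to try every possible passcode at ~80 ms per on-device attempt.
          ATTEMPT_SECONDS = 0.08

          def exhaustive_search_years(alphabet_size: int, length: int) -> float:
              attempts = alphabet_size ** length
              return attempts * ATTEMPT_SECONDS / (60 * 60 * 24 * 365)

          print(exhaustive_search_years(10, 4))    # 4-digit PIN: about 13 minutes in total
          print(exhaustive_search_years(62, 8))    # 8-char alphanumeric: hundreds of thousands of years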

          • The cracking would take longer than the expected lifespan of the universe.

            The obvious solution is inter-parallel-universe travel. We find the parallel universe in which the only difference is that the suspect didn't lock his/her phone, and get the data there. Problem solved.

          • You don't have to enter the passcode every time if you've got a TouchID device. When my new phone shows up, I have a 13-digit code memorized from when I was a kid (long story). I'll input that once a day, and use the scanner to unlock the device the rest of the time.

            Really you only need a 6-digit passcode to be exceptionally safe, but it's honestly easier for me to remember this particular code than something shorter.

          • by sl149q ( 1537343 )

            5) Enjoy entering your complex password every time you want to access the phone.

            Yes I'm sure that anybody who doesn't want their data to be read by the authorities won't be able to afford to buy an iPhone with TouchID.

        • Too bad for "standard forensics" that the passcode is mixed in with a hardware-specific key baked into the SOC [apple.com]. So you'll first need to be able to run arbitrary code on the individual's phone itself in order to keep guessing beyond the limit. That's going to require a significantly more intrusive examination.

    • Comment removed based on user account deletion
    • by Tuidjy ( 321055 )

      I have one question:

      If you damage your iDevice and forget your password, can they recover your data?

      If the answer is yes, they are lying if they say they cannot assist law enforcement. And between lying to their consumers and lying to the government... I am pretty sure I know which way they will lean.

      • If you damage your iDevice and forget your password, can they recover your data?

        If you forget your password, and you've lost the backup key that Apple tells you to put in a safe place when encryption is turned on, and you forget the answer to your security question, then yes, your data is gone. Forever.

        • by LordLimecat ( 1103839 ) on Thursday September 18, 2014 @12:52PM (#47938225)

          and you forget the answer to your security question,

          The presence of a security question on any service indicates immediately that they almost certainly have access if served with a warrant.

          • by Altrag ( 195300 )

            That works for basic access passwords, since the only check is "is it right, yes/no?" at one particular entry point (the login screen). You can reset that password and they only have to "update" the one location (their password hash file).

            Encryption is a whole different beast, as you're effectively password-protecting every single byte on your device. Simply changing the access password won't change those bytes.

            So unless they're storing your password in plaintext (or reversibly encrypted), or they've built a

    • by mark-t ( 151149 )
      If the passcode must be manually entered, then even a 4-digit passcode is not particularly feasible to try to crack. Sure, it can be done by a determined enough person, but you're talking about sitting around doing nothing but pressing virtual keys on a screen for what on average would probably be at least half a dozen hours before they might luck out and get it right. Most people have something considerably better to do with their time... If that's coupled with a password count restriction, say, limit
    • by Lumpy ( 12016 )

      only if you are retarded and only use a 4 digit code.

    • by alta ( 1263 )

      Then after the first fail it times out for 5 minutes, then 30, then 60, then 24 hours, etc... This is an example; I'm not sure of the actual times. But I've seen it happen.

      If you get to a dozen tries you're about a month into it.
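
      Using made-up delays shaped like the ones above (not the exact iOS schedule), a quick sum shows how fast the forced waiting adds up:

      # Cumulative lockout for a dozen wrong guesses, with illustrative delays (in minutes):
      # three free tries, then 5, 30 and 60 minutes, then doubling from 24 hours.
      delays_minutes = [0, 0, 0, 5, 30, 60] + [24 * 60 * 2 ** n for n in range(6)]

      total_days = sum(delays_minutes) / 60 / 24
      print(f"A dozen wrong guesses cost roughly {total_days:.0f} days of forced waiting")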

    • So everything is protected by a 4 digit passcode?

      Or your fingerprint, and where would the police get your fingerprint?

      • by Aaden42 ( 198257 ) on Thursday September 18, 2014 @03:45PM (#47940003) Homepage

        If you believe you may soon be under arrest, power off or hard reset (hold power & home) your device.

        Only the OS itself is accessible immediately after reboot. All user-level flash is secured with a different key than the OS, and that key is secured by your passphrase, not your TouchID. That’s why you need to enter your password every time you reboot & can’t TouchID unlock until you do. If you reset your phone, the cops can hold it against your thumb all day long, and it won’t do them any good.

        For an in depth discussion of how the crypto in iOS is implemented, see:
        http://www.apple.com/ipad/busi... [apple.com]

  • Sanity... (Score:5, Insightful)

    by Panaflex ( 13191 ) <`convivialdingo' `at' `yahoo.com'> on Thursday September 18, 2014 @12:03PM (#47937649)

    This is how things are supposed to be. The legal system was designed for individuals "to be secure in their persons, houses, papers, and effects."

    • No it isn't. Law enforcement is supposed to be able to obtain relevant information with a search warrant.

    • by LWATCDR ( 28044 )

      That is why a court order is needed. So now all the big companies will use the same method on all of their communications.
      Of course the court could just order you to turn over your password.

      • Re:Sanity... (Score:5, Insightful)

        by FlyHelicopters ( 1540845 ) on Thursday September 18, 2014 @01:34PM (#47938737)

        Of course the court could just order you to turn over your password.

        Sure, but you also have the right to refuse... They can charge you with contempt perhaps, but you at least have that choice.

        Before, you didn't even have that much of a choice...

    • This is how things are supposed to be. The legal system was designed for individuals "to be secure in their persons, houses, papers, and effects."

      Like many countries, we inherited a strange and somewhat muddled legal system from England. That bit got added along the way.

    • Yea... I normally am rather unhappy with much of what Apple does...

      But I'll give credit where it's due: this is clearly a customer-friendly policy, and I'll applaud them for it...

      It improves Apple's image in my view, and anyone who has privacy concerns should give Apple another look.

  • Then they're served with another warrant ... one that obliges them to put a back door into either the individual device, or their whole infrastructure. Without informing users that such a warrant has been served.

    Then what?

    It's like a game of chess where the values of the pieces can be unilaterally changed by one side.

    • You have several choices here. 1) Sue, claiming that the government can neither force you to design your software differently nor require you to sell specific software. It is a very different thing from the existing system, where people are required to provide access to existing backdoors, rather than being required to create backdoors.

      2) Move your corporate headquarters offshore. Then tell them they have no authority over you, but if they want to sue your country or publicly demand you stop selli

    • As much as the government wants these powers, and wants them in secret, this is really a policy decision which needs to be made by the citizens. They need to decide if off-the-shelf products should provide end-level encryption by default, or if the government should always have a back-door into all encrypted products. Not politicians, not the police, and certainly not secret government courts. Society at large needs to decide which is more important - personal privacy, or the government's ability to obta
    • Then you hire a lawyer because no court in the U.S. has the authority to order a specific change to a product. The most they can do is declare a product to be unlawful as shipped, and that is done very publicly.

      • > no court in the U.S. has the authority to order a specific change to a product.

        Not that they'd actually order that a backdoor be developed, but most courts can order specific performance. In many states, small claims courts are limited to monetary damages, but any other court of general jurisdiction can issue a specific performance order. You see this used in custody cases where the father is ordered to provide health insurance, for example. It's also common to have specific performance ordering a

    • by xdor ( 1218206 ) on Thursday September 18, 2014 @12:37PM (#47938049)

      They don't need a back-door.

      Sure, they'll encrypt your files with a key they don't know, just like they said. But to comply with law enforcement, all they would have to do is intercept your password when you enter it. And that's easily done: a keyboard driver update patch for targeted users that collects and forwards the password to the feds.

      That way the files are still encrypted as advertised. And it's possible that if you lose your phone or it's confiscated, this would still be a plus. But I think this password intercept is how the feds would get access if they're monitoring you specifically.

    • I presume you wouldn't say it was "wrong" of the United States to crack the German and Japanese codes in WWII...

      ...so when US adversaries (and let's just caveat this by saying people YOU, personally, agree are legitimate US adversaries) don't use their own "codes", but instead share the same systems, networks, services, devices, cloud providers, operating systems, encryption schemes, and so on, that Americans and much of the rest of the world uses, would you suggest that they should be off limits?

      This isn't

      • by gnasher719 ( 869701 ) on Thursday September 18, 2014 @12:45PM (#47938131)

        I presume you wouldn't say it was "wrong" of the United States to crack the German and Japanese codes in WWII...

        Aren't you rewriting history a little bit there? The USA didn't crack German codes. That was a bunch of Polish mathematicians, followed by British mathematicians and engineers. And when Americans make movies, three British sailors of whom two died getting secret materials out of a sinking German U-Boot suddenly become Americans!

  • I'm operating under the assumption that this is not some marketing gimmick and Apple is really concerned about the privacy of its user base. If this is the case, I'm happy to keep using Apple. I was going to hold off on updating to iOS 8 but, methinks I will upgrade after all. The old adage, "You have nothing to worry about if you haven't done anything wrong" is just plain wrong. With the misconduct of police and prosecutors, I want to be secure in my effects.
  • Comment removed based on user account deletion
  • "So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8."

    Apple can relax for now, but it's only a matter of time. As any software developer knows, software will [always] have bugs. Apple's software is no different.

  • by Vapula ( 14703 ) on Thursday September 18, 2014 @12:20PM (#47937865)

    Nothing prevents you from using 3rd-party encryption on your Android phone (and I'm not speaking about a 3rd-party system)... and I seriously doubt that Google will be able to do anything about data encrypted by a 3rd-party system.

    On Android, you work with a system of service providers and consumers. Your contact list? You have one application acting as the contact provider and others acting as contact consumers (readers/writers)... If you want to protect your contacts, nothing prevents you from using a different default contact provider which uses an encrypted container. The same goes for most of the phone's features...

    On iPhone, you can only take Apple's word for it... like we did when it came to geolocation data...

    • by mlts ( 1038732 ) on Thursday September 18, 2014 @01:02PM (#47938363)

      On Android, you can use dm-crypt to encrypt your /data partition with a passphrase of real length, which is separate from your screen-unlock PIN/password.

      You do need to root it, and type in a command similar to this:

      vdc cryptfs changepw newpass

      or to enable encryption via the command line:

      vdc cryptfs enablecrypto inplace

      With /data encrypted, the phone will prompt for the long passphrase at boot; from there on, you just need the short screen-locker password.

      I like this part of Android -- you can easily pack your own parachute when it comes to encryption.
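
      If you would rather drive that from a PC, a tiny wrapper like this works (a sketch only: it assumes a rooted device, adb on your PATH, and the vdc syntax quoted above, which varies between Android versions):

      import subprocess

      def run_as_root(cmd: str) -> None:
          # Run a shell command as root on a USB-connected, rooted device via adb.
          subprocess.run(["adb", "shell", f"su -c '{cmd}'"], check=True)

      # Change the dm-crypt passphrase using the command quoted above.
      run_as_root("vdc cryptfs changepw newpass")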

  • I suspect Apple is making this news release because of a secret court order asking them to. It is the same as law enforcement saying iMessage is uncrackable. All they (the government) want is for you to voluntarily turn over your information, thinking that it is secure.
  • We'll see (Score:5, Interesting)

    by Barlo_Mung_42 ( 411228 ) on Thursday September 18, 2014 @12:55PM (#47938279) Homepage

    BlackBerry used to be secure, until they wanted to sell phones in India and the Indian government demanded a backdoor as a condition of selling phones there.
    Will India now also refuse the sale of iOS 8 devices?

  • "Most"? (Score:5, Interesting)

    by Squidlips ( 1206004 ) on Thursday September 18, 2014 @01:51PM (#47938893)
    What does "Most" mean? Sounds like another loophole..

"The pyramid is opening!" "Which one?" "The one with the ever-widening hole in it!" -- The Firesign Theatre

Working...