Godfather Of Encryption Explains Why Apple Should Help The FBI (bgr.com)

An anonymous reader writes: Famed cryptographer and Turing Award winner Adi Shamir has an interesting, if not surprising, take on Apple's current legal tussle with the FBI. Speaking on a panel at RSA Conference 2016 earlier this week, the man who helped co-invent the vaunted RSA algorithm (he's the 'S' in RSA) explained why he sides with the FBI as it pertains to the San Bernardino shooter's locked iPhone. "It has nothing to do with placing trapdoors on millions of phones around the world," Shamir explained. "This is a case where it's clear those people are guilty. They are dead; their constitutional rights are not involved. This is a major crime where 14 people were killed. The phone is intact. All of this aligns in favor of the FBI." Shamir continued, "Even though Apple has helped in countless cases, they decided not to comply this time. My advice is that they comply this time and wait for a better test case to fight where the case is not so clearly in favor of the FBI."

  • What a crock (Score:5, Insightful)

    by zieroh ( 307208 ) on Thursday March 03, 2016 @09:48PM (#51633719)

    What a crock full of shit.

    • Re:What a crock (Score:5, Insightful)

      by Anonymous Coward on Thursday March 03, 2016 @09:57PM (#51633775)

      And from someone who really ought to know better.

      What this tells me is that being the "Godfather of Encryption" is not mutually exclusive with being a dunce on operational security.

      Waiting for a future, better case would sure end up looking foolish when the government argues, "What's the problem? You agreed to do this exact same thing before, in the San Bernardino case..."

      • by Anonymous Coward on Thursday March 03, 2016 @11:26PM (#51634265)

        That's only one opinion. Ron Rivest, the R in RSA, disagrees with Shamir over this one. As the S in the abbreviation, Shamir is the man in the middle.

        • Re: (Score:2, Funny)

          by Anonymous Coward

          That's only one opinion. Ron Rivest, the R in RSA, disagrees with Shamir over this one. As the S in the abbreviation, Shamir is the man in the middle.

          So you're saying that this is a man-in-the-middle attack?

      • Re:What a crock (Score:5, Insightful)

        by Shoten ( 260439 ) on Thursday March 03, 2016 @11:34PM (#51634319)

        And from someone who really ought to know better.

        What this tells me is that being the "Godfather of Encryption" is not mutually exclusive with being a dunce on operational security.

        Waiting for a future, better case would sure end up looking foolish when the government argues, "What's the problem? You agreed to do this exact same thing before, in the San Bernardino case..."

        Should he know better? I'm not sure. On one hand, Shamir is really good at math. But math has almost nothing to do with Constitutional law, which is what this is really about.

        There's a big difference between who can create/build a certain technology, and who should be trusted with knowing how and when to use it. A lot of people conflate the two, but they are incredibly different skillsets.

        • Re: What a crock (Score:3, Insightful)

          by Anonymous Coward

          More importantly, knowledge of math has nothing to do with right and wrong.

But the math does say you can build a secure phone where only the owner has the key. Ask anyone who's ever lost a bitcoin wallet. Apple chose to retain that key, making it a political issue, not a technical one.

            • Re: (Score:3, Insightful)

              by tburkhol ( 121842 )

              But the math does say you can build a secure phone where only the owner has the key.

              No, the math says the phone is secure if only the owner has the key.

Apple chose to retain that key, making it a political issue, not a technical one.

              Again, not quite. Apple wrote the operating system that allows the owner sole access to the key, and they can rewrite the OS to violate that exclusivity. Whether they can be forced to retroactively modify their OS to expose their customers' private data is the political issue.
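To make that distinction concrete, here is a minimal Python sketch of owner-only encryption, where no second copy of the key exists (it uses the third-party "cryptography" package; the data and names are illustrative, not Apple's actual scheme):

    # A rough sketch of "secure if only the owner has the key".
    # Requires: pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # held only by the owner; no escrow copy
    vault = Fernet(key)

    ciphertext = vault.encrypt(b"contacts, photos, messages")

    # Whoever holds the key can decrypt:
    assert vault.decrypt(ciphertext) == b"contacts, photos, messages"

    # Lose or destroy the key and the math offers no second way in --
    # exactly the lost-bitcoin-wallet situation described above.

Apple's design differs precisely in that the OS, which Apple controls, mediates access to the key -- which is why rewriting the OS is even on the table.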

        • Should he know better? I'm not sure. On one hand, Shamir is really good at math. But math has almost nothing to do with Constitutional law...

This involves primarily two things: philosophical identity, and equals/not equals. Both of those are taught to mathematicians. Identity is simply the concept that a thing exists. We're talking about an abstract thing, specifically the set of actions that together are the "help" the FBI wants from Apple. Then we have equals/not equals. How many times has the "help" the FBI is asking for been done before? None.

It is exactly counting that he no longer comprehends. I recommend that he get a brain scan to check for tumors.

It is about more than just laws, principles and precedent. It really is about operational security, as well as the kind of oversight (or complete lack thereof) we have on the application of cyber-surveillance by government institutions. As Apple has said:

          "The notion that this is something only about opening one lock or that there is some degree of locks that can't be opened with the tool that they're asking us to create, is a misnomer," Sewell added.

Apple evidently worries that the FBI will keep whatever tool Apple creates, and use it to break into other iPhones in other cases. Those cases are most likely not always legit, and there is a further risk of the tool getting into the hands of others.

        • by c ( 8461 )

          On one hand, Shamir is really good at math. But math has almost nothing to do with Constitutional law, which is what this is really about.

          US Constitutional law, specifically. Shamir is an Israeli, so it's natural to expect that he's going to balance the rights of individual Americans versus the state a wee bit differently than those who have to live with the consequences.

        • It's not just about Constitutional law, but also about security principles. My sense of his argument is that he's saying, "You may want to put up a fight some other time, but in this instance, you know that they're guilty, so you should do what the government is asking." In other words, "Security is important, but in this one instance the person should probably be authorized, so build a new backdoor into the system for that person that allows them to circumvent your security protocols."


      • Re:What a crock (Score:5, Insightful)

        by Solandri ( 704621 ) on Friday March 04, 2016 @01:04AM (#51634647)
        Yeah, because MGM vs. Grokster [wikipedia.org] and Capitol Records vs. Thomas-Rasset [wikipedia.org] did so much to advance the cause of filesharing rights vs IP law.

        You want a good case which burns away irrelevant fluff and centers the issue solely on the principles at stake. If you pick a bad case, the court could decide against you based on that fluff, inadvertently setting a precedent which influences the balance of those principles. The principles at stake in this case (or what people are hoping this case will be about) are an individual's right to privacy and a company's right of self-determination against government coercion, vs. the government's duty to keep society safe. Here are the flaws I can see with this case:
        • Privacy rights and the 4th Amendment aren't relevant. It's not the shooter's phone. The phone belongs to the San Bernardino County government.
        • The shooter was indisputably guilty of the crime. You want the test case to highlight how the power the government is asking for could be abused, not one which validates the government's argument. That way the rights violation is real while the benefit the government is arguing for is hypothetical. Not the other way around.
• Even if you argue that the shooter's privacy is somehow relevant, he's dead. It's questionable if, or to what extent, privacy rights survive death. If we're going to have a test case about privacy rights after death, I'd rather it be about an innocent guy wrongly accused by the government whose reputation was consequently smeared. Not some guy who was indisputably guilty.
        • The shooter was a terrorist, and his victims were innocent. I wish this weren't a factor, but it is. The best way to get a guy off a murder charge is to convince the jury that the victim deserved to die. The polls showing a slim majority of Americans supporting the FBI in this wouldn't be coming out that way if this were the FBI asking Apple to help it break into some grandmother's phone because she might have poisoned an axe murderer who hacked her grandkids to bits.
        • The strongest argument supporting Apple in this case is that the government cannot coerce an individual or company to do something against their will. Well, the exceptions to that in general law are pretty much all tied to a state of war or national emergencies. People can be drafted into military service. Stores' inventories can be confiscated for redistribution as the government sees fit. This being a terrorism case comes uncomfortably close to meeting that criterion.

        Waiting for a future, better case would sure end up looking foolish when the government argues, "What's the problem? You agreed to do this exact same thing before, in the San Bernardino case..."

        All the points I listed above can be used to refute that argument. That's why this is a bad case. Heck, even the recent New York case (defendant is a drug dealer, but he is the phone's owner, and he didn't kill anyone) is a better case.

Let's first see evidence that the FBI and San Bernardino County didn't deliberately destroy or hide evidence to force the whole issue. There are reports that SBC paid for Mobile Device Management software but didn't put it on their phones, supposedly because users could just remove it (which sounds bogus to begin with). With MDM software it would have been extremely easy to get at all the information on the phone.

          Anybody willing to blow a whistle here?

      • What this tells me is that being the "Godfather of Encryption" is not mutually exclusive with being a dunce on operational security.

My guess is he realizes Apple's argument is nonsensical gibberish.

"My data is only secure so long as someone does not create a tool after the fact to compromise it"... What kind of bullshit is that? Why is this technically ridiculous position worth defending politically in front of millions of people who are scared out of their minds of terrorists and who do not know jack about the underlying technology?

The FBI is laughing their asses off right now at the stupidity of the tech industry.

        • by AmiMoJo ( 196126 )

          Apple doesn't want there to be a precedent set that says the forced creation of software to crack their own phones is not an undue burden. If they lose then any future security improvements they make won't get them off the hook, the court will just say "you did this to yourself, now expend the necessary resources to undo it".

It's a shame they didn't do it properly in the first place so that their claim of not being able to unlock the device was actually true. Then there wouldn't be a risk of this happening.

        • Re:What a crock (Score:5, Insightful)

          by Shawn Willden ( 2914343 ) on Friday March 04, 2016 @08:56AM (#51635769)

          To wit the answer must be: "We re-architected our security hardware and no longer possess the *capability* to circumvent query limit and delay."

They probably already did this in the 5S and later. The 5C does not have the "secure enclave" chip, which means there is no secure hardware on the device; all of the security is implemented in software.

          Also, it should be pointed out that the signed software approach is quite secure against anyone who doesn't have the software signing keys, or the capability to compel the holder of the signing keys. That's actually just fine if the goal is to defend the data from access by private snoops (e.g. spouses, even technically sophisticated ones, corporate espionage, etc.) and criminals.

          Consumer devices will never be secure against state actors with unlimited funds and sufficient motivation (e.g. NSA, GCHQ, Mossad, etc.), so really the only cases where the approach is inadequate are cases where (a) the owner of the signing keys (Apple) wants the data or (b) a government with the power to compel the owner of the signing keys wants the data. Apple has no reason to prevent their own ability to circumvent (though they do need to protect against use of this ability by rogue employees), since they know their financial interest is strongly on the side of securing the data, and legitimate companies generally do not expend effort on securing data against law enforcement officials of democratic regimes that have due process and rule of law.

          Indeed, although the 5S and the 6-series probably do make it impossible for Apple to comply with similar demands for those devices, I really doubt that was the rationale for adding the security chip. I'm the lead engineer for similar components on Android, and while I've been pushing to include a secure element chip for some time, the rationale has never been to keep US courts from being able to compel access, it's always been about strengthening security against corporate espionage (which signed software solutions do address, but not completely) and to make penetration costlier for oppressive governments and intelligence services. I say "costlier" since they can't really be kept out completely.

          I'm not certain that the secure enclave actually keeps Apple from being able to comply with this sort of request, either. I expect that the software in that device is also field-upgradable, since there are compelling practical and security reasons for enabling upgrades. Bugs are always a risk, and being able to fix them is a really good thing. But if the software can be upgraded, then it can be "upgraded" to remove security features. This can be limited in various ways; it's common, for example, to have secret keys burned into hardware which simply cannot be extracted by software because the software never has direct access to them, and any security that derives from the secrecy of such keys can't be subverted by software changes. But brute force mitigation possibly can be upgraded away, even with the secure enclave chip.

          The bottom line here is that these are really hard engineering problems. Not that it's hard to design so that key components are non-updatable... that's easy. But it's also very risky, because it leaves you without any options when said components turn out to have problems. I think it's flat wrong to characterize Apple engineers' failure to secure the 5C against Apple as any kind of incompetence, which is your clear implication.
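A rough Python sketch of the hardware-entangled design described above (everything here -- names, iteration count, key sizes -- is an illustrative assumption, not Apple's actual implementation):

    import hashlib, os

    # Stand-in for a per-device secret fused into silicon at manufacture.
    # Real hardware lets software *use* this key but never read it out.
    DEVICE_UID = os.urandom(32)

    def derive_data_key(pin: str) -> bytes:
        # Entangle the user's PIN with the device-unique key. Because
        # DEVICE_UID cannot be extracted, every guess must be made on this
        # device, at whatever rate its software chooses to allow.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(),
                                   DEVICE_UID, 200_000)

    data_key = derive_data_key("1234")

The guess-rate limits and the wipe-after-ten-failures rule then exist only as software policy layered on top of this, which is exactly the part a signed "upgrade" could plausibly strip away.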

My gut reaction is to agree with you. But this is a constitutional issue, a legal issue, and this may not be the incident to build a case on. Although, if we win on a case like this, where there is no doubt about guilt and the individuals are dead, it would be close to an iron-clad precedent.

I can see the point he's making: that this particular case is not the best one to put all your chips on.
His contributions to encryption and privacy cannot be overstated, but it seems everyone has their faux pas moments. The implications of this approach are a bit similar to those of torture; it may be used "just this once," but then again, you can only sell the core values of a democratic society once.

    • by Greyfox ( 87712 )
Perhaps you'd rather listen to Darrell Issa take NPR's David Greene [npr.org] to school on the subject?
    • So you didn't just fall off a potato wagon, either? Amazing coincidence, that. :)

      It's not even "wrong"--it's *disingenuous* to a degree which resulted in near-instant coffee spatters on my monitor and a "HORSESHIT" tag for the story about 10 seconds later.

    • by Anonymous Coward

The government is not simply asking them to hand over the encryption keys, but to write and deploy code on its behalf. That would make Apple an agent of the government; if it can do that, it can make any company such an agent. What's to stop the government from commanding Apple or Microsoft to deploy code that allows them to listen through a computer's microphone? Or how about vendors of "smart" TVs: can the government command them to install cameras and microphones in all their new models?

    • Specifically:

      ...Apple has helped in countless cases...

is horseshit. This idiot should learn the basic facts before opening his trap. I read court filings on this just in the past few days (widely reported) that include the information that Apple has never provided this type of assistance.

      There are currently over a dozen cases where the government is attempting to use the All Writs Act to force Apple to do this sort of work; all of those are pending and under challenge or appeal.

Zero is not "countless".

    • At the next data security conference, let's poll the experts on this question. Let's see what percentage of FBI supporters we get.

  • I disagree (Score:5, Insightful)

    by jtownatpunk.net ( 245670 ) on Thursday March 03, 2016 @09:51PM (#51633743)

    Once the tool/method is created, it exists. Even if the tool never leaves Apple, they could be compelled to use the tool in future cases. Tool.

    • Re:I disagree (Score:5, Insightful)

      by Anonymous Coward on Thursday March 03, 2016 @09:56PM (#51633767)

      You're mistaken in your logic.

      The fact that the tool could exist means that the design of the iPhone is itself compromised, so they could be compelled, or somebody could unlawfully construct a version on their own.

      That means you need to concern yourself with a real secure design instead of fighting over this issue.

      Don't sacrifice a Queen to save a pawn.

      • Of course. Because once you snap your fingers and improved security is added to the next version of the product, it will magically propagate backwards through time to the millions of devices that are currently in use.

        • Of course. Because once you snap your fingers and improved security is added to the next version of the product, it will magically propagate backwards through time to the millions of devices that are currently in use.

You've never had a forced Windows update, bro? It's trivial to install a backdoor with an update.

Can't this problem be mostly circumvented by using a strong password instead of a 4-digit passcode that can be easily brute-forced? For day-to-day convenience, the password can still be bypassed with a fingerprint via TouchID.
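Back-of-the-envelope arithmetic bears this out. A quick sketch (the 80 ms per guess figure is an assumption drawn from public discussion of the iPhone's key-derivation cost, not a measured value):

    # Worst-case exhaustive search at one hardware-bound guess per 80 ms.
    SECONDS_PER_GUESS = 0.08    # assumed key-derivation cost per attempt

    def worst_case_days(keyspace: int) -> float:
        return keyspace * SECONDS_PER_GUESS / 86_400

    print(worst_case_days(10 ** 4))    # 4-digit PIN: ~13 minutes
    print(worst_case_days(62 ** 10))   # 10-char alphanumeric: ~8e11 days

A four-digit PIN falls in minutes once the retry limits are gone; a decent passphrase doesn't, even with a hypothetical unlock firmware in hand.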
    • But the tool wouldn't work on newer iPhones. I think maybe what the guy is trying to say is that it would make a better test case if Apple truly had no way into the phone.
      • by hawguy ( 1600213 )

        But the tool wouldn't work on newer iPhones. I think maybe what the guy is trying to say is that it would make a better test case if Apple truly had no way into the phone.

        how would that be a case at all?

        FBI: Apple, we need you to unlock this phone.

        Apple: We can't.

        FBI: No really, do it. Babies might die and stuff.

        Apple: No, really, we can't, here's why.

        FBI: Oh ok. Well you shouldn't have built a baby killing phone.

    • Re:I disagree (Score:5, Insightful)

      by Daniel Matthews ( 4112743 ) on Thursday March 03, 2016 @11:05PM (#51634137)
The capability to create it already exists at Apple, so if they make it, use it for this case, and destroy it afterwards, you just end up back at the start, where the capability to create it exists and they are no more or less likely to be coerced into doing it by any other party. They shot themselves in the foot when they indicated so publicly that they could, but would not, do it. Furthermore, if you claim that they can't make and then securely destroy such tools, you are also claiming that they can't securely do anything and the iPhone isn't secure. You can't have it both ways.

Anyway, the phone can be cracked; it would just cost a lot more money than if Apple did it. All Apple has to do is take the phone into a clean room with the equipment needed, get the codes, hand them and the phone back to the FBI, and then completely destroy the contents of the clean room. Apple just does not want to do it because it devalues their product in the eyes of those who have secrets and naively believe that no other party can crack the iPhone.
      • Re:I disagree (Score:5, Insightful)

        by bitingduck ( 810730 ) on Friday March 04, 2016 @12:04AM (#51634429) Homepage

        The capability to create it already exists at Apple and so if they do make it, use it for this case and destroy it afterwards, you just end up back at the start, where the capability to create it exists and they are no more or less likely to be coerced into doing it by any other party.

That shows a profound misunderstanding of how the US legal system works. Once they've done it, the probability of them being coerced again is identically 1. If the gov't is allowed to compel them to produce software, and especially to produce particular features, then the government can:
a) repeatedly compel them to recreate the software to crack existing phones that can be cracked by that method. Then Apple effectively has to either maintain a team to keep recreating and destroying the software (good luck hiring people who want that job; seriously tiresome) or keep the software intact and protect it. But they can't do that, because once it gets used in an actual criminal prosecution, the defendant will have the right to see the software. And every defendant it's used against will have that right. So then it's out.
b) compel them to create a permanent backdoor in all future versions (the precedent for government compulsion of particular features having been established, despite CALEA's wording to the contrary). And they can do it secretly through the FISA court, and it will be 5-10 years before we hear about it publicly. In the meantime, people will find the holes and exploit them (aside from the NSA and FBI exploiting them).

        The technical possibility of that particular phone being hackable by sideloading a custom system is almost irrelevant to the case. It's the legal precedent that's important.

    • Once the tool/method is created, it exists. Even if the tool never leaves Apple, they could be compelled to use the tool in future cases. Tool.

      If Apple really cared about security they would do the following:

      1. Admit they screwed up, announce their system is vulnerable and apologize to all affected users.

2. Fix the vulnerability so that the security of users' systems no longer hinges on whether a circumvention tool exists.

Customers should demand that security live up to the advertising and stand on its own, without unnecessary conditions.

This particular fight is suicide for those who care about security and privacy, for obvious political reasons.

  • needs to be beaten to a bloody pulp.
The case is in front of a former AUSA (i.e. lots of experience on the government side), but she went to Williams College for undergrad, which means she's probably one of the more intelligent federal judges--making her likely to read and understand the tech industry's briefs. (About half of federal judges are really smart and went to top schools; the other half may not be as smart but have been successful politically. They all have a good measure of experience.)

  • by penguinoid ( 724646 ) on Thursday March 03, 2016 @09:57PM (#51633773) Homepage Journal

    If you comply once, then you greatly weaken any objections to complying again.

    • by mysidia ( 191772 )

      If you comply once, then you greatly weaken any objections to complying again.

      If Apple wants to strengthen their objections; they should probably initiate a process of rotating/replacing their code signing keys on new/existing devices with an emergency software update.

      Then once the vast majority of devices have updated, initiate the process of expeditiously destroying the previous key material.

At that point, they will be incapable of signing a custom firmware which the old devices will recognize.
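A toy Python sketch of that rotation idea, using Ed25519 from the third-party "cryptography" package (a real secure-boot chain is far more involved; all names here are illustrative):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    old_key = Ed25519PrivateKey.generate()
    new_key = Ed25519PrivateKey.generate()

    # A device accepts firmware only if it verifies against a trusted key.
    trusted = [old_key.public_key()]

    def accepts(firmware: bytes, signature: bytes) -> bool:
        for pub in trusted:
            try:
                pub.verify(signature, firmware)
                return True
            except InvalidSignature:
                continue
        return False

    # Emergency update, signed with the old key, rotates the trust root.
    update = b"firmware v2: trust the new key only"
    assert accepts(update, old_key.sign(update))
    trusted[:] = [new_key.public_key()]

    # After the old private key is destroyed, no later order can conjure
    # a firmware that *updated* devices will accept under it.
    unlock = b"custom unlock firmware"
    assert not accepts(unlock, old_key.sign(unlock))

The catch, as the reply below notes, is that destroying key material after an order has landed looks like contempt rather than cleverness.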

Only if they want a contempt of court charge. While they are allowed to challenge orders, they are not allowed to take steps which would prevent them from doing what the court has ordered now that they are subject to a court order.

        If you routinely destroy all correspondence after x days, you won't be in contempt if the court asks you for something that has been destroyed. If you destroy it after you have been asked for it you will be.

  • by BarbaraHudson ( 3785311 ) <<barbara.jane.hudson> <at> <icloud.com>> on Thursday March 03, 2016 @09:59PM (#51633789) Journal
    The perps are dead, so there is no need to decrypt their phones in that case.

    This is just fishing for information on other people, and it's pretty naive, since they destroyed 2 other phones. Would you use your company phone to plan a terrorist act?

    Also, the phone isn't Apple's property. Let them go after the entity that owns the phone.

  • It has nothing to do with placing trapdoors on millions of phones around the world...

    Yes, actually it does. And here's why:

If Apple goes ahead and does this, what happens to the code after the FBI has used it? What guarantees can possibly be made that the code will not get leaked? (If recent events have taught us anything, it is that secrets eventually get discovered.) If Apple develops this code, and it should *EVER* make its way outside of Apple, some particularly tenacious individual could turn it against every other phone of the same design.

    • If Apple goes ahead and does this, what happens to the code after the FBI has used it?

It doesn't matter what happens to it. It doesn't matter if the FBI then deletes it. It doesn't matter if the FBI then keeps it. It doesn't even matter if the FBI puts it on The Pirate Bay later so that every hacker has it.

Asking questions is not an argument. For instance, "If Apple doesn't go ahead and do this, is this shitty insecure phone more secure?"

      The problem remains that the devices are not secure. Full stop.

  • "Shoot to kill, sir? Are you sure?"
    "Orders from on high."
    "But then we can't grill em for more info"
    "Yeah, but we'd be able to access encrypted iphone inpho"
    The plot sickens.

  • Cool! Now, where is the link to his statement?

    "The FBI should be given the method to crack iPhones" - Abraham Lincoln

• There is ample and conspicuous proof that the US government cannot hold information securely. Given a method to break into Apple phones, it is quite likely that the information would be exposed to the world and ruin the sales of Apple phones. Further, any defense team would have to be allowed to hire experts to opine on whether a real decoding had taken place and whether the decoding was accurate, or to what degree errors occurred. Over time, that means that quite a few people would know the method.
• Apple should stand firm for the EXACT reasons listed. This case is very much slanted towards the FBI.

    If Apple wins, they'll set the strongest possible precedent. If they lose, deniability is built in: we lost because the case was so obviously slanted, we'll bring a case to court again if a less slanted case shows up.

  • What he said is precisely why the FBI picked this one. I think he was plenty smart in his youth, but now Shamir is just another silly old fool. Does he seriously think that the FBI won't use every wedge issue to outlaw encryption? After all, wanting to have ANY secret from the government PROVES you're up to no good.

    When encryption is outlawed, only outlaws will have encryption. By circular definition.

• One thing I find confusing is that everyone talks of forcing Apple to make a new version. But it isn't Apple but Apple's engineers doing this work. Does the court have the power to tell Apple to fire its operating system developers if they don't comply?

  • The shooter destroyed everything with evidence on it. This phone was untouched. Guess why.
• The people killed are not guilty; they were killed before a court could adjudicate the matter. In the USA you are innocent until proven guilty in a court of law, by a jury of your peers.

• Aside from breaching physical security (that is, shaving down the chips): if Apple can't do it, they should say so. However, if Apple can do it (and it looks like they can), then they should do it (and then build an even stronger phone).

So millions of people bought phones that were secure only to a certain level - well, tough, that's just how it is. Purposeful breaking of security is a must when it comes to designing security. Plus, millions of people don't expect to have an ultimately secure phone either.

    • +1 so damn obvious

Remember the first round of spin on this topic. Everyone who now defends Apple with the self-determination argument was, a week ago, defending Apple on the "create a backdoor" argument. Their argument changes (because facts got in the way), the goalposts shift (damn facts), but their conclusion remains the same (well, how about that).

The phone is insecure. Its entire security rests on a number between 0 and 9999. Don't give Apple a pass here. They don't deserve the pass, and playing make-believe doesn't change that.
• At the risk of making a suggestion that may already have been beaten to death and shown impractical: is it possible to expose the logic board without removing power, and drop sockets onto the SDRAM and NAND memory to enable reading out the contents? Write the contents into a suitably configured iOS emulator, and thereby get as many brute-force PIN guesses as you need?
• Well, crap, shoulda/woulda/coulda done a bit more reading on technical sites before the brain dump: reading out the memory without the hardware key means the Bureau's lab would be trying to brute-force a 256-bit AES key instead of a 4-digit PIN. Never mind, bring on the panopticon.
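The difference in scale deserves numbers. A sketch (the guess rate is a deliberately absurd overestimate, for illustration):

    # On-device, the search space is the PIN; off-device, it's the AES key.
    pin_space = 10 ** 4
    aes_space = 2 ** 256

    # Grant the lab a fantasy rig doing a trillion guesses per second:
    rate = 10 ** 12
    seconds_per_year = 3600 * 24 * 365

    print(pin_space / rate)                     # PIN: effectively instant
    print(aes_space / rate / seconds_per_year)  # AES key: ~3.7e57 years

Which is why the FBI wants Apple's signed firmware and not a chip lab: the hardware key reduces the problem from 2^256 to 10^4, but only from inside the device.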
• Apparently he hasn't actually bothered to educate himself about what Apple did do and what the FBI is asking for in addition. A little education goes a long way. In Shamir's case there appears to be none applied.

• The guy had the gun and killed the people... how much more proof do they need? The phone didn't have anything to do with killing people. Why do they need to get in?
  • Isn't it one of the immutable laws of security that physical possession means the device is owned? Apple is trying to make this not so immutable.
