Godfather Of Encryption Explains Why Apple Should Help The FBI (bgr.com) 293
An anonymous reader writes: Famed cryptographer and Turing Award winner Adi Shamir has an interesting, if not surprising, take on Apple's current legal tussle with the FBI. Speaking on a panel at RSA Conference 2016 earlier this week, the man who helped co-invent the vaunted RSA algorithm (he's the 'S' in RSA) explained why he sides with the FBI as it pertains to the San Bernardino shooter's locked iPhone. "It has nothing to do with placing trapdoors on millions of phones around the world," Shamir explained. "This is a case where it's clear those people are guilty. They are dead; their constitutional rights are not involved. This is a major crime where 14 people were killed. The phone is intact. All of this aligns in favor of the FBI." Shamir continued, "Even though Apple has helped in countless cases, they decided not to comply this time. My advice is that they comply this time and wait for a better test case to fight, where the case is not so clearly in favor of the FBI."
What a crock (Score:5, Insightful)
What a crock full of shit.
Re:What a crock (Score:5, Insightful)
And from someone who really ought to know better.
What this tells me is that being the "Godfather of Encryption" is not mutually exclusive with being a dunce on operational security.
Waiting for a future, better case would sure end up looking foolish when the government argues, "What's the problem? You agreed to do this exact same thing before, in the San Bernardino case..."
Re:What a crock (Score:5, Funny)
That's only one opinion. Ron Rivest, the R in RSA, disagrees with Shamir over this one. As the S in the abbreviation, Shamir is the man in the middle.
Re: (Score:2, Funny)
That's only one opinion. Ron Rivest, the R in RSA, disagrees with Shamir over this one. As the S in the abbreviation, Shamir is the man in the middle.
So you're saying that this is a man-in-the-middle attack?
Re:What a crock (Score:5, Insightful)
And from someone who really ought to know better.
What this tells me is that being the "Godfather of Encryption" is not mutually exclusive with being a dunce on operational security.
Waiting for a future, better case would sure end up looking foolish when the government argues, "What's the problem? You agreed to do this exact same thing before, in the San Bernardino case..."
Should he know better? I'm not sure. On one hand, Shamir is really good at math. But math has almost nothing to do with Constitutional law, which is what this is really about.
There's a big difference between who can create/build a certain technology, and who should be trusted with knowing how and when to use it. A lot of people conflate the two, but they are incredibly different skillsets.
Re: What a crock (Score:3, Insightful)
More importantly, knowledge of math has nothing to do with right and wrong.
Re: What a crock (Score:2)
But the math does say you can build a secure phone where only the owner has the key. Ask anyone who's ever lost a bitcoin wallet. Apple chose to retain that key, making it a political issue, not a technical one.
Re: (Score:3, Insightful)
But the math does say you can build a secure phone where only the owner has the key.
No, the math says the phone is secure if only the owner has the key.
Apple chose to retain that key, making it a political issue, not a technical one.
Again, not quite. Apple wrote the operating system that allows the owner sole access to the key, and they can rewrite the OS to violate that exclusivity. Whether they can be forced to retroactively modify their OS to expose their customers' private data is the political issue.
Re: (Score:2)
Should he know better? I'm not sure. On one hand, Shamir is really good at math. But math has almost nothing to do with Constitutional law...
This involves primarily two things: philosophical identity, and equals/not-equals. Both are taught to mathematicians. Identity is simply the concept that a thing exists. We're talking about an abstract thing, specifically the set of actions that together are the "help" the FBI wants from Apple. Then we have equals/not-equals. How many times has the "help" the FBI is asking for been done before? None.
It is exactly this counting that he no longer comprehends. I recommend he get a brain scan to check for tum
Re: (Score:3)
"The notion that this is something only about opening one lock or that there is some degree of locks that can't be opened with the tool that they're asking us to create, is a misnomer," Sewell added.
Apple evidently worries that the FBI will keep whatever tool Apple creates, and use it to break into other iPhones in other cases. Those cases are most likely not always legit, and there is a further risk of the tool getting into the hands of others. At
Re: (Score:3)
US Constitutional law, specifically. Shamir is an Israeli, so it's natural to expect that he's going to balance the rights of individual Americans versus the state a wee bit differently than those who have to live with the consequences.
Re: (Score:3)
It's not just about Constitutional law, but also about security principles. My sense of his argument is that he's saying, "You may want to put up a fight some other time, but in this instance, you know that they're guilty, so you should do what the government is asking." In other words, "Security is important, but in this one instance the person should probably be authorized, so build a new backdoor into the system for that person that allows them to circumvent your security protocols."
If this guy is any
Re:What a crock (Score:5, Insightful)
You want a good case which burns away irrelevant fluff and centers the issue solely on the principles at stake. If you pick a bad case, the court could decide against you based on that fluff, inadvertently setting a precedent which influences the balance of those principles. The principles at stake in this case (or what people are hoping this case will be about) are an individual's right to privacy and a company's right of self-determination against government coercion, vs. the government's duty to keep society safe. Here are the flaws I can see with this case:
All the points I listed above can be used to refute that argument. That's why this is a bad case. Heck, even the recent New York case (defendant is a drug dealer, but he is the phone's owner, and he didn't kill anyone) is a better case.
Re: (Score:2)
Anybody willing to blow a whistle here?
Re: (Score:3)
What this tells me is that being the "Godfather of Encryption" is not mutually exclusive with being a dunce on operational security.
My guess is he realizes Apple's argument is nonsensical gibberish.
My data is only secure so long as someone does not create a tool after the fact to compromise it... What kind of bullshit is that? Why is this technically ridiculous position worth defending politically in front of millions of people scared out of their minds about terrorists and who do not know jack about the underlying technology?
The FBI is laughing their asses off right now at the stupidity of the tech industry. Apple is succeeding in re-railing
Re: (Score:2)
Apple doesn't want there to be a precedent set that says the forced creation of software to crack their own phones is not an undue burden. If they lose then any future security improvements they make won't get them off the hook, the court will just say "you did this to yourself, now expend the necessary resources to undo it".
It's a shame they didn't do it properly in the first place so that their claim of not being able to unlock the device was actually true. Then there wouldn't be a risk of this happening
Re:What a crock (Score:5, Insightful)
To wit, the answer must be: "We re-architected our security hardware and no longer possess the *capability* to circumvent the query limit and delay."
They probably already did this in the 5S and later. The 5C does not have the "secure enclave" chip, which means there is no secure hardware on the device, all of the security is implemented in software.
Also, it should be pointed out that the signed software approach is quite secure against anyone who doesn't have the software signing keys, or the capability to compel the holder of the signing keys. That's actually just fine if the goal is to defend the data from access by private snoops (e.g. spouses, even technically sophisticated ones, corporate espionage, etc.) and criminals.
Consumer devices will never be secure against state actors with unlimited funds and sufficient motivation (e.g. NSA, GCHQ, Mossad, etc.), so really the only cases where the approach is inadequate are cases where (a) the owner of the signing keys (Apple) wants the data or (b) a government with the power to compel the owner of the signing keys wants the data. Apple has no reason to prevent their own ability to circumvent (though they do need to protect against use of this ability by rogue employees), since they know their financial interest is strongly on the side of securing the data, and legitimate companies generally do not expend effort on securing data against law enforcement officials of democratic regimes that have due process and rule of law.
Indeed, although the 5S and the 6-series probably do make it impossible for Apple to comply with similar demands for those devices, I really doubt that was the rationale for adding the security chip. I'm the lead engineer for similar components on Android, and while I've been pushing to include a secure element chip for some time, the rationale has never been to keep US courts from being able to compel access, it's always been about strengthening security against corporate espionage (which signed software solutions do address, but not completely) and to make penetration costlier for oppressive governments and intelligence services. I say "costlier" since they can't really be kept out completely.
I'm not certain that the secure enclave actually keeps Apple from being able to comply with this sort of request, either. I expect that the software in that device is also field-upgradable, since there are compelling practical and security reasons for enabling upgrades. Bugs are always a risk, and being able to fix them is a really good thing. But if the software can be upgraded, then it can be "upgraded" to remove security features. This can be limited in various ways; it's common, for example, to have secret keys burned into hardware which simply cannot be extracted by software because the software never has direct access to them, and any security that derives from the secrecy of such keys can't be subverted by software changes. But brute force mitigation possibly can be upgraded away, even with the secure enclave chip.
The bottom line here is that these are really hard engineering problems. Not that it's hard to design so that key components are non-updatable... that's easy. But it's also very risky, because it leaves you without any options when said components turn out to have problems. I think it's flat wrong to characterize Apple engineers' failure to secure the 5C against Apple as any kind of incompetence, which is your clear implication.
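For anyone unfamiliar with the signed-software gatekeeping described above, here's a minimal sketch of the acceptance logic. This is a toy: real iOS updates use asymmetric signatures, and the key and image names here are made up; a keyed HMAC stands in for the vendor's signing scheme, but the gatekeeping logic is the same — no valid signature, no install.

```python
import hashlib
import hmac

# Hypothetical stand-in for the vendor's private signing key.
SIGNING_KEY = b"hypothetical-vendor-signing-key"

def sign_firmware(image: bytes) -> bytes:
    """Only the holder of SIGNING_KEY can produce this value."""
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """The boot chain refuses any image whose signature doesn't verify."""
    expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

firmware = b"os-update-9.3"
sig = sign_firmware(firmware)
assert device_accepts(firmware, sig)                    # vendor-signed: runs
assert not device_accepts(firmware + b"-patched", sig)  # one byte changed: rejected
```

Which is exactly why the approach holds up against private snoops and criminals, but not against whoever holds the signing key or can compel its holder.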
Re: (Score:2)
I can see the point he's making: that this particular case is not the best one to put all your chips on.
Re:What a crock (Score:2)
His contributions to encryption and privacy cannot be overstated, but it seems everyone has their faux pas moments. The implications of this approach are a bit similar to those of torture; it may be used "just this once", but then again, you can only sell out the core values of a democratic society once.
Re: (Score:2)
So you didn't just fall off a potato wagon, either? Amazing coincidence, that. :)
It's not even "wrong"--it's *disingenuous* to a degree which resulted in near-instant coffee spatters on my monitor and a "HORSESHIT" tag for the story about 10 seconds later.
Re:here's why it's a crock (Score:2, Insightful)
The government is not simply asking them to hand over the encryption keys, but to write and deploy code on its behalf. That would make Apple an agent of the government; if it can do that, it can make any company such an agent. What's to stop the government from commanding Apple or Microsoft to deploy code that allows them to listen through a computer's microphone? Or how about vendors of "smart" TVs: can the government command them to install cameras and microphones in all their new models, which th
Re: (Score:3)
Specifically:
is horseshit. This idiot should learn the basic facts before opening his trap. I read court filings on this in just the past few days (widely reported) that include the information that Apple has never provided this type of assistance.
There are currently over a dozen cases where the government is attempting to use the All Writs Act to force Apple to do this sort of work; all of those are pending and under challenge or appeal.
Zero is not "countless" even fo
Re: (Score:2)
At the next data security conference, let's poll the experts on this question. Let's see what percentage of FBI supporters we get.
Re: (Score:3, Insightful)
Only if you're an idiot. It's only about the one phone until precedent is made.
Yep. (Score:5, Insightful)
The ability to spy on law-abiding citizens grants an enormous economic advantage to those in power. They can do the financial equivalent of insider trading with impunity, and rake it in. That is just one way of many.
Strong encryption gives the poor some leverage against the rich; a chance to reclaim and protect some of the wealth that they generate through their labor. I guarantee, the rich will never abide this. Even if Apple wins, subsequent political and technological maneuvering will ultimately result in strong encryption available to the rich, but not to the rest of us.
This case is a lot like the presidential election: no matter who wins, we lose.
Re: What a crock (Score:5, Interesting)
Only if you're an idiot. It's only about the one phone until precedent is made.
This.
And Apple's statement to this effect has already been proven true...for there are multiple cases where the FBI has asked for "just this one phone" to be unlocked in this manner. There are literally more than a dozen parallel efforts, in addition to this one particularly high-profile one, to get this to happen.
Re: What a crock (Score:5, Interesting)
Shamir is also being disingenuous when he said, "even though Apple has helped in countless cases, they decided not to comply this time." Apple's cooperation in the prior cases was in recovering unencrypted data. They have never provided a way to decrypt data when they don't have the keys, or recover keys locked in the secure enclave.
Re: (Score:2)
FBI is saying to Apple "Just the tip baby, I swear".
Re: (Score:3)
They're currently asking for this help on over a dozen iPhones; it is only one phone per case, but it is not and never was about a single phone, even before any new precedent. Indeed, this week's ruling from NY went into that and the inaccuracy of the claim that it only involves one phone.
Re: What a crock (Score:5, Insightful)
No, it's about the other 14 iPhones the FBI has lined up behind it, and the 175 iPhones New York city has after those, and so on.
Re: What a crock (Score:5, Informative)
The FBI is willing to let APL control everything in this particular case:
Wrong.
Look up the actual court order.
The text of the court order instructs APL to place the tool on a hard drive and give it to the FBI to use.
Are you stupid or are you a shill?
Of course the two are not mutually exclusive, and as often as not, correlate strongly. Particularly when the shills work for government.
Strat
Re: (Score:3)
One day, just think from the victims' families' point of view. It's just about one iPhone's data.
You say that as if there wouldn't be victims of government overreach.
Re: (Score:3)
"‘Take this tool and put it on a hard drive, send it to the FBI,’ and they’d load it onto their computer,” http://nypost.com/2016/03/02/f... [nypost.com] (March 2, 2016)
Re: (Score:3)
That article (and theory) is wrong. Apple could easily write a very small patch that would run on 1 phone (tied to the unique ID). And then just like every other OS update, it's signed by Apple's secret key. If anyone changes 1 or more bytes, the phone won't accept the patch. It will never run on any other phone.
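To make that concrete, here's a rough sketch of how a patch can be cryptographically tied to one phone's unique ID. Everything here is invented for illustration (the key, the UID strings, and the HMAC standing in for Apple's real asymmetric signing): the point is just that the UID is folded into the signed message, so the blob is worthless on any other device.

```python
import hashlib
import hmac

# Hypothetical stand-in for the vendor's private signing key.
SIGNING_KEY = b"hypothetical-vendor-signing-key"

def sign_for_device(patch: bytes, device_uid: str) -> bytes:
    # The device UID is part of the signed message, so the resulting
    # signature is only valid for that one phone.
    return hmac.new(SIGNING_KEY, device_uid.encode() + patch, hashlib.sha256).digest()

def device_accepts(patch: bytes, sig: bytes, my_uid: str) -> bool:
    # Each phone verifies against its own UID; a blob signed for another
    # UID (or altered by even one byte) fails the check.
    expected = hmac.new(SIGNING_KEY, my_uid.encode() + patch, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

patch = b"disable-retry-limit"
sig = sign_for_device(patch, "UID-TARGET-PHONE")
assert device_accepts(patch, sig, "UID-TARGET-PHONE")   # the one phone: accepted
assert not device_accepts(patch, sig, "UID-ANY-OTHER")  # any other phone: rejected
```

The binding itself is purely technical; whether Apple can be compelled to sign a second blob for a second UID is the separate, legal question.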
What I think you and many others fail to understand is that once Apple signs the firmware for the San Bernardino phone, they have set a precedent, and the TLAs will demand that they do the same over and over again for any number of other phones. Or even for a general firmware patch pushed through to all users.
If Apple no longer can choose what they sign and don't sign, it's the feds running the show. It's as bad as handing the feds the signing key, but even a bit worse from Apple's point of view, as the
Re: (Score:3)
One day, just think from the victims' families' point of view.
You mean the families that ALSO say it is wrong for the FBI to have the ability to decrypt iPhones the world over?
http://news.slashdot.org/story... [slashdot.org]
it's just about one iPhone's data
Repeating a lie, that you are damn well aware is a lie, over and over will not make it any more true.
Re: (Score:3)
Not to mention that the victims are dead, we already know who killed them, the killers themselves are also dead, and cracking an iPhone is not going to bring any of them back to life.
Re: (Score:2)
Re: What a crock (Score:4, Interesting)
So what is that NSA meta data program for then?
Re: (Score:2)
Harsh, but true, nevertheless.
Re: (Score:2)
The victims' families appear to fall within the set of currently living people, and therefore, according to your position, they matter too.
How do those families benefit by viewing whatever data is stored on the phone? Or maybe better: what data could be stored on the phone that would benefit those 14 families to a greater extent than the harm done to 700,000,000 iPhone owners?
Stored data I imagine might include a manifesto, might include the text of messages exchanged with co-conspirators, might include a map to a buried nuclear bomb set to go off in 24 hours. A manifesto is not very useful. Potential co-conspirators can already be identifi
Re:What a crock (Score:5, Interesting)
Well, to be accurate, he is on the payroll of the Weizmann Institute of Science https://en.wikipedia.org/wiki/... [wikipedia.org]. So rather than being on the FBI payroll, he is more likely on the Mossad payroll, and he actually spends most of his time specialising in differential cryptanalysis: "Differential cryptanalysis is a general form of cryptanalysis applicable primarily to block ciphers, but also to stream ciphers and cryptographic hash functions. In the broadest sense, it is the study of how differences in information input can affect the resultant difference at the output. In the case of a block cipher, it refers to a set of techniques for tracing differences through the network of transformations, discovering where the cipher exhibits non-random behaviour, and exploiting such properties to recover the secret key." https://en.wikipedia.org/wiki/... [wikipedia.org] i.e., breaking encryption. So yeah, a self-serving dick wants to make his life easier, so basically, whatever he says, do the fucking opposite and do not trust anything from him.
Re: (Score:2)
Well to be accurate he is on the payroll of Weizmann Institute of Science [...]
History, it seems, is not without a sense of irony. [wpi.edu] My, how things have changed in the last 30 years.
Re: (Score:2)
He's Alice, she's Bob.
I disagree (Score:5, Insightful)
Once the tool/method is created, it exists. Even if the tool never leaves Apple, they could be compelled to use the tool in future cases. Tool.
Re:I disagree (Score:5, Insightful)
You're mistaken in your logic.
The fact that the tool could exist means that the design of the iPhone is itself compromised, so they could be compelled, or somebody could unlawfully construct a version on their own.
That means you need to concern yourself with a real secure design instead of fighting over this issue.
Don't sacrifice a Queen to save a pawn.
Re: (Score:2)
Of course. Because once you snap your fingers and improved security is added to the next version of the product, it will magically propagate backwards through time to the millions of devices that are currently in use.
Re: (Score:2)
Of course. Because once you snap your fingers and improved security is added to the next version of the product, it will magically propagate backwards through time to the millions of devices that are currently in use.
You never had a Windows forced update, bro? Trivial to install a backdoor with an update.
Re: (Score:3)
But the tool wouldn't work on newer iPhones. I think maybe what the guy is trying to say is that it would make a better test case if Apple truly had no way into the phone.
how would that be a case at all?
FBI: Apple, we need you to unlock this phone.
Apple: We can't.
FBI: No really, do it. Babies might die and stuff.
Apple: No, really, we can't, here's why.
FBI: Oh ok. Well you shouldn't have built a baby killing phone.
Re:I disagree (Score:5, Insightful)
The capability to create it already exists at Apple and so if they do make it, use it for this case and destroy it afterwards, you just end up back at the start, where the capability to create it exists and they are no more or less likely to be coerced into doing it by any other party.
That shows a profound misunderstanding of how the US legal system works. Once they've done it, then the probability of them being coerced again is identically 1. If the gov't is allowed to compel them to produce software, and especially produce particular features, then the government can:
a) repeatedly compel them to recreate the software to crack existing phones that can be cracked by that method. Then Apple effectively has to either maintain a team to keep recreating and destroying the software (good luck hiring people who want that job; seriously tiresome) or keep the software intact and protect it. But they can't do that, because once it gets used in an actual criminal prosecution, the defendant will have the right to see the software. And every defendant it's used against will have that right. So then it's out.
b) compel them to create a permanent backdoor in all future versions (the precedent for government compulsion of particular features having been established, despite CALEA's wording to the contrary). And they can do it secretly through the FISA court, and it will be 5-10 years before we hear about it publicly. In the meantime, people will find the holes and exploit them (aside from the NSA and FBI exploiting them).
The technical possibility of that particular phone being hackable by sideloading a custom system is almost irrelevant to the case. It's the legal precedent that's important.
Re: (Score:2)
Once the tool/method is created, it exists. Even if the tool never leaves Apple, they could be compelled to use the tool in future cases. Tool.
If Apple really cared about security they would do the following:
1. Admit they screwed up, announce their system is vulnerable and apologize to all affected users.
2. Fix the vulnerability so that the security of users' systems no longer hinges on whether a circumvention tool exists.
Customers should demand that security live up to its advertising and stand on its own without unnecessary conditions.
This particular fight is suicide for those who care about security and privacy for obvious political reasons. Neither does this m
Sometimes the dromedary's proboscis (Score:2, Troll)
The Judge... (Score:2)
The case is in front of a former AUSA (i.e. lots of experience on the government side), but she went to Williams College for undergrad, which means she's probably one of the more intelligent federal judges--making her likely to read and understand the tech industry's briefs. (About half of federal judges are really smart and went to top schools; the other half may not be as smart but have been successful politically. They all have a good measure of experience.)
Ultimately, of course, the case is likely to
There won't be a better test case (Score:5, Insightful)
If you comply once, then you greatly weaken any objections to complying again.
Re: (Score:2)
If you comply once, then you greatly weaken any objections to complying again.
If Apple wants to strengthen their objections; they should probably initiate a process of rotating/replacing their code signing keys on new/existing devices with an emergency software update.
Then once the vast majority of devices have updated, initiate the process of expeditiously destroying the previous key material.
At that point, they will be incapable of signing a custom firmware which the old device will recognize,
Re: (Score:2)
Only if they want a contempt of court charge. While they are allowed to challenge orders, they are not allowed to take steps that would prevent them from doing what the court has ordered now that they are subject to a court order.
If you routinely destroy all correspondence after x days, you won't be in contempt if the court asks you for something that has been destroyed. If you destroy it after you have been asked for it you will be.
Re:There won't be a better test case (Score:4, Interesting)
The summary is very misleading. Apple's compliance has only been in recovering unprotected data. They have never provided access into the Secure Enclave to recover the keys, and have never recovered data encrypted by those keys before.
The FBI hopes that by whipping up national hatred for these mass murderers it will spark a public outcry in favor of forcing vendors to provide defective encryption, U.S. government access to escrow keys, or other back door. Many Americans have been taught by the fear-mongers running the talk radio business to be so craven that they'll agree to any violation of anyone's rights because 'terrorists'.
Re: (Score:2)
Not necessarily. It's one person's opinion. This is why there is an appeals system.
Totally BS argument. (Score:5, Insightful)
This is just fishing for information on other people, and it's pretty naive, since the shooters destroyed their two other phones. Would you use your company phone to plan a terrorist act?
Also, the phone isn't Apple's property. Let them go after the entity that owns the phone.
Re: (Score:2)
FYI, the current design is already the future design you are looking for. All models with Touch ID use hardware encryption and cannot be hacked with an OS update. The only reason Apple could hack the iPhone in question in the first place is because it is an older model.
Re: (Score:2)
There are laws that specifically prevent law enforcement from gathering firearm purchase data. If you really want to see a legal shitstorm, piss off the NRA by creating a de facto gun registry.
Comment removed (Score:5, Informative)
Re:Adi "IANAL" Shamir (Score:5, Insightful)
Maybe they actually do need to get into the phone, if so why did the FBI reset his account password so that even Apple could not get into his account and unlock his phone? They are using this specific case ("Oh my god Terrorrists!") because they are betting on public sentiment and morons like you to set a legal precedent that they can then use to beat the shit out of anyone who does not decrypt data for them. If you think the FBI are doing this for any other reason then you obviously ate wall candy as a baby (ie. lead paint, since I doubt you would figure it out).
He's not thinking of the big picture (Score:2, Insightful)
Yes, actually it does. And here's why:
If Apple goes ahead and does this, what happens to the code after the FBI has used it? What guarantees can possibly be made that the code will not get leaked? (if recent events have taught us anything, it is that secrets eventually get discovered) If Apple develops this code, and this code should *EVER* make its way outside of Apple, where some particularly tenacious individual
Re: (Score:2)
If Apple goes ahead and does this, what happens to the code after the FBI has used it?
It doesn't matter what happens to it. It doesn't matter if the FBI then deletes it. It doesn't matter if the FBI then keeps it. It doesn't even matter if the FBI puts it on The Pirate Bay later so that every hacker has it.
Asking questions is not an argument. For instance, "If apple doesn't go ahead and do this, is this shitty insecure phone more secure?"
The problem remains that the devices are not secure. Full stop.
What if the sequence of events is different? (Score:2)
"Shoot to kill, sir? Are you sure?"
"Orders from on high."
"But then we can't grill em for more info"
"Yeah, but we'd be able to access encrypted iphone inpho"
The plot sickens.
Charles Babbage weighed in? (Score:2)
Cool! Now, where is the link to his statement?
"The FBI should be given the method to crack iPhones" - Abraham Lincoln
Apple Is Right (Score:2)
He just made the case against ... (Score:2)
If Apple wins, they'll set the strongest possible precedent. If they lose, deniability is built in: we lost because the case was so obviously slanted, we'll bring a case to court again if a less slanted case shows up.
No fool like an old fool (Score:2)
What he said is precisely why the FBI picked this one. I think he was plenty smart in his youth, but now Shamir is just another silly old fool. Does he seriously think that the FBI won't use every wedge issue to outlaw encryption? After all, wanting to have ANY secret from the government PROVES you're up to no good.
When encryption is outlawed, only outlaws will have encryption. By circular definition.
No fool like a yank (Score:2)
Making a big deal about his "rights" while completely missing the bigger picture.
Engineers (Score:2)
One thing I find confusing is that everyone talks of forcing Apple to make a new version. But it isn't Apple but Apple engineers doing this work. Does the court have the power to tell Apple to fire its operating system developers if they don't comply?
"The phone is intact" (Score:2)
Not Guilty (Score:2)
The people killed were never found guilty; they were killed before the court could adjudicate the matter. In the USA you are innocent until proven guilty in a court of law, by a jury of your peers.
Re: (Score:2)
I kind-of agree with Shamir (Score:2)
Aside from physical security breaching (that is, shaving off the chips): if Apple can't do it, they should say so. However, if Apple can do it (and it looks like they can), then they should do it (and then build an even stronger phone).
So millions of people bought phones that were secure only to a certain level - well tough, that's just how it is. Purposeful breaking of security is a must when it comes to designing security. Plus, millions of people don't expect to have an ultimately secure phone either - t
Re: (Score:2)
Remember the first round of spin on the topic. Everyone that now defends Apple with the self-determination argument was a week ago defending Apple on the "create backdoor" argument. Their argument changes (because facts got in the way), the goalpost shifted (damn facts), but their conclusion remains the same (well how about that.)
The phone is insecure. Its entire security rests on a number between 0 and 9999. Don't give Apple a pass here. They don't deserve the pass, and playing make-bel
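For what it's worth, the arithmetic behind "a number between 0 and 9999" is easy to run. The ~80 ms per guess is the commonly cited hardware key-derivation time for these devices; treat it as an assumption.

```python
# Back-of-envelope brute-force estimate for a 4-digit passcode once the
# OS-level mitigations (escalating delays, wipe-after-10-failures) are gone.
KEYSPACE = 10_000   # passcodes 0000-9999
PER_TRY_S = 0.08    # ~80 ms per attempt (assumed hardware key-derivation time)

worst_case_s = KEYSPACE * PER_TRY_S
print(f"worst case: {worst_case_s:.0f} s (~{worst_case_s / 60:.1f} minutes)")
```

Under a quarter of an hour, worst case. Which is to say the OS-enforced mitigations, not the keyspace, are doing all the work.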
Why Not Read Out Memory? (Score:2)
Re: Why Not Read Out Memory? (Score:2)
Out of the loop (Score:2)
Apparently he hasn't actually bothered to educate himself about what Apple did do and what the FBI is asking for in addition. A little education goes a long way. In S's case, there appears to be none applied.
But wait (Score:2)
Immutable? (Score:2)
Re: (Score:2, Informative)
Yes:
http://www.cnet.com/news/security-firm-rsa-took-millions-from-nsa-report/
Re: (Score:2)
ad hominem attack.
The culture at the fbi has become tyrannical. They want to be the next KGB. I'll pass.
Re:What a bunch of ignorance (Score:5, Informative)
Why does everyone think Apple has to create anything new? They already have the ability to do what the FBI wants. It's not a backdoor, and it's not something they have to use on every phone; it's a simple code change to turn off the poison pill, and it can be pushed to this one single phone. In fact, it can be built specifically for this one phone and will only work on that phone. This fits how Apple already does updates: they don't push mass updates of apps and iOS to all phones. Each phone is unique and has its own nonce, and that's all Apple needs to tie this code to.
This isn't a technical issue. It's about people's opinions on whether these douchebags still have rights and whether this actually violates them.
***Spoiler Alert*** They don't.
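The "built specifically for this one phone" claim can be sketched in miniature: bind the update's signature to a device-unique identifier, so any other device rejects it. The field names here (ECID, the payload layout) and the use of an HMAC are assumptions for illustration only; Apple's actual update personalization scheme uses asymmetric signatures and is not public at this level of detail.

```python
# Hedged sketch of binding a signed update to one device.
# SIGNING_KEY is a stand-in for the vendor's secret; real schemes use
# public-key signatures, but the binding idea is the same: the device's
# unique ID is mixed into what gets signed.
import hashlib
import hmac

SIGNING_KEY = b"vendor-private-key-placeholder"

def sign_update(payload: bytes, device_ecid: str) -> bytes:
    """Sign an update so it is only valid for one device's unique ID."""
    return hmac.new(SIGNING_KEY, device_ecid.encode() + payload,
                    hashlib.sha256).digest()

def device_accepts(payload: bytes, sig: bytes, my_ecid: str) -> bool:
    """Each device recomputes the MAC over its own ECID; any other
    device's ECID yields a different value, so the update is rejected."""
    expected = hmac.new(SIGNING_KEY, my_ecid.encode() + payload,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

update = b"firmware-image"
sig = sign_update(update, "ECID-AAAA")
print(device_accepts(update, sig, "ECID-AAAA"))  # True
print(device_accepts(update, sig, "ECID-BBBB"))  # False
```

Under this model the signed image is useless on any phone but the target, which is the crux of the "it's only one phone" argument — and of the counterargument that nothing stops the same process from being repeated per phone.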
You don't seem to understand how slippery slopes work.
It's not "just one phone," and it never was. It started at one and only one phone because, you know, terrorism: we need to read the phone of just this one terrorist and Apple won't help us! Then "Well, there may be a dozen others we'd like to break into." Then "Law enforcement agencies possess hundreds, or even thousands, of phones they'd like to break into." And somewhere between "dozens" and "thousands," it becomes too unwieldy for the government to wait for Apple to unlock each one, so they'll demand the tools to do it on their own.
And once they've proven they can force Apple to write software at their bidding, they'll easily be able to force Apple to hand over the tools to decrypt phones at will. And really, there's no end to what they can force Apple to hack into its own phones.
Re:What a bunch of ignorance (Score:5, Insightful)
You don't seem to understand how slippery slopes work.
If the FBI succeeds on this one, there will be a point in the future where some prosecutor argues in court that nobody has a reasonable expectation of privacy in their smartphones, in part because society at large was okay with how this case went down.
The frightening part is that the argument might work.
Re: (Score:2)
I do. But then I work for Apple.
Apple pays the bills on my phone, but it's my phone.
Re: (Score:2)
Re: (Score:2)
No, this is about setting a dangerous precedent. If the phone is encrypted properly, Apple cannot break it without brute-forcing.
Re: (Score:2)
Why does everyone think Apple has to create anything new? They already have the ability to do what the FBI wants. It's not a backdoor, it's not something they have to use on every phone...it's a simple code adjustment to turn off the poison pill and can easily be pushed to this one single phone.
The FBI is also asking for the ability to enter PIN codes over the Lightning cable or over Bluetooth. This functionality doesn't exist at all today; Apple would have to invent a new handshake/protocol for it.
Re: (Score:2)
If anyone really, really wanted to crack an iPhone, they'd do it.
Like the FBI?
Re:Adi's correct (Score:5, Insightful)
The problem comes when the FBI compels/orders Apple to build a second operating system. Forcing and compelling people and companies who are not accused of a crime is un-American, and that is why this is going to court. Wanna compel Apple? Fine: go to Congress and pass a law like CALEA. But let's be clear: a law forcing Apple to do what the FBI wants does not currently exist, and that's why the FBI is relying on the All Writs Act to force Apple to do it.
Nobody has ever suggested the FBI (or anyone else, for that matter) is prohibited from hacking the phone. They aren't. They are more than welcome to use whatever resources they have to hack it. But those resources do not include Apple, the company, or any of its employees or tools unless allowed by law.
Re: (Score:2)
It's always about the next case and getting US brands' staff used to being conscripted by the feds. The NSA does not want their skills detailed in open state or federal court.
The next 100 or 200 phones on federal or state AG lists might include individuals with the ability to hire real legal teams. Questions about evidence, and defendants finding their own experts to cross-examine in open court, are never good news for decades of hidden technical methods.
Re:has nobody thought (Score:5, Informative)
Re: (Score:2)
Obviously you haven't followed that case very carefully. The iPhone isn't locked using fingerprints; it uses a four-digit passcode.
So what's the evidence for that?