Symantec CEO: Source Code Reviews Pose Unacceptable Risk (reuters.com) 172
In an exclusive report from Reuters, Symantec CEO Greg Clark says the company is no longer allowing governments to review the source code of its software because of fears the agreements would compromise the security of its products. From the report: Tech companies have been under increasing pressure to allow the Russian government to examine source code, the closely guarded inner workings of software, in exchange for approvals to sell products in Russia. Symantec's decision highlights a growing tension for U.S. technology companies that must weigh their role as protectors of U.S. cybersecurity as they pursue business with some of Washington's adversaries, including Russia and China, according to security experts. While Symantec once allowed the reviews, Clark said that he now sees the security threats as too great. At a time of increased nation-state hacking, Symantec concluded the risk of losing customer confidence by allowing reviews was not worth the business the company could win, he said.
Two Choices (Score:5, Insightful)
Either let nobody review the code, or let everybody in the world who wants to look at it review it. I rather suspect that crowdsourcing security reviews might actually make all code safer and more secure, if only because there WILL be friendly eyes going through it and proofreading the code.
Don't Waste Your Money (Score:5, Interesting)
Step 1: US company Equifax allows personal ID data for hundreds of millions of people to be stolen, and nobody seems to care.
Step 2: US Government condemns Kaspersky Labs for potentially leaking information to the Russians. Thus destroying Kaspersky's US market.
Step 3: Symantec prohibits government source code reviews. Thus insuring an NSA backdoor.
So, no matter what you do, you are screwed.
There is clearly no such thing as Cyber Security.
Put your money on Molson beer.
It is a much better investment.
Re: (Score:1)
Step 3: Symantec prohibits government source code reviews. Thus insuring an NSA backdoor.
There's the problem right there.
An insurance company selling software.
Which ensures there is an NSA backdoor.
Re:Don't Waste Your Money (Score:5, Insightful)
usa government (and its cronies) logic:
kaspersky software finds (as it is supposed to) nsa's new malware on an nsa contractor's private computer. alerts hq, russian gov perhaps hears about it. kaspersky is a security threat.
meanwhile symantec never finds any nsa malware. symantec won't let others examine its source. symantec is patriotic!
Re: Don't Waste Your Money (Score:5, Insightful)
how come some low id Slashdot accounts are pushing the Kremlin line recently? Is it anything to do with a comment on the 20 year anniversary story saying these accounts are worth money or has the Slashdot database been hacked?
maybe "low id slashdot accounts" prefer openness, individual freedom, and critical thinking over secrecy, "security" (as defined by the deep state), and propaganda.
Re: (Score:2)
how come some low id Slashdot accounts are pushing the Kremlin line recently? Is it anything to do with a comment on the 20 year anniversary story saying these accounts are worth money or has the Slashdot database been hacked?
maybe "low id slashdot accounts" prefer openness, individual freedom, and critical thinking over secrecy, "security" (as defined by the deep state), and propaganda.
If you prefer openness, individual freedom, and critical thinking, then Russia is not where you should be looking.
In fact, "secrecy, "security" (as defined by deep state), and propaganda" is at least as characteristic of Russia as of pretty much all Western countries. (I'm saying this as a non-Russian who speaks Russian fluently and has spent the last ten years working in 11 out of 15 CIS states.) Russia is also at least as much driven by capitalism and corporate greed and has greater social inequality. If
Re: (Score:2)
Re: (Score:1)
...everything popular media and government agencies tell you about Russia this and Russia that is true and not propaganda and fearmongering—or sometimes a complete nothingburger, even if true to a large extent, like with the elections. Just the fact that they could do this or that bad thing (and I can say that pretty much everything is possible with the current gang in power here), doesn't mean they actually have, unless you can show some credible proof and explanation.
I believe the proof you're seeking is at least partially provided by Facebook and Google's ad sales. Russian sources spent significant funds to direct ads to attempt to influence voters in specific battleground states. That's a pretty significant smoking gun, given all the other circumstantial evidence already reported regarding Russian activities.
Re: (Score:2)
Re: (Score:2)
Proof of what?
If you want to claim that it's proof that Russia interfered with the US elections, you've got a point. If you want to claim it's proof that Facebook acted above the law and isn't being punished, you've got a point. If you want to claim that Facebook are unpatriotic villains, you've got a point.
If you want to claim it as proof of something else, you need to connect the dots. There's no obvious connection to Kaspersky. (I speculate about connections, but different ones than you appear to be s
Re: (Score:2)
It's weird how the Republicans and Democrats have swapped sides over this.
E.g. back in 2012 Mitt Romney said that Russia was the biggest threat to the US and the Democrats mocked him for still living in the Cold War era.
Then in 2013 Snowden fled to Russia and probably to a job with Russian intelligence. The consensus on slashdot was that was fine and he needed to get away from the NSA so he could continue to leak. Even though Putin made it clear that he could not continue to leak, at least not publicly
https [cbsnews.com]
Re: (Score:3)
Re: (Score:2)
Re: (Score:1)
The young definitely seem to be on the socialist path. That's not all bad, nor all good. Unfettered capitalism has led to some of the largest swindles ever seen. Even medieval kings didn't successfully gather as much power as some capitalists, because their economies were directly tied to their power. Capitalists have no such ties, destroying a country's economy has no negative bearing on their wealth if they can plunder the target country's wealth.
Given technological progress, adoption of some ideas of s
Re: (Score:2)
I like to think it's more of an experience thing, i.e. we've seen so many of these conspiracy theories that the current Russia fever seems way out of touch.
Somehow low-effort trolling and phishing campaigns are becoming "hacks". And they are massively diverting money and attention away from the real crimes. Podesta gave his password to a foreign agent. That affects a single person. Someone stole VA/OPM/Equifax data on millions of Americans. Yet we see nothing happening, going to happen, or an
Re: (Score:2)
how come some low id Slashdot accounts are pushing the Kremlin line recently?
I think you are confusing "pushing the Kremlin line" with "not believing everything that the NSA feeds us".
Those who have been around a long time know that Kaspersky has been a top notch computer security company for a long time. We also know that the NSA has been trying to hack everyone for a long time.
Of course we know that the Kremlin is trying to hack everyone too, but we are not going to quickly abandon our years of experience.
Re: (Score:2)
I wouldn't call 88000 low.
Re: (Score:2)
How come, with a mishmash of hearsay and "expert" opinion, people pretend to know documents were stolen?
If his computer held classified information, no one outside the government is ever going to touch it again. If they are investigating an adversary's cyberattack, the government is never going to publicly disclose its methods, tools, or findings. That is all going to be classified too.
I seriously doubt anyone with first-hand knowledge will be talking about it; this isn't the kind of issue where an attack of conscience will lead someone to cross the line.
The public will never be told the details, so the close
Re: (Score:2)
Step 1: US company Equifax allows personal ID data for hundreds of millions of people to be stolen, and nobody seems to care.
Nobody seems to care? It's been all over the news, in every form of media; it's been a huge deal.
They're under criminal investigation by the DoJ for possible insider trading, and there's a criminal investigation into the hack itself by the FBI.
Those things just take time. Protesting and rioting in the streets won't help anything, and that's getting a bit overused lately anyway. And, this may finally spell the end of using SSNs for all kinds of identification purposes, which would be a significant step in the right direction.
Re: (Score:2)
Or simply drop Windows and all their closed source software, with its unknown number of security problems and backdoors, and use an open source OS (Linux, *BSD, MenuetOS, TempleOS, whatever) and software.
Many people do not agree with RMS (Richard Stallman), but time keeps proving him right.
Re:Two Choices (Score:5, Informative)
Just say no to Symantec, it can only make your system worse (they had a solid C compiler back in the 90s though).
Re: (Score:1)
No, like everything else Symantec, that compiler was acquired, not developed by them. Again, like everything Symantec, they then proceeded to run it into the ground.
Re: (Score:2)
like everything Symantec, they then proceeded to run it into the ground.
On the upside, they would have been able to run it really fast.
Re: (Score:3)
Just say no to Symantec, it can only make your system worse
Sadly, being "good" and "effective" are seldom requirements for 'checkbox compliance'. I went through this with a law firm recently that was trying to upgrade everything to meet the requirements the bank had in order to do business with them. The bank didn't explicitly say they required Symantec, but a whole lot of their workstation requirements were conveniently default (or basically-default) policies Symantec has. Being a fan of ESET due to it being actually-effective, I pitched it to the client. The clie
Re: (Score:2)
Re: (Score:3)
We'll be safer against the COMMON bad actor that just finds a simple bug that STANDARD REVIEW would detect.
And less safe against bad actors that have highly-advanced specialized technical knowledge to find subtle bugs that everyone else is going to miss (Although these highly-advanced technical actors with a lot of money to spend could likely be able to reverse-engineer the entire product in their search for potential bugs).
Re: (Score:1)
Damn. I get most of my news on the internet from AC First Posts on slashdot.
The end of the article is laughable (Score:5, Insightful)
“As a vendor here in the United States,” Clark said, “we are headquartered in a country where it is OK to say no.”
Yeah right and national security letters are a figment of my imagination...
Re:The end of the article is laughable (Score:5, Funny)
Re: (Score:1)
Re: (Score:2)
To be fair the NSA has oversight from FISA courts.
Dear Leader Putin, on the other hand, does whatever the fuck he likes. And if you don't like it, you can have a nice cup of polonium tea.
Right... (Score:1)
Highly likely their software is shit and it's shit all the way down and they don't want you to know how shit it is.
Re: (Score:1)
Highly likely their software is shit
Highly likely? Installing symantec is like giving your computer AIDS. That has always been a general rule in IT.
Wrong (Score:5, Insightful)
It is unreviewed proprietary source code that poses the most significant risk. Any government technology department that fails to do a source code review of a product before deployment is committing malpractice. If a vendor refuses to cooperate, their product should be barred from competition.
Re: (Score:2)
Not wrong.
There is potential for a security leak because, notoriously, the more people who work on code, the more bugs potentially creep in. There is legitimate concern of governments, including the US, abusing this. It is far better to allow individual independent companies to review code on behalf of specific countries.
Far easier to vet a company and monitor it than a country.
Says volumes (Score:5, Insightful)
about how much he believes in the security of his own software.
The best stuff is that which can stand up to peer review and intense scrutiny, yet retain its trust level.
Given a choice between a closed source super-secret-trust-us-its-secure platform or an open source peer-reviewed-I-dare-you-to-break-it one, guess which one I would prefer to go with ?
Re: False dichotomy (Score:2)
Re: (Score:2)
That idiot in the White House can't be controlled. He's an equal-opportunity loose 16"/50 cannon, except for his hate for Obama and those who don't worship him. Other people have made that mistake. It's similar to what the German right wing thought about Hitler in 1933, except that Hitler had ideas and was generally quite competent (unfortunately).
Re:Says volumes (Score:4, Interesting)
Says volumes about how much he believes in the security of his own software.
I've worked on secure systems before. It was common to use well documented algorithms for encryption; the mathematics showed the encryption to be secure. The implementation would be a straightforward rewrite of the published algorithm, so not any different from anything open source. We'd pair our encryption with open source implementations to assure we'd done it correctly.
One thing we could not do was reveal our code. In fact even mentioning which encryption we used was considered a security violation. This was done to deny an attacker as much information as possible for an attack. Sure, the code was likely very secure, but we weren't under any kind of obligation to give attackers anything that could make their life of snooping into the communications easier.
There is still a possibility that someone might be able to prove the encryption we used was not as secure as previously believed. We'd still enjoy security by obscurity. The assumption was that if the encryption was flawed then attackers would still have to go through the effort to find out if we used the flawed encryption or not. This buys time to fix the problem.
Most encryption is based on the idea of creating a key with enough bits that any brute force attack would have to try all the combinations to break it. By keeping the algorithm a secret, we have effectively added a few more bits to the key. That adds that much more time to an attack.
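As a rough back-of-the-envelope check on that "a few more bits" claim (my own illustration, not from the parent comment): each extra bit of secret material doubles the brute-force search space, so a couple of bits of obscurity only multiplies an attacker's work by a small constant.

```python
# Back-of-the-envelope: each extra secret bit doubles the brute-force
# search space, so "a few more bits" is a constant factor on attack
# time, not a qualitative change in security.
def search_space(bits: int) -> int:
    return 2 ** bits

base = search_space(128)            # a standard 128-bit key
with_obscurity = search_space(130)  # suppose secrecy adds ~2 bits

print(with_obscurity // base)  # → 4: only a 4x larger search space
```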
Then there is the matter of intellectual property and industrial espionage. By sharing the code with the government there is a possibility of something unique and valuable being revealed to a potential competitor to copy and sell, or possibly patent and claim infringement on the original authors. Maybe the rights to the code would hold up in court but that still means the expense of going to court.
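The cross-checking described above (pairing an in-house rewrite with an open source implementation) amounts to known-answer testing. A minimal sketch in Python, using SHA-256 and its published FIPS test vector purely as a stand-in for whatever primitive such a system actually used:

```python
import hashlib

# Known-answer test: validate an implementation against a published
# test vector. Here SHA-256("abc") and its FIPS 180-2 vector stand in
# for whatever primitive the real system used.
KNOWN_VECTOR = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"

def our_digest(data: bytes) -> str:
    # In a real review this would be the in-house rewrite under test;
    # here it simply calls the stdlib reference implementation.
    return hashlib.sha256(data).hexdigest()

assert our_digest(b"abc") == KNOWN_VECTOR, "diverges from the reference"
print("known-answer test passed")
```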
Re:Says volumes (Score:5, Insightful)
By keeping the algorithm a secret, we have effectively added a few more bits to the key.
You didn't, really.
If the attacker has your binary, decompiling it is not hard. I don't even have to decompile it in most cases; merely watching the pattern of memory accesses is generally enough to identify at least the class of algorithm used (there aren't that many), and examination of S-boxes etc. tells the rest. And if the algorithm you used is remotely close to breakable -- by brute force or any other means -- then you're hosed.
Obscurity is very foolish except in one case: security hardware which has internal storage, and can't practically be updated. A good example is a smart card chip. In that case, all you can do is do the best job you can on the software, and the best job you can do on the hardware (whose job is partly to deny the attacker access to your software), and then keep it secret. Assuming the hardware doesn't leak it, and you don't leak it, then the attacker can only blindly fuzz the device to look for vulnerabilities.
In practice, though, smart card makers don't do that either. They do provide full details of hardware and software, including source code, to a couple of highly-capable test labs, who spend many months poring through all of it as well as fuzzing it, attempting physical penetration of the hardware and everything else they can think of.
If your organization did that, hired multiple outside teams of extremely talented people to attack your implementation, and you kept the attacker away from the binary as well as the source, then perhaps you gained something from the obscurity. If not, you just fooled yourselves, and made your product weaker than it would have been if you had published the design and the source code for the world to beat on.
Re: (Score:2)
The systems I was referring to did in fact have sealed boxes where, if tampering was detected, the memory was wiped. The communications the devices were meant to protect would still be sent down an unsecured wire or transmitted by radio. If the encryption used was known, then much less in the way of resources would be needed to break it, brute force or otherwise.
More generally though by keeping code secret, even on publicly available software, you'd be forcing a state funded actor to put the resources to decompiling
Re: (Score:2)
The systems I was referring to did in fact have sealed boxes where if tampering was detected the memory was wiped.
Then the obscurity wasn't *completely* pointless. But still mostly pointless. If your algorithm is broken, "adding a few bits" is extremely unlikely to make any difference. If it's not broken, then adding a few bits makes no difference... and by keeping it secret you're running the risk that you have serious flaws that you don't know about, which could completely destroy the security. Bad idea.
More generally though by keeping code secret, even on publicly available software, you'd be forcing a state funded actor to put the resources to decompiling the code.
Which takes seconds.
The code would then have to be examined for vulnerabilities. By handing over the original code there's much more information to work with.
A little, sure. Enough to matter? Not at all. I work with a lot of people who do reverse engine
Re: (Score:2)
Also, I note that you did not confirm that your organization had outside penetration testing done. That right there proves that your organization doesn't know how to write secure software. Please tell me what hardware we're talking about so I can avoid it.
I can assure you that the organization I worked for does know how to write secure software. You cannot buy these devices as they were built for a specific use and even saying the name of the project might be a security violation. I was not knowledgeable of all the design and testing involved because everything was need to know. I had general ideas on their ultimate use, the cases they were put in, the kind of wire or radio used, and so on. I know that there were tamper switches on the cases because thos
Re: (Score:2)
I can assure you that the organization I worked for does know how to write secure software.
Maybe, but I'm skeptical.
The point is, and you admit to it, that not having the source code will slow down an attack.
Trivially.
We can debate how much but knowing it will slow down an attack is sufficient to go through the effort of keeping certain design choices secret.
No. That is not enough to justify reliance on secrecy. There has to be some other reason, or it's just self-deception.
Re: (Score:2)
Memory accesses? (Score:2)
" merely watching how the pattern of memory accesses is generally enough to identify at least the class of algorithm used "
Oh come on, you think nobody has thought of that and gamed the algorithm to make a load of pointless and unnecessary memory accesses in order to fool anyone with a logic analyser sitting on the bus? These days the speed hit of doing so is almost irrelevant.
Re: (Score:2)
Oh come on, you think nobody has thought of that and gamed the algorithm to make a load of pointless and unnecessary memory accesses in order to fool anyone with a logic analyser sitting on the bus? These days the speed hit of doing so is almost irrelevant.
The speed hit is not irrelevant. Performance is still a significant challenge for many applications, even on desktop and server class machines -- and definitely for mobile and embedded.
And it wouldn't matter much anyway. Worst case (and very unlikely) it might make the attacker have to bother with decompiling.
Re: (Score:2)
You can't decompile the binary if you can't get to it -- if it's encrypted in firmware and taking the lid off the chip destroys it.
Re: (Score:2)
You can't decompile the binary if you can't get to it -- if it's encrypted in firmware and taking the lid off the chip destroys it.
In that case, and if it's not feasible to patch it, and if you've really done your due diligence (pen testing), then secrecy might make sense. But, really, it's the "not feasible to patch it" that is the reason for and justification of secrecy, not the rest.
Nit: if the firmware is inaccessible, there's no reason to encrypt it. Unless you have another even more secure component inside which holds the decryption key? And there's no point in that unless that other more secure component not only decrypts the
Re: (Score:2)
The Russian government tries to break into US companies all the time.
Symantec protects many US companies.
Letting them read the code for the software that protects their targets might not be a good idea.
Oh, Really? (Score:5, Insightful)
I've published the source code of my own products since about 1987. The difference between Symantec and me is that I give the source code to everyone, and I give them an incentive to read the code, because they can also redistribute and modify it, and put it to any use.
And of course a national entity that wants to enough, like the government of Russia, is going to get a look at the Symantec source code even if it means getting someone into a job there to do it. So, isn't Symantec just saying that their proprietary paradigm is a poor one from a security perspective?
Re:Oh, Really? (Score:4, Insightful)
A lot of developers of proprietary software still use open source tools. Both git and SVN are among the most popular version control systems and very little collaborative work could occur on the levels required today without tools like that. That developers can freely use and improve those tools just means that money can be spent elsewhere. How many billions would need to be spent if FOSS like that didn't exist?
Re:Oh, Really? (Score:5, Funny)
I think the most significant thing about this story is that Bruce Perens still visits /.
Err hi Bruce!
Re:Oh, Really? (Score:5, Funny)
Re: (Score:2)
He comes out of the woodwork pretty regularly for security related articles. :)
Re: Oh, Really? (Score:2)
Re: (Score:2)
I think any company who shows their source code to russian intelligence for any reason needs their fucking head examined.
We're talking the very definition of 'bad faith actor' here. Get a fucking mind. Get your business elsewhere.
Re: (Score:2)
While I agree with you philosophically, I think in terms of an AV program on Windows you're dealing with a unique set of vulnerabilities and a black hat state organization would want to know every detection technique and evasion detection trick they could. It's kind of a fundamentally insecure environment to begin with.
Good and bad nations? (Score:5, Insightful)
USA, UK, NZ, AU, Canada?
Some of the more trusted NATO nations? All of NATO? Nations wishing to join NATO soon?
Some other nations? China? Brazil? Japan?
Why would any nation buy into a security product they have not seen all the code to?
Other developers will just offer their products for review. How long before nations just say no review, no buy?
Kaspersky Fallout (Score:3)
I imagine the backlash against Kaspersky, after it was found the Russian govt. was abusing security holes in its anti-virus software in order to hack computers which had it installed, is responsible for this. It seems plausible they found out about said holes due to the mandatory source-code reviews.
Re: (Score:2)
If I were using my imagination here, I'd imagine that the unacceptable risk is that someone will figure out that they're distributing malware on behalf of their host state.
Breaking News: Water is Wet (Score:1)
Fair enough (Score:2)
If I was a government reviewing a security product like that, I wouldn't tell them about any vulnerabilities I find. They would be much more useful to use against all of their customers.
Security through obscurity (Score:4, Informative)
"In security engineering, security through obscurity [wikipedia.org] (or security by obscurity) is the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system. A system or component relying on obscurity may have theoretical or actual security vulnerabilities, but its owners or designers believe that if the flaws are not known, that will be sufficient to prevent a successful attack. Security experts have rejected this view as far back as 1851, and advise that obscurity should never be the only security mechanism."
So either the CEO of Symantec is a security idiot, or he has a better reason he's not sharing.
And if he's claiming the reason for using Security Through Obscurity is to provide his customers with a stronger feeling of being secure, I do hope the masses aren't idiots and this backfires as spectacularly as it really should.
Re: (Score:1)
You're creating a strawman. The software is complex enough to assume that there are vulnerabilities, and giving access to a hostile actor accelerates their discovery. Without aerospace-grade engineering and prices you can't create software without defects; it's still remarkably hard even with the 20:1 price differential of DO-178 style processes. To claim that security through obscurity is bad is a deliberate deception that presumes it's the only security measure. It's a helpful layer.
So is this a (Score:2)
Reverse Kaspersky from Russia with love?
Outsource the development (Score:4, Funny)
to a third world nation.
Then anyone can review it and probably won't be able to make any sense of it whatsoever. Unless they are fluent in spaghetti code. It's like a cheaper type of encryption.
Real risk is discovering the backdoors (Score:3, Interesting)
they put in for NSA.
You guys all misunderstood what they feared about. They are not afraid of foreign governments finding flaws in their software, they are afraid of foreign governments finding the NSA backdoors, and thus banning Symantec in their country. With the USA's example of banning Kaspersky, Symantec didn't even have any grounds to complain.
Cost benefit (Score:1)
Re: (Score:2)
Backwards (Score:2)
Symantec don't allow governments access to source (Score:1)
It wouldn't surprise me if the state security apparatus already had access to Symantec code through their embedded agents.
Just another reason (Score:2)
Not to buy Symantec.
Security through obscurity (Score:1)
Critical Code Reviews (Score:2)
Critical Code Reviews lead to better code. Perform those thoroughly in house and you should pass any review with flying colors.
The "Critical" bit, together with preposterous egos, is usually the problem within most organisations. Nobody dares to tell the guru he's wrong. And no manager is ever rewarded for solving difficult problems, unless they can't be circumvented with loads of babble.
I know.
"risk of losing customer confidence" (Score:2)
Reviewing source code means nothing (Score:2)
How can anyone prove that the source code they are reviewing is the actual product being used? What government has that kind of resources anyway?
More reason to be careful with Symantec Products (Score:2)
There must be balance between security by obscurity and complete openness.
Antivirus software is a huge exception . . . (Score:2)
I want to make clear, for the majority of software I am strongly of the opinion that perfect knowledge of the source code should not allow an attacker any advantage because the security properties are invariant to the implementation. For a trivial example, you can review the libOTR or TrueCrypt code all day, but the confidentiality of my encrypted volumes rests on the underlying cryptographic ciphers and my ability to keep the password a secret.
But I actually agree with Symantec that AV is a unique exceptio