Software Government Privacy Security United States Technology

Symantec CEO: Source Code Reviews Pose Unacceptable Risk (reuters.com) 172

In an exclusive report from Reuters, Symantec's CEO says it is no longer allowing governments to review the source code of its software because of fears the agreements would compromise the security of its products. From the report: Tech companies have been under increasing pressure to allow the Russian government to examine source code, the closely guarded inner workings of software, in exchange for approvals to sell products in Russia. Symantec's decision highlights a growing tension for U.S. technology companies that must weigh their role as protectors of U.S. cybersecurity as they pursue business with some of Washington's adversaries, including Russia and China, according to security experts. While Symantec once allowed the reviews, CEO Greg Clark said that he now sees the security threats as too great. At a time of increased nation-state hacking, Symantec concluded the risk of losing customer confidence by allowing reviews was not worth the business the company could win, he said.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Two Choices (Score:5, Insightful)

    by sehlat ( 180760 ) on Tuesday October 10, 2017 @09:35PM (#55346879)

    Either let nobody review the code, or let everybody in the world who wants to look at it review it. I rather suspect that crowdsourcing security reviews might actually make all code safer and more secure, if only because there WILL be friendly eyes going through it and proofreading the code.

    • by Anonymous Coward on Tuesday October 10, 2017 @10:15PM (#55347053)

      Step 1: US company Equifax allows personal ID data for hundreds of millions of people to be stolen, and nobody seems to care.

      Step 2: US Government condemns Kaspersky Labs for potentially leaking information to the Russians. Thus destroying Kaspersky's US market.

      Step 3: Symantec prohibits government source code reviews. Thus insuring an NSA backdoor.

      So, no matter what you do, you are screwed.
      There is clearly no such thing as Cyber Security.
      Put your money on Molson beer.
      It is a much better investment.

      • by Anonymous Coward

        Step 3: Symantec prohibits government source code reviews. Thus insuring an NSA backdoor.

        There's the problem right there.
        An insurance company selling software.
        Which ensures there is an NSA backdoor.

      • by sittingnut ( 88521 ) <sittingnut@NoSpAM.gmail.com> on Tuesday October 10, 2017 @11:06PM (#55347211) Homepage

        usa government (and its cronies) logic :
        kaspersky software finds (as it is supposed to) nsa's new malware on an nsa contractor's private computer. alerts hq, russian gov perhaps hears about it. kaspersky is a security threat.
        meanwhile symantec never finds any nsa malware. symantec won't let others examine its source. symantec is patriotic!

      • Step 1: US company Equifax allows personal ID data for hundreds of millions of people to be stolen, and nobody seems to care.

        Nobody seems to care? It's been all over the news, in every form of media; it's been a huge deal.
        They're under criminal investigation by the DoJ for possible insider trading, and there's a criminal investigation into the hack itself by the FBI.
        Those things just take time. Protesting and rioting in the streets won't help anything, and that's getting a bit overused lately anyway. And, this may finally spell the end of using SSNs for all kinds of identification purposes, which would be a significant step in the right direction.

      • by higuita ( 129722 )

        Or simply drop Windows and all that closed source software with its unknown number of security problems and backdoors, and use an open source OS (Linux, *BSD, MenuetOS, TempleOS, whatever) and open source software

        Many people do not agree with RMS (Richard Stallman), but time keeps proving that he is right

    • Re:Two Choices (Score:5, Informative)

      by phantomfive ( 622387 ) on Wednesday October 11, 2017 @12:18AM (#55347445) Journal
      Good thing Symantec is secure and has no horrible remote exploits that give hackers top-level access to the system [computerworld.com].

      Just say no to Symantec, it can only make your system worse (they had a solid C compiler back in the 90s though).
      • by Anonymous Coward

        No, like everything else Symantec, that compiler was acquired, not developed by them. Again, like everything Symantec, they then proceeded to run it into the ground.

        • like everything Symantec, they then proceeded to run it into the ground.

          On the upside, they would have been able to run it really fast.

      • Just say no to Symantec, it can only make your system worse

        Sadly, being "good" and "effective" are seldom requirements for 'checkbox compliance'. I went through this with a law firm recently that was trying to upgrade everything to meet the requirements the bank had in order to do business with them. The bank didn't explicitly say they required Symantec, but a whole lot of their workstation requirements were conveniently default (or basically-default) policies Symantec has. Being a fan of ESET due to it being actually-effective, I pitched it to the client. The clie

    • by mysidia ( 191772 )

      We'll be safer against the COMMON bad actor that just finds a simple bug that STANDARD REVIEW would detect.
      And less safe against bad actors that have highly advanced, specialized technical knowledge to find subtle bugs that everyone else is going to miss (although these highly advanced actors, with a lot of money to spend, could likely reverse-engineer the entire product in their search for potential bugs).

  • by Anonymous Coward on Tuesday October 10, 2017 @09:38PM (#55346897)

    “As a vendor here in the United States,” Clark said, “we are headquartered in a country where it is OK to say no.”

    Yeah right and national security letters are a figment of my imagination...

  • by Anonymous Coward

    Highly likely their software is shit and it's shit all the way down and they don't want you to know how shit it is.

    • Highly likely their software is shit

      Highly likely? Installing symantec is like giving your computer AIDS. That has always been a general rule in IT.

  • Wrong (Score:5, Insightful)

    by Anonymous Coward on Tuesday October 10, 2017 @09:39PM (#55346901)

    It is unreviewed proprietary source code that poses the most significant risk. Any government technology department that fails to do a source code review of a product before deployment is committing malpractice. If a vendor refuses to cooperate, their product should be barred from competition.

    • Not wrong.

      There is potential for a security leak: software is notorious for the fact that the more people who work on the code, the more bugs are potentially introduced. There is legitimate concern about governments, including the US, abusing this. It is far better to allow individual, independent companies to review code on behalf of countries.

      Far easier to vet a company and monitor it than a country.

  • Says volumes (Score:5, Insightful)

    by nehumanuscrede ( 624750 ) on Tuesday October 10, 2017 @09:39PM (#55346907)

    about how much he believes in the security of his own software.

    The best stuff is that which can stand up to peer review and intense scrutiny, yet retain its trust level.

    Given a choice between a closed source super-secret-trust-us-its-secure platform and an open source peer-reviewed-I-dare-you-to-break-it one, guess which one I would prefer to go with?

    • Re:Says volumes (Score:4, Interesting)

      by blindseer ( 891256 ) <blindseer.earthlink@net> on Tuesday October 10, 2017 @10:28PM (#55347105)

      Says volumes about how much he believes in the security of his own software.

      I have worked on secure systems before. It was common to use well documented algorithms for encryption. The mathematics showed the encryption to be secure. The implementation would be a trivial rewrite of the encryption, so no different from anything open source. We'd compare the encryption we had with open source implementations to ensure we had done it correctly.

      One thing we could not do was reveal our code. In fact even mentioning which encryption we used was considered a security violation. This was done to deny an attacker as much information as possible for an attack. Sure, the code was likely very secure, but we weren't under any kind of obligation to give attackers anything that could make their life of snooping into the communications easier.

      There is still a possibility that someone might be able to prove the encryption we used was not as secure as previously believed. We'd still enjoy security by obscurity. The assumption was that if the encryption was flawed then attackers would still have to go through the effort to find out if we used the flawed encryption or not. This buys time to fix the problem.

      Most encryption is based on the idea of creating a key with enough bits that a brute force attack would have to try an infeasible number of combinations to break it. By keeping the algorithm a secret, we have effectively added a few more bits to the key. That adds that much more time to an attack.
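
      As a rough back-of-envelope sketch of that arithmetic (the numbers are purely illustrative): if an attacker has to consider N plausible algorithms, secrecy is worth at most log2(N) extra key bits.

      ```python
      # Purely illustrative: how much "extra key length" algorithm secrecy buys,
      # assuming the attacker must try each of N candidate algorithms.
      import math

      key_bits = 128               # assumed key size
      candidate_algorithms = 16    # assumed number of plausible cipher choices

      extra_bits = math.log2(candidate_algorithms)
      print(f"Secrecy adds at most {extra_bits:.0f} bits to a {key_bits}-bit key "
            f"({key_bits + extra_bits:.0f} effective bits).")
      ```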

      Then there is the matter of intellectual property and industrial espionage. By sharing the code with the government there is a possibility of something unique and valuable being revealed to a potential competitor to copy and sell, or possibly to patent and then claim infringement against the original authors. Maybe the rights to the code would hold up in court, but that still means the expense of going to court.

      • Re:Says volumes (Score:5, Insightful)

        by swillden ( 191260 ) <shawn-ds@willden.org> on Wednesday October 11, 2017 @12:20AM (#55347449) Journal

        By keeping the algorithm a secret, we have effectively added a few more bits to the key.

        You didn't, really.

        If the attacker has your binary, decompiling it is not hard. I don't even have to decompile it in most cases; merely watching the pattern of memory accesses is generally enough to identify at least the class of algorithm used (there aren't that many), and examination of the S-boxes, etc., tells the rest. And if the algorithm you used is remotely close to breakable -- by brute force or any other means -- then you're hosed.
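
        As a minimal sketch of how little effort that takes (the constant check and file path below are illustrative only), well-known cipher constants such as the AES S-box are trivial to spot in a raw binary:

        ```python
        # Illustrative sketch only: look for the first bytes of the AES S-box in
        # a binary. Real tooling does far more, but even this crude check often
        # reveals a "secret" algorithm choice.
        AES_SBOX_PREFIX = bytes([0x63, 0x7C, 0x77, 0x7B, 0xF2, 0x6B, 0x6F, 0xC5])

        def looks_like_aes(path):
            with open(path, "rb") as f:
                return AES_SBOX_PREFIX in f.read()

        # Example usage (the path is hypothetical):
        # print(looks_like_aes("firmware.bin"))
        ```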

        Obscurity is very foolish except in one case: security hardware which has internal storage, and can't practically be updated. A good example is a smart card chip. In that case, all you can do is do the best job you can on the software, and the best job you can do on the hardware (whose job is partly to deny the attacker access to your software), and then keep it secret. Assuming the hardware doesn't leak it, and you don't leak it, then the attacker can only blindly fuzz the device to look for vulnerabilities.

        In practice, though, smart card makers don't do that either. They do provide full details of hardware and software, including source code, to a couple of highly-capable test labs, who spend many months poring through all of it as well as fuzzing it, attempting physical penetration of the hardware and everything else they can think of.

        If your organization did that, hired multiple outside teams of extremely talented people to attack your implementation, and you kept the attacker away from the binary as well as the source, then perhaps you gained something from the obscurity. If not, you just fooled yourselves, and made your product weaker than it would have been if you had published the design and the source code for the world to beat on.

        • The systems I was referring to did in fact have sealed boxes where if tampering was detected the memory was wiped. The communications the devices were meant to protect would still be down an unsecured wire or transmitted by radio. If the encryption used was known then that means much less resources would be needed to break it, brute force or otherwise.

          More generally though by keeping code secret, even on publicly available software, you'd be forcing a state funded actor to put the resources to decompiling the code. The code would then have to be examined for vulnerabilities. By handing over the original code there's much more information to work with.

          • The systems I was referring to did in fact have sealed boxes where if tampering was detected the memory was wiped.

            Then the obscurity wasn't *completely* pointless. But still mostly pointless. If your algorithm is broken, "adding a few bits" is extremely unlikely to make any difference. If it's not broken, then adding a few bits makes no difference... and by keeping it secret you're running the risk that you have serious flaws that you don't know about, which could completely destroy the security. Bad idea.

            More generally though by keeping code secret, even on publicly available software, you'd be forcing a state funded actor to put the resources to decompiling the code.

            Which takes seconds.

            The code would then have to be examined for vulnerabilities. By handing over the original code there's much more information to work with.

            A little, sure. Enough to matter? Not at all. I work with a lot of people who do reverse engine

            • Also, I note that you did not confirm that your organization had outside penetration testing done. That right there proves that your organization doesn't know how to write secure software. Please tell me what hardware we're talking about so I can avoid it.

              I can assure you that the organization I worked for does know how to write secure software. You cannot buy these devices as they were built for a specific use and even saying the name of the project might be a security violation. I was not knowledgeable of all the design and testing involved because everything was need to know. I had general ideas on their ultimate use, the cases they were put in, the kind of wire or radio used, and so on. I know that there were tamper switches on the cases because thos

              • I can assure you that the organization I worked for does know how to write secure software.

                Maybe, but I'm skeptical.

                The point is, and you admit to it, that not having the source code will slow down an attack.

                Trivially.

                We can debate how much but knowing it will slow down an attack is sufficient to go through the effort of keeping certain design choices secret.

                No. That is not enough to justify reliance on secrecy. There has to be some other reason, or it's just self-deception.

        • Comment removed based on user account deletion
        • " merely watching how the pattern of memory accesses is generally enough to identify at least the class of algorithm used "

          Oh come on, you think nobody has thought of that and doesn't game the algorithm to make a load of pointless and unnecessary memory accesses in order to fool anyone with a logic analyser sitting on the bus? These days the speed hit doing so is almost irrelevant.

          • Oh come on, you think nobody has thought of that and doesn't game the algorithm to make a load of pointless and unnecessary memory accesses in order to fool anyone with a logic analyser sitting on the bus? These days the speed hit doing so is almost irrelevant.

            The speed hit is not irrelevant. Performance is still a significant challenge for many applications, even on desktop and server class machines -- and definitely for mobile and embedded.

            And it wouldn't matter much anyway. Worst case (and very unlikely) it might make the attacker have to bother with decompiling.

            • by Viol8 ( 599362 )

              You can't decompile the binary if you can't get to it - if it's encrypted in firmware and taking the lid off the chip destroys it.

              • You can't decompile the binary if you can't get to it - if it's encrypted in firmware and taking the lid off the chip destroys it.

                In that case, and if it's not feasible to patch it, and if you've really done your due diligence (pen testing), then secrecy might make sense. But, really, it's the "not feasible to patch it" that is the reason for and justification of secrecy, not the rest.

                Nit: if the firmware is inaccessible, there's no reason to encrypt it. Unless you have another even more secure component inside which holds the decryption key? And there's no point in that unless that other more secure component not only decrypts the

    • How about this:

      The Russian government tries to break into US companies all the time.

      Symantec protects many US companies.

      Letting them read the code for the software that protects their targets might not be a good idea.

  • Oh, Really? (Score:5, Insightful)

    by Bruce Perens ( 3872 ) <bruce@perens.com> on Tuesday October 10, 2017 @09:39PM (#55346909) Homepage Journal

    I've published the source code of my own products since about 1987. The difference between Symantec and me is that I give the source code to everyone, and I give them an incentive to read the code, because they can also redistribute and modify it, and put it to any use.

    And of course a national entity that wants to badly enough, like the government of Russia, is going to get a look at the Symantec source code even if it means getting someone into a job there to do it. So, isn't Symantec just saying that their proprietary paradigm is a poor one from a security perspective?

    • by Dantoo ( 176555 ) on Tuesday October 10, 2017 @09:54PM (#55346979)

      I think the most significant thing about this story is that Bruce Perens still visits /.

      Err hi Bruce!

    • by swb ( 14022 )

      While I agree with you philosophically, I think in terms of an AV program on Windows you're dealing with a unique set of vulnerabilities, and a black hat state organization would want to know every detection technique and detection-evasion trick they could. It's kind of a fundamentally insecure environment to begin with.

  • by AHuxley ( 892839 ) on Tuesday October 10, 2017 @09:50PM (#55346959) Journal
    Who gets a review?
    USA, UK, NZ, AU, Canada?
    Some of the more trusted NATO nations? All of NATO? Nations wishing to join NATO soon?
    Some other nations? A China? Brazil? Japan?
    Why would any nation buy into a security product they have not seen all the code to?
    Other developers will just offer their products for review. How long before nations just say no review, no buy?
  • by mentil ( 1748130 ) on Tuesday October 10, 2017 @09:53PM (#55346973)

    I imagine the backlash against Kaspersky, after it was found the Russian govt. was abusing security holes in its anti-virus software in order to hack computers which had it installed, is responsible for this. It seems plausible they found out about said holes due to the mandatory source-code reviews.

    • If I were using my imagination here, I'd imagine that the unacceptable risk is that someone will figure out that they're distributing malware on behalf of their host state.

  • The real news here is that most Symantec customers will be shocked when they find out Symantec was allowing foreign governments to do code reviews in the first place.
  • If I were a government reviewing a security product like that, I wouldn't tell them about any vulnerabilities I found. They would be much more useful against all of their customers.

  • by v1 ( 525388 ) on Tuesday October 10, 2017 @10:18PM (#55347069) Homepage Journal

    "In security engineering, security through obscurity [wikipedia.org] (or security by obscurity) is the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system. A system or component relying on obscurity may have theoretical or actual security vulnerabilities, but its owners or designers believe that if the flaws are not known, that will be sufficient to prevent a successful attack. Security experts have rejected this view as far back as 1851, and advise that obscurity should never be the only security mechanism."

    So either the CEO of Symantec is a security idiot, or he has a better reason he's not sharing.

    And if he's claiming the reason for using Security Through Obscurity is to provide his customers with a stronger feeling of being secure, I do hope the masses aren't idiots and this backfires as spectacularly as it really should.

    • by Anonymous Coward

      You're creating a strawman. The software is complex enough to assume that there are vulnerabilities. Giving access to a hostile actor accelerates their discovery. Without aerospace grade engineering and prices, you can't create software without defects; it's still remarkably hard even with the 20x+ price differential of DO-178 style processes. To claim that security through obscurity is bad, by presuming that it's the only security measure, is a deliberate deception. It's a helpful layer.

  • Reverse Kaspersky from Russia with love?

  • by Kellamity ( 4286097 ) on Tuesday October 10, 2017 @10:24PM (#55347087)

    to a third world nation.

    Then anyone can review it and probably won't be able to make any sense of it whatsoever. Unless they are fluent in spaghetti code. It's like a cheaper type of encryption.

  • by Anonymous Coward on Tuesday October 10, 2017 @10:34PM (#55347123)

    they put in for NSA.

    You guys all misunderstood what they feared about. They are not afraid of foreign governments finding flaws in their software, they are afraid of foreign governments finding the NSA backdoors, and thus banning Symantec in their country. With the USA's example of banning Kaspersky, Symantec didn't even have any grounds to complain.

  • Why share source when there's a fair chance it could be leaked to hackers and/or competition, with no business case for it? Open source might be ideal, but many companies make more money, and can potentially make better products, by investing in development, support, etc. The CEO indicated there was not a good business case to share. His judgment, but it seems rational.
  • *NOT* allowing source code reviews poses an unacceptable risk. I guess I STILL won't be using Symantec products.
  • "Symantec's CEO says it is no longer allowing governments to review the source code of its software because of fears the agreements would compromise the security of its products."

    It wouldn't surprise me if the state security apparatus didn't already have access to Symantec code through their embedded agents.
  • Not to buy Symantec.

  • Not surprisingly, their products are among the least reliable on the market. Those who make such claims shouldn't even be trusted with securing a closet. Surely this is an additional reason not to buy their software and not to recommend it to customers.
  • Critical Code Reviews lead to better code. Perform those thoroughly in house and you should pass any review with flying colors.

    Usually the "Critical" bit together with preposterous egos is usually the problem within most organisations. Nobody dares to tell the guru he's wrong. And no manager is ever rewarded for solving difficult problems, unless they can't be circumvented with loads of babble.

    I know.

  • Call their customer service sometime and you will lose all confidence.
  • How can anyone prove that the source code they are reviewing is the actual product being used? What government has that kind of resources anyway?
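
    One partial answer is reproducible builds: rebuild the reviewed source with a pinned toolchain and compare hashes against what actually ships. A minimal sketch, assuming the vendor supports deterministic builds (all paths, targets, and file names below are hypothetical):

    ```python
    # Minimal sketch of a reproducible-build check; everything here is
    # hypothetical and assumes the vendor's build is deterministic.
    import hashlib
    import subprocess

    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Rebuild the audited source tree with a pinned, documented toolchain.
    subprocess.run(["make", "-C", "reviewed-source", "release"], check=True)

    built = sha256("reviewed-source/build/product.bin")
    shipped = sha256("shipped/product.bin")
    print("match" if built == shipped else "MISMATCH: reviewed code is not what ships")
    ```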

  • Imagine a state where a drug company said that it would refuse to allow government health organizations to examine all aspects of its products before approving their sale.

    There must be a balance between security by obscurity and complete openness.
  • I want to make clear, for the majority of software I am strongly of the opinion that perfect knowledge of the source code should not allow an attacker any advantage because the security properties are invariant to the implementation. For a trivial example, you can review the libOTR or TrueCrypt code all day, but the confidentiality of my encrypted volumes rests on the underlying cryptographic ciphers and my ability to keep the password a secret.
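
    A minimal sketch of that point, assuming the third-party 'cryptography' package (pip install cryptography): the code below can be completely public, yet confidentiality still rests only on keeping the key secret.

    ```python
    # Minimal sketch of Kerckhoffs' principle. Assumes the third-party
    # 'cryptography' package (pip install cryptography). The code can be fully
    # public; only the key needs to stay secret.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()                 # the only secret in the scheme
    ciphertext = Fernet(key).encrypt(b"nothing up my sleeve")

    # An attacker with this source file and the ciphertext learns nothing
    # useful without the key.
    print(Fernet(key).decrypt(ciphertext))
    ```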

    But I actually agree with Symantec that AV is a unique exceptio
