Government Privacy Security United States

US Intelligence Agencies Tried To Bribe Our Developers To Weaken Encryption, Says Telegram Founder (twitter.com) 135

In a series of tweets, Pavel Durov, the Russian founder of the popular secure messaging app Telegram, has revealed that U.S. intelligence agencies twice tried to bribe his company's developers to weaken encryption in the app. The attempts, Durov said, happened last year during the team's visit to the United States. "During our team's 1-week visit to the US last year we had two attempts to bribe our devs by US agencies + pressure on me from the FBI," he said. "And that was just 1 week. It would be naive to think you can run an independent/secure cryptoapp based in the US."

Telegram is one of the most secure messaging apps available today, though researchers have pointed out flaws in it as well.
This discussion has been archived. No new comments can be posted.

  • Don't trust US (Score:5, Informative)

    by qbast ( 1265706 ) on Thursday June 15, 2017 @11:03AM (#54626109)
    Keep that in mind. If you are using VPN/encryption tool/secure communication network/etc. created by US based company, it is very unlikely that it is actually secure.
    • Re:Don't trust US (Score:5, Insightful)

      by Anonymous Coward on Thursday June 15, 2017 @11:29AM (#54626301)

      Before PGP was released, encryption standards allowed a company to use encryption that couldn't be broken by an individual or another company, but it had to be weak enough that the NSA, CIA, etc. could break into it. When PGP was released, companies could go against this and make their encryption as strong as they wanted. As a side note, they tried to prosecute the creator of PGP for violating the Arms Export Control Act, but were unable to since he put the code online for free and never sold it.

      The thing we are seeing now is the government is either trying to scare companies into giving them the information or bribing the developers into making the encryption weaker.

      • by Anonymous Coward

        Actually, putting the source code online would have violated US law. Selling it was just fine so long as it didn't cross borders as computer code, which is why the complete source code was published as a printed book, with OCR-friendly formatting; see the introduction at http://www.mit.edu/~prz/EN/essays/BookPreface.html.

    • by Cajun Hell ( 725246 ) on Thursday June 15, 2017 @11:29AM (#54626305) Homepage Journal

      It's not really about the US; the US government's behavior is merely helping to illustrate the deeper errors made by the users.

      If you are using VPN/encryption tool/secure communication network/etc. created by US based company, it is very unlikely that it is actually secure.

      More generally:

      If you are using an app created by a company, which is only compatible with itself rather than complying with a public spec, it is very unlikely that it is secure. (It's also pretty unlikely that it won't suck in other ways too.)

      Stop talking about apps, and start talking about protocols. Answer the "which of these apps works best for me?" question later, after protocol selection. If Telegram doesn't work with anything except Telegram, then you can be pretty sure that Telegram is the wrong choice.

      • by qbast ( 1265706 ) on Thursday June 15, 2017 @11:36AM (#54626377)
        No, it is in fact very much about the US and the willingness of its government to strong-arm developers into crippling their crypto solutions. All the compliance with a public spec won't help you when developers have been pressured into introducing a subtle bug that makes the encryption easier to break, or into modifying their official binary builds by adding a piece of code that stores your private key somewhere.
      • by MobyDisk ( 75490 ) on Thursday June 15, 2017 @11:57AM (#54626573) Homepage

        Stop talking about apps, and start talking about protocols.

        This is the problem with computing and the internet over the last 10 years. We switched from developers saying "I want to create a protocol that does X, and I'll make the first app that implements it" to developers saying "I want to sell ads, so I'll make a proprietary app that does X, and refuse to open it up to other developers." It's the pre-1983 IBM -vs- Compaq mentality.

        • yeah i suppose they wanted people to use their software and get paid.

          what bastards!

        The problem is that open protocols don't bolster the quarterly results. This is why so many websites that would be best served by HTML5 or something similar go with apps instead. Plus, since most apps want every permission under the sun, it's another way to slurp data to sell to whomever has the cash, or to find another way to throw ads as alerts. If it were not for the fact that we have multiple PC web platforms with different languages for coding in, I wouldn't be surprised to see sites require vi

          • HTML5 is a standard. Complaining about that is like saying "best used with Blowfish or SHA-512" would be bad.
          • by MobyDisk ( 75490 )

            Open standards have the power to break monopolies and make a lot of people a lot of money. For 25 years, open languages, protocols, and specifications caused the PC industry to grow exponentially. IBM lost its hold; DEC died. Apple maintained its hold, but was a small player until it created the iPod. Why won't the same thing work now? Why can't we have, instead of one Facebook, dozens of Facebooks all using the same protocol? In the past, whoever invented the protocol received the first mover b

    • by jellomizer ( 103300 ) on Thursday June 15, 2017 @11:36AM (#54626373)

      Also don't trust, Russia, China, Europe, India, Middle East, East Asia...
      That is why I get all my software from Antarctica, those penguins just don't care about political pressure and they do just what they do.

      However, encryption from a bad-actor state is still better than no encryption.
       

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Thursday June 15, 2017 @11:37AM (#54626381)
      Comment removed based on user account deletion
    • Re: (Score:3, Interesting)

      by Anonymous Coward

      Is there any actual proof that these accusations are true? Or are we supposed to automatically believe everything a Russian company says, just like every statement made by Putin is the truth and nothing but the truth? And since this is a Russian company, they are required by Russian law to hand over their encryption keys and any other information the state security agencies ask them for, or they cannot operate in Russia. And if the accusations are true, then at least we have a couple of government agencies doing their jobs.

      • by Anonymous Coward

        Agreed. Consider the source: a Russian app maker, with a Russian app company where there's a good chance of a backdoor for the Russian spy agencies, decides to make some Russian propaganda about US spy agencies.

        If you can't attack a country by force, attack it via propaganda. This worked well in Vietnam, and worked with Iraq 2 and Iran. Daesh, even with all their losses, is still going strong, just because they are masters of propaganda.

        Wonder when the US and Europe will learn to not allow the hearts and minds of

    • by mbkennel ( 97636 )
      And of course the FSB would never, ever ever think to """bribe""" a Russian-based employee into a similar arrangement.
      • I think as an American, less harm would come to me from the Rooskies snooping on my online activities (if I were doing anything worth snooping on, of course).

        It's not like the Russians would voluntarily share anything with the US government. In fact they'd make a point not to. Now if I were a Russian, living in Russia, I'd be very cautious about using any kind of service based in Russia.

    • FWIW, I wrote and published this steganography/cryptography app:

      http://mangocats.com/stegamail... [mangocats.com]

      and got it registered/approved via "the system." Granted, I did follow all the rules (the encryption is only 56 bits), but nobody ever questioned the truth of that claim, nor requested any technical information beyond the simple claim of conformance.

      I'm sure if it were more widely used, like Twitter or something, it would come under closer scrutiny. But, there are two points here:

      First - if you don't use mainstream

  • by SuperKendall ( 25149 ) on Thursday June 15, 2017 @11:09AM (#54626155)

    Step 1) Create messaging app with no users but strong encryption.

    Step 2) Profit from government payoffs!

    Step 3) ENDLESS PROFIT

    • That's a better business plan than some companies out there.
    • Did this, the government doesn't talk to you, much less offer payment, until you have users.

      • This is 2017 man, you don't need actual users, just a trend-line and PROJECTED users.

        If you can't ramp up a nice trend line from India for under $100 well I don't know what to tell you.

        Heck if you spend the $100 to buy users from Russia THAT trend line will have the government BEGGING to give you money to weaken encryption.

  • by davidwr ( 791652 ) on Thursday June 15, 2017 @11:11AM (#54626169) Homepage Journal

    It would be naive to think you can run an independent/secure cryptoapp based in the US.

    Published source makes it a lot easier to spot problems with the code.

    Also, with published source code you can, with the appropriate license, legally recompile it yourself using your own set of tools as a hedge against the publisher's tool-chain or binary-repository being compromised.

    Granted, if your tools (anything from the bare metal on up) are compromised, or if you are using it to talk with someone else who is using a different binary, all bets are off.
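
    A minimal sketch of that hedge, assuming the project's build is reproducible (i.e. deterministic) and using made-up file paths: build the published source yourself, then compare hashes against the binary the publisher ships.

      import hashlib

      def sha256_of(path):
          h = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(65536), b""):
                  h.update(chunk)
          return h.hexdigest()

      # Hypothetical paths: one binary compiled locally from the published source,
      # one binary downloaded from the publisher's repository.
      local = sha256_of("build/cryptoapp")
      published = sha256_of("downloads/cryptoapp")

      print("match" if local == published else "MISMATCH: shipped binary was not built from this source")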

    • by beelsebob ( 529313 ) on Thursday June 15, 2017 @11:15AM (#54626203)

      Published source makes it a lot easier to spot problems with the code.

      No it doesn't. It has been shown repeatedly that the idea that thousands of people will look at code and magically spot bugs is a myth.

      In practice, people either 1) don't look at the code, or 2) don't have the domain knowledge to know what that very specific function is doing.

      In reality, only the person who wrote it and the one or two people who reviewed it really understand what's going on, and often not even the people who reviewed it.

      • ^ This.

        Try onboarding a new dev into a framework...

      • by alvinrod ( 889928 ) on Thursday June 15, 2017 @11:28AM (#54626293)

        No it doesn't. It has been shown repeatedly that the idea that thousands of people will look at code and magically spot bugs is a myth.

        You don't need thousands of people doing that, and if you had closed code and paid for an audit of it, the auditors wouldn't do that either. But it is by definition easier for me, you, or anyone who actually cares to evaluate open source code because we actually have access to the code.

        If you wanted to evaluate it really well, what I'd suggest doing is creating a set of test cases prior to even looking at the code. If something gets caught by some simple black-box tests, it's obviously not very good. Better yet, open source your test cases so they can be reused and built upon. But since you can access the code, you could also analyze it from a white-box perspective with the test cases and look for any branches or paths that the test cases didn't cover, which may be sources of bugs or intentional tampering.

        Hell, if you want cheap labor, just have an instructor make it a project for a testing or cryptography class. It gives the students something a little more real to work with, as well as the opportunity to get involved with an open source project.
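
        A minimal black-box sketch along those lines, in Python; encrypt and decrypt here are hypothetical stand-ins for whatever interface the app under test actually exposes:

          import os

          def black_box_checks(encrypt, decrypt):
              """Treats encrypt(key, msg) and decrypt(key, ct) as opaque functions."""
              key1, key2 = os.urandom(32), os.urandom(32)
              msg = b"attack at dawn"

              # Round trip must recover the plaintext.
              assert decrypt(key1, encrypt(key1, msg)) == msg

              # A different key must not yield the plaintext (refusing to decrypt is fine too).
              try:
                  wrong = decrypt(key2, encrypt(key1, msg))
              except Exception:
                  wrong = None
              assert wrong != msg

              # Encrypting the same message twice should give different ciphertexts,
              # i.e. the scheme should use a fresh nonce/IV each time.
              assert encrypt(key1, msg) != encrypt(key1, msg)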

        • What the open code can do is expose backdoors, if they're not cleverly hidden.

          If a crypto app is too long for a thorough review, it's poorly written and probably is hiding a backdoor somewhere.

      • by zifn4b ( 1040588 )

        No it doesn't. It has been shown repeatedly that the idea that thousands of people will look at code and magically spot bugs is a myth.

        If you have bad reviewers, you get bad reviews. Garbage in/garbage out. With quality reviewers, you get quality results. It's a qualitative problem not a quantitative problem.

      • by nine-times ( 778537 ) <nine.times@gmail.com> on Thursday June 15, 2017 @11:38AM (#54626397) Homepage

        Also, it's possible to disguise malicious code to look like it's doing something else (e.g. The Underhanded C Contest [wikipedia.org]). It's entirely possible that intelligence agencies try to insert these kinds of things into open source projects.

        But I don't think that was davidwr's point. I take the statement "Published source makes it a lot easier to spot problems with the code." to be pointing out that it's much more difficult to identify weaknesses if you're provided a compiled binary, as opposed to having access to the source code. It's not that open source code is a guarantee that someone will spot bugs, but with closed source, you're completely at the mercy of the original developer.
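
        A toy Python illustration of how innocent-looking code can quietly weaken crypto (purely hypothetical, not from any real project): both functions below claim to generate a 256-bit key, but only one of them is safe.

          import random
          import secrets

          def make_key_looks_fine():
              # Reads like routine key generation, but random is a seeded
              # Mersenne Twister: its output is predictable and reconstructible.
              return bytes(random.randrange(256) for _ in range(32))

          def make_key_actually_fine():
              # Draws from the operating system's CSPRNG.
              return secrets.token_bytes(32)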

          • There was an attitude in the past that security was best if the code was closed and hidden, i.e. "security through obscurity." However, good cryptography does not rely upon the secrecy of the methods used, and in the past a lot of things assumed to be secure were actually very sloppy. And we're still making similar mistakes today. Keeping the source closed and hidden is just going to make most knowledgeable people assume there is some security through obscurity going on.

            These days you can get FIPS certification

          • As a company you can basically describe what you're doing (opening the kimono) to the experts and prove that you're doing things securely.

            And what if they leave some things out of that description?

            Because if we're talking about intelligence agencies compromising developers and having them weaken security in their own products, it doesn't seem like they'd then disclose that weakness to security experts. I posited in the previous post that an intelligence agency might try to get some innocent-looking malicious code into an open source project. So what if that intelligence agency had the cooperation of the developer of a closed source project?

            • You don't have to stop with FIPS, you can go above and beyond. Nothing is preventing a higher standard of paranoia.

              The standard is there for the government to use itself - it wants to protect information on its own devices, making sure that its own employees are not using equipment that is easy to snoop on. Because the government does use third party software and equipment, including open source. Of course, the government could actually compromise itself and weaken its own security, so that if a government

      • by Anonymous Coward

        But if the code is closed, it's impossible. How is that better?

        • It's not - the point is it's not worse either.

          • If that was the point then you have no point.
            • No, if that was the point, then the parent poster had no point.

                • Don't be confused by your +5 rating on your initial comment [slashdot.org]. It is a phenomenally fucking stupid thing to say. Plenty of highly qualified people have reviewed the source for various implementations of algorithms when the source code is available. It doesn't matter if there are "only a few," as you put it. What matters is that it is a well-qualified few. There are people who know their shit, and those people have reviewed, for a single example, TrueCrypt [wikipedia.org], and those audits [wikipedia.org] matter. You want to claim that
                • Your comment assumes that no one ever audits closed source implementations of encryption code.

                  You may well be right in the case of random implementations by random small shops (in fact, you probably are), but that really just tells you the well known trope "never implement your own cryptography".

                   In the case of the implementations built into {mac|i|tv|watch}OS, Windows, Android etc.; for all that there are potentially bribable people writing those implementations, I'll take them any day over any open source implementation.

                    • There were no glaring, obvious insecurities in OpenSSL, but you also don't seem to understand Microsoft at all, for example. With FOSS, vulnerabilities are patched post-haste. With Microsoft it is "thanks for the feedback. We might fix it someday, but then again probably not." I hope you figure out how all this really works someday, but until then just STFU and stop weakening the security landscape with misinformation. Thanks in advance on behalf of real security experts everywhere.
      • often not even the people who reviewed it.

        I don't know about you but as a software developer, sometimes I don't even understand how my code works.

      • And yet closed source is even worse! How do you audit their code, do you just take the company at its word?

      • Counterpoint: https://github.com/google/oss-... [github.com] Google is fuzzing a whole bunch of open source projects and filing bugs.
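
        A minimal sketch of what such a fuzz harness looks like, using Google's atheris fuzzer for Python; mycryptolib and parse_message are hypothetical stand-ins for the code under test:

          import sys
          import atheris

          with atheris.instrument_imports():
              import mycryptolib  # hypothetical library under test

          def TestOneInput(data: bytes):
              try:
                  mycryptolib.parse_message(data)
              except ValueError:
                  pass  # malformed input is expected; crashes and hangs are the bugs we want

          atheris.Setup(sys.argv, TestOneInput)
          atheris.Fuzz()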
    • Published source makes it a lot easier to spot problems with the code.

      Demonstrably false in most circumstances. Just because the code is available does not mean competent people are looking at it and finding bugs. It would be safe to say most open source programs are not being looked at by a lot of eyes beyond the primary developers. You need more than published source code to make it easy to spot bugs. Heck a lot of code is so badly written that it would be easier to re-write from scratch than to find a bug in it.

    • Published source makes it a lot easier to spot problems with the code.

      Who's to say that what is published is what is actually under the hood?

    • Published source makes it a lot easier to spot problems with the code.

      It makes it possible, not easier. When you're looking at the math, it's the next best thing to impossible to notice a weakening of the crypto unless you're a crypto expert. And even if you are a crypto expert, it's not an easy thing to spot.

  • Telegraph founder Samuel Morse is still alive?!

    https://en.wikipedia.org/wiki/Samuel_Morse [wikipedia.org]

  • by Gravis Zero ( 934156 ) on Thursday June 15, 2017 @11:23AM (#54626263)

    If the NSA failed to bribe their developers, it doesn't mean they are just going to give up. A bribe is just the most cost-effective solution for the long term. Have no doubt that they will seek out, or maybe even create, a weakness in the application.

  • For real? (Score:5, Insightful)

    by Corbets ( 169101 ) on Thursday June 15, 2017 @11:24AM (#54626271) Homepage

    While I wouldn't be terribly surprised if the various three letter agencies try this... would they really be stupid enough to let him know where they were from? It's not like they would have appealed to the Russian's sense of patriotism for the US.

    On the other hand, this sort of publicity could drive users to his product, providing a motive to lie.

    Methinks that we should remain a bit skeptical on this one.

    • by Anonymous Coward

      Where else would they be from? The dairy industry? They were offered bribes, AND got intimidated by the FBI, it was the U.S. gov alright. Wake up dude.

  • by Anonymous Coward

    I mean, here we got this Russian guy, and then there's Trump, probably offering to buy him out, just as soon as he pays off his other Russian debts.

    The czech is in the mail

  • OpenKeychain (Score:5, Insightful)

    by wasteoid ( 1897370 ) on Thursday June 15, 2017 @11:28AM (#54626291)
    Manage your own asymmetric PGP keys and encryption while using any messaging app. Not as integrated as Telegram or other streamlined apps, but secure communication is possible; you just need to take a few extra steps.
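
    A rough sketch of those extra steps, assuming GnuPG plus the python-gnupg wrapper and a recipient key already imported into your keyring (the address below is a placeholder):

      import gnupg

      gpg = gnupg.GPG()  # uses the default ~/.gnupg keyring

      message = "meet at the usual place"
      encrypted = gpg.encrypt(message, "friend@example.org")  # recipient key id or e-mail

      if encrypted.ok:
          print(str(encrypted))  # ASCII-armored blob, safe to paste into any messenger
      else:
          print("encryption failed:", encrypted.status)

    The armored output can be pasted into Telegram, WhatsApp, e-mail or anything else; only the holder of the matching private key can read it.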
  • by Nutria ( 679911 ) on Thursday June 15, 2017 @11:32AM (#54626343)

    Maybe. Or maybe you're just *saying* it to make yourself look better while bashing the US.

    How will we ever know for sure?

  • by Anonymous Coward

    There's no way any software of US origin can be trusted - particularly closed source. The US interferes in the affairs of most countries on earth, including in elections and referendums in my own nation. Now they also want to spy on personal communications, and bribe developers from countries that have less invasive surveillance policies. Time to completely isolate this pariah state, in my opinion.

  • by what about ( 730877 ) on Thursday June 15, 2017 @11:37AM (#54626391) Homepage

    Telegram is an alternative to WhatsApp or the equivalent service from Google.
    Hopefully European politicians will not be so dumb as to break it... (watch out for the fake "save the children" broadcast)

    Any person who does any "professional" work must consider the sharing of contacts, documents, and communications a breach of contract with the client.

    I am looking at what happens in the cellphone/tablet market and pray/hope that there will be a non-US-based company providing some reasonable platform.

    The amount of information that an Android phone shares with Google by default (same with Apple) is way too much: real Big Brother on steroids.

    • by Anonymous Coward

      Citation needed for the Apple part. Apple is a hardware company that respects your privacy; they don't collect data the way Google does. I've run Wireshark and Little Snitch and checked my firewall/proxy logs (which log all traffic). No signs of Apple sending my data to their servers for advertising purposes. No, I don't use the App Store.

      Google on the other hand, you are the product. They give you free stuff so they can sell you out to their advertising network.

    • by AHuxley ( 892839 )
      Re "Hopefully European politicians will not be so dumb to break it"
      Germany is working hard on that "Germany to pour cash into mass surveillance"
      http://www.dw.com/en/germany-t... [dw.com]
      "..BND says it needs much of the extra money - some 73 million euros over the next few years - to set up "Panos," a new project specifically aimed at decrypting messaging systems by finding weaknesses in the apps."
  • Rabbit hole... (Score:5, Insightful)

    by bradley13 ( 1118935 ) on Thursday June 15, 2017 @11:41AM (#54626423) Homepage

    This may be true, and he didn't accept the bribes; he may be saying this after accepting the bribes; he may be saying this as a publicity stunt; he may be saying this to deflect attention away from the backdoors already installed for the Russian government; he may be saying this because... ...we have no fricking idea. How deep down the rabbit hole do you want to chase your favorite conspiracy theory?

    Granted, it's hard to be prepared for all eventualities, but it sure would be nice if he had a recording of the meeting, and the words exchanged.

    • by Khyber ( 864651 )

      You hit exactly what I was thinking. I'd bet money this is a ploy to get more people to use the app, so that more spying can be done.

      Good thing I don't trust/use apps which require my phone number. That's none of their fucking business.

    • Yup. The problem now is - to what degree can you trust anything?

  • Is it true? (Score:5, Insightful)

    by GuB-42 ( 2483988 ) on Thursday June 15, 2017 @11:43AM (#54626455)

    While bribing developers to weaken encryption is most likely not above what intelligence agencies do, this could also be a PR move.
    By saying an intelligence agency attempted to bribe your devs, you imply that:
    - Your app is so secure that it can't be cracked by external means
    - Your company's standards are so high that bribes don't work
    - The government is watching and using unethical methods, so an app like the one you offer is needed
    - Competitors may have been bribed too, and if they aren't saying anything, they may have fallen for it

    Considering the flaws of Telegram, this may be just an attempt to make it feel more secure than it really is.

  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Thursday June 15, 2017 @12:01PM (#54626617)
    Comment removed based on user account deletion
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Technically, it's worth clarifying a few things:
      - It's RSA the security company (peddling their encryption solution), not RSA the algorithm (there continue to be no known problems with the RSA algorithm).
      - What they were paid the $10 mil for was to use the Dual_EC_DRBG random number generator (this is not a cipher, merely a random number generator for one).
      - The Dual_EC_DRBG algorithm was a faulty, backdoored standard certified by NIST for use in cryptographic software (NIST = the body that certifies cryptographic standards)
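
      For illustration only (this is not the actual Dual_EC_DRBG construction), a toy Python sketch of why a predictable or backdoored random number generator breaks otherwise-sound encryption: anyone who knows, or can recover, the generator's internal state can simply regenerate every "random" key it produced.

        import random

        seed_known_to_attacker = 1337  # stands in for a recoverable DRBG state

        rng = random.Random(seed_known_to_attacker)
        session_key = bytes(rng.randrange(256) for _ in range(16))

        attacker_rng = random.Random(seed_known_to_attacker)
        recovered_key = bytes(attacker_rng.randrange(256) for _ in range(16))

        assert recovered_key == session_key  # the cipher can be perfect; it won't matter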

  • Option 1: Could be Russian/Telegram propaganda.
    Option 2: Could be true because seriously, who trusts the FBI/NSA not to violate our privacy anymore?

    Really not sure what to believe about this one.
  • by account_deleted ( 4530225 ) on Thursday June 15, 2017 @12:20PM (#54626765)
    Comment removed based on user account deletion
  • Here in the US, intelligence agencies effectively OWN many tech companies and mandate these weaknesses in engineering to their trusted men (or women) in development teams. Of course I'm talking primarily about Google and Microsoft, and many others besides.

    conspiracy theory blah blah blah

    Too long an argument to defend here, but it might suffice to point out how everyone important is invested in every important business, and so this mutual investiture forms the mechanism for the 'crazy crazy impossible' conspiracies that many like to c
