Wikimedia Foundation Enables HTTPS For All Projects 69

An anonymous reader writes "The Wikimedia Foundation has enabled HTTPS for all of its projects (Wikipedia, Wikimedia Commons, etc.), to enable secure log-in and browsing privacy. Their blog post goes into detail about how the service is configured, linking to configuration files and implementation documentation. It also mentions that HTTPS Everywhere will have updated rules for this change soon."
This discussion has been archived. No new comments can be posted.


  • What (Score:1, Troll)

    by stephanruby ( 542433 )

    And this is news???

  • by Anonymous Coward

    We had a recent story 2 weeks ago (http://tech.slashdot.org/story/11/09/20/1833232/hackers-break-browser-ssltls-encryption [slashdot.org]) warning us that anything less than TLS 1.1 (aka SSL 3.2) is easily decrypted, but that TLS 1.1 and TLS 1.2 (aka SSL 3.3) aren't widely adopted by servers OR web browsers.

    So the question is: does Wikimedia use TLS 1.2 (or 1.3), or are they trying to lull people into a false sense of security?

    • en.wikipedia.org is currently using TLS 1.0.
      • by chill ( 34294 )

        Chicken, meet egg.

        What browser are you using that supports TLS 1.1 or 1.2? IE 8 doesn't. I don't know about 9. Firefox doesn't -- it depends on OpenSSL and the release version of that product doesn't support TLS > 1.0.

          1. Both IE8 and IE9 do support TLS1.1 if run on an OS that supports it (read: Vista, 7, or server editions of those)
          2. Windows 7 brings support for TLS1.2
          3. Firefox doesn't use OpenSSL, it uses Mozilla's NSS
          4. Newest OpenSSL does have support for TLS1.2, albeit testing only
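The version back-and-forth above can be checked empirically with nothing but the Python standard library. A minimal sketch (requires Python 3.7+ for `ssl.TLSVersion`; the Wikipedia hostname in the comment thread is used only as an example):

```python
import ssl

# What the local TLS library supports (informational)
print(ssl.OPENSSL_VERSION)

# A client context that refuses anything older than TLS 1.2,
# sidestepping the TLS 1.0 weakness discussed above.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# To see what a real server actually negotiates (network required):
#   import socket
#   with socket.create_connection(("en.wikipedia.org", 443)) as raw:
#       with ctx.wrap_socket(raw, server_hostname="en.wikipedia.org") as tls:
#           print(tls.version())  # e.g. "TLSv1.2" or "TLSv1.3"
```

With a context configured this way, a handshake against a TLS 1.0-only server simply fails instead of silently downgrading.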
  • by waddgodd ( 34934 ) on Monday October 03, 2011 @11:16PM (#37596002) Homepage Journal

    Of course, wait until after the persistent TLS1.0 connection bug gets exploited. Because, you know, nothing says "we care about security" quite as much as making available an exploited protocol.

  • by Frosty Piss ( 770223 ) * on Monday October 03, 2011 @11:18PM (#37596006)

    Sure. When I look up "Dog Poop Girl [wikipedia.org]" I need to make sure the government isn't tracking it...

  • by Anonymous Coward on Monday October 03, 2011 @11:19PM (#37596016)

    It only takes one CA being compromised to compromise THE ENTIRE SYSTEM of TLS / SSL...
    DigiNotar.
    Additionally: *.* cert... <- WTF, whose brilliant idea WAS that feature?!

    Fact: The biggest problem with the CA system is that any CA can create a cert for ANY DOMAIN even if the domain owner doesn't request the cert first.

    Thus, EVERY CA must be 100% secure 100% of the time. TLS / SSL isn't a system that has a single point of failure... It's a system that has many hundreds of points of failure, any one of them enough to cause the whole trust model to fall apart like so many cards stacked in the shape of a house.

    Your browser probably doesn't trust DigiNotar, but does it trust CNNIC?
    http://yro.slashdot.org/story/10/02/02/202238/mozilla-accepts-chinese-cnnic-root-ca-certificate

    FF: Tools/Edit > Options/Preferences > Advanced > Encryption > View Certificates

    You trust ALL OF THESE?! Well, enjoy your security theater suckers.

    • So could we do something similar to SPF/DomainKeys? You create a public key that you advertise via DNS and require the private key to be uploaded to the CA to get the certificate? That would ensure only the domain owner could request the SSL certificate.
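The DNS-advertised-key idea floated in this comment is essentially what later shipped as DANE (RFC 6698, TLSA records). A hedged sketch of the hash a domain owner would publish — the zone name, the usage/selector/matching-type fields, and the certificate bytes below are all illustrative placeholders, not anything from the article:

```python
import hashlib

def tlsa_cert_hash(cert_der: bytes) -> str:
    """TLSA matching type 1: SHA-256 over the full DER-encoded certificate."""
    return hashlib.sha256(cert_der).hexdigest()

# Hypothetical zone entry: usage 3 (pin this exact end-entity cert),
# selector 0 (full certificate), matching type 1 (SHA-256).
cert_der = b"...DER bytes of the server certificate..."
record = f"_443._tcp.example.org. IN TLSA 3 0 1 {tlsa_cert_hash(cert_der)}"
print(record)
```

A validating client would then refuse any certificate, CA-signed or not, whose hash doesn't match the record served from the (DNSSEC-signed) zone.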

      • How much do you trust your DNS registrar?

        Not to mention, DNS can be spoofed (maybe someday it won't be, but ultimately you have to trust someone).
        • by dkf ( 304284 )

          How much do you trust your DNS registrar?

          What's more, if your DNS registrar is crooked (or broken into), you're stuffed because you can't go to someone else nearly as easily as with a CA.

    • by Anonymous Coward

      I just love reading the same post, more or less, in every article.

      • by bertok ( 226922 )

        Because he's got a good point that the internet community has been ignoring until the Diginotar fiasco. It wasn't that obvious a problem for most people, it was just one of those things that happens "behind the scenes", and nobody except some paranoid security researchers cared.

        But really, think about it for a second: why are we allowing country-specific CAs to issue certificates for a TLD other than their country TLD?

        I can understand why a Root CA certificate doesn't have any restrictions in it (that wou

        • But really, think about it for a second: why are we allowing a country-specific CAs to issue certificates for a TLD other than their country TLD?

          What are the non-country-specific CAs, then? Every company is registered in some country. Being registered in the US doesn't make it less "country-specific".

          Unless you propose to eliminate the gTLDs, I don't see why only some CAs would have the power to sign for them.

    • by phantomfive ( 622387 ) on Tuesday October 04, 2011 @01:18AM (#37596394) Journal
      You do realize that this has been a problem from the beginning, right? If you sound surprised, it's only because you only recently started paying attention.

      In practice, there are multiple layers of security, and this is just one of them.
      • by bertok ( 226922 )

        In practice, there are multiple layers of security, and this is just one of them.

        There really isn't. For web SSL/TLS, there's exactly one layer of trust: the certificate authorities.

        There's no other check that the browser performs. If a trusted CA signed a cert, it hasn't expired, and it's not in a revocation list (maintained by the CA), then it's OK.

        That's it.
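Those checks really are the whole client-side story. As a rough sketch of just the expiry leg, using the dict format Python's `SSLSocket.getpeercert()` returns (the cert dict here is synthetic):

```python
import ssl
import time

def cert_still_valid(cert: dict) -> bool:
    # 'notAfter' arrives in the stdlib's format, e.g. 'May  9 00:00:00 2025 GMT'
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    return time.time() < expires

# Chain building, CA signature verification and revocation lookups all
# happen inside the TLS library, against exactly the trusted-root list
# the commenters above are worried about.
print(cert_still_valid({"notAfter": "May  9 00:00:00 2999 GMT"}))  # prints True
```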

        • There's no other check that the browser performs.

          My browser has Perspectives [mozilla.org] and Certificate Patrol [mozilla.org]. This way I know if other network locations are seeing the same cert that I'm seeing, and whether that cert's changed recently.
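Pinning add-ons like Certificate Patrol boil down to remembering one fingerprint per host and flagging changes. A minimal in-memory sketch (the function names and store are made up for illustration):

```python
import hashlib

def fingerprint(cert_der: bytes) -> str:
    # SHA-256 over the DER-encoded certificate, as shown in browser cert UIs
    return hashlib.sha256(cert_der).hexdigest()

def cert_changed(seen: dict, host: str, cert_der: bytes) -> bool:
    """Return True if `host` previously presented a different certificate."""
    fp = fingerprint(cert_der)
    previous = seen.get(host)
    seen[host] = fp  # remember the latest fingerprint for next time
    return previous is not None and previous != fp
```

First sight of a host returns False; a later, different cert returns True — exactly the "has this cert changed recently?" signal the add-on gives, independent of which CA signed either cert.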

        • In addition to what the other poster has mentioned, you can also verify visually that the website you are connecting to is the one you expect. You can use temporary credit card numbers. No method of security is perfect, but if you thought the CA system was, then you were fooling yourself. All security breaks down at trust.
      • Yes, but this is the layer which causes end users' browsers to throw a yellow screaming fit if they try to use an encrypted connection outside of the CA club.

      • In practice, there are multiple layers of security

        In a normal SSL web browser configuration there are exactly two layers of security, SSL and the security of the underlying network you are using. Break both of those and you can set up as a man in the middle and sniff the user's data.

        You do realize that this has been a problem from the beginning, right?

        However it is a problem that has got worse over time for several reasons.

        Firstly, the list of trusted CAs has been ever-growing, both through the addition of root certs to browsers and through the issuance of "intermediate certs" by the existing CAs. How many people know that the

    • by intiha ( 1646093 )

      http://yro.slashdot.org/story/10/02/02/202238/mozilla-accepts-chinese-cnnic-root-ca-certificate [slashdot.org]

      FF: Tools/Edit > Options/Preferences > Advanced > Encryption > View Certificates

      You trust ALL OF THESE?! Well, enjoy your security theater suckers.

      Just checked this out... Damn, I have a gazillion cert authorities from all over the world, in languages I can't even recognize. So What THE F**K should we do? Is there any reliable tool for deciding which cert authorities to keep trusting, so the green icon in FF still means something? Please, someone tell me (is there a FF extension that can help weed out the unsavory CAs?). This is a shame, since ordinary people were finally getting the message that "Look at the green icon/key/lock before you trust a website", and now that security has proven

  • Fixed link (Score:4, Informative)

    by subreality ( 157447 ) on Monday October 03, 2011 @11:29PM (#37596046)
  • by Anonymous Coward on Monday October 03, 2011 @11:57PM (#37596142)

    Whoa, this is an incredibly neat deed for many wiki-editors out there, including myself. Ever since a neighbouring government, through whose networks all my foreign-bound data passes, decided to start reading all my IP traffic [wikipedia.org] to build a comprehensive sociogram of my beliefs, affiliations and interests, I became increasingly paranoid and afraid of expressing myself online on foreign sites. I tried using secure.wikimedia.org, but the site had unsatisfactory stability and responsiveness compared to the unencrypted site. So I just continued using the unencrypted site, but avoided sensitive topics.

    I hope this decision finally enables us to use Wikipedia even for editing sensitive topics, and more importantly hiding our wiki-identity from the government. Kudos to the Wikimedia technical team, you are doing a great job!

  • Public trust in the security of HTTPS and SSL certificate authorities is at a literally unprecedented level right now.

    • If the choice is being exposed to a passive sniffer vs being exposed to those prepared to perform man in the middle attacks (which carry a far higher risk of getting caught for the attacker) I'd go for the latter.

  • Now I have to remember my damn wikipedia password.

  • by Anonymous Coward on Tuesday October 04, 2011 @02:38AM (#37596656)

    So, when will slashdot follow? Currently https://slashdot.org just redirects to http://slashdot.org

    • Good question. As a geek site, Slashdot should be a pioneer in these things. Full Unicode character support has also been missing for a long time. The box for notification messages on the front page feels a bit old too, something like the Facebook globe icon could be more sleek. Different color themes. Things like that.
    • It's supported, but only available to subscribers. If you're not logged on as a subscriber, it redirects you to the insecure version.

      Nice touch, eh?

  • How much extra juice does it take for masses of GMail and Wiki and Facebook servers to do the work to encrypt all this data (plus the end-user machines)?
    • by heypete ( 60671 ) <pete@heypete.com> on Tuesday October 04, 2011 @03:08AM (#37596720) Homepage

      Not much [imperialviolet.org]:

      In January this year (2010), Gmail switched to using HTTPS for everything by default. Previously it had been introduced as an option, but now all of our users use HTTPS to secure their email between their browsers and Google, all the time. In order to do this we had to deploy no additional machines and no special hardware. On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10KB of memory per connection and less than 2% of network overhead. Many people believe that SSL takes a lot of CPU time and we hope the above numbers (public for the first time) will help to dispel that.

      • by Threni ( 635302 )

        Perhaps the browsers can soon start warning users that they're about to visit an insecure site and ask if they wish to continue?

        • I seriously hope not. SSL adds latency to the connection and is completely useless for a huge number of websites. Why would I need SSL to access, e.g., a recipes page which doesn't even have a login page?

          • by vlm ( 69642 ) on Tuesday October 04, 2011 @07:22AM (#37597546)

            I seriously hope not. SSL adds latency to the connection and is completely useless for a huge number of websites. Why would I need SSL to access, e.g., a recipes page which doesn't even have a login page?

            You want to cook a non-Halal recipe in a Halal nation where improper religious observation will get you killed? Really simple example would be looking up mixed-drinks cocktails in Saudi Arabia...

          • by Anonymous Coward

            Your reasoning is valid when the only goal is to protect sensitive information, such as banking details or what kind of horse-porn you enjoy.

            When looking at the bigger picture, simply seeing the encrypted stream and where it's headed is enough to build a reasonably accurate image of your online habits. I recently heard Stefan Burschka talk about his work with mining encrypted VOIP traffic to determine the content - he does this with an accuracy of 66% per sentence just by knowing the timing between packets

          • by Threni ( 635302 )

            > Why would I need SSL to access, e.g., a recipes page which doesn't even have a login page?

            No no, you've got it backwards. Why shouldn't all communication be secure and encrypted? Latency isn't important unless you're gaming; certainly not the minuscule 1% or whatever it is SSL costs. It's no-one's business which sites you go to and what you do when you get there.

            • HTTPS doesn't prevent people from knowing which sites you go to: not only does an observer have the server's IP, but SNI [wikipedia.org] means the domain name is sent in cleartext so the server can present the right certificate.

              The domain isn't irrelevant, either, when a single webpage loads content from a bunch of different domains, which is extremely common nowadays.
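The cleartext-SNI claim is easy to confirm without touching the network: the stdlib's memory BIOs let you capture the raw ClientHello bytes a client would put on the wire (a sketch; note that in TLS 1.3 the rest of the handshake is encrypted, but the SNI name still is not unless Encrypted Client Hello is in use):

```python
import ssl

ctx = ssl.create_default_context()
incoming = ssl.MemoryBIO()   # bytes "from the network" (empty here)
outgoing = ssl.MemoryBIO()   # bytes the client would send on the wire
tls = ctx.wrap_bio(incoming, outgoing, server_hostname="en.wikipedia.org")

try:
    tls.do_handshake()       # cannot complete: no server is answering
except ssl.SSLWantReadError:
    pass                     # expected: the client is waiting for a reply

client_hello = outgoing.read()
# The SNI extension carries the hostname as plain ASCII:
assert b"en.wikipedia.org" in client_hello
```

Anyone sniffing the wire sees the same bytes, which is why encrypting the payload alone doesn't hide *which* site you visited.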

    • by ledow ( 319597 )

      Nowhere near 0.1% of the extra power, bandwidth, etc. caused by compromised machines, virus-ridden-spam-hosts, etc. that *can* be caused by visiting sites that you *think* are Google, Wikipedia, Facebook etc. when in fact they're not.

      And anybody who worries about the environment impact of an SSL enablement really needs to go see the size and power requirements of the average ISP's datacentre, let alone someone like Google.

      The problem with greenies is *not* their intentions, it's the fact that they choose TO

  • I can imagine that the other CAs compromised by the Comodo hacker [slashdot.org] have made Wikipedia an offer it couldn't refuse.
  • I was wondering: how much stress does enabling HTTPS on a huge site like Wikipedia put on a modern web server? IIRC this was one of the reasons Facebook took quite a while to enable SSL for their users.

    • It depends on your network infrastructure, especially how CPU-intensive your content is already.

      Google's numbers are a 2% increase in network traffic, a 1% increase in CPU usage, and 10 KB of RAM per connection. Your network numbers will go up if you've got SSL frontend servers talking to content backend servers (Wikipedia's solution), while your CPU numbers will go *way* up if most of what you're serving is static content (this is where SSL's reputation as a CPU hog came from; these days, almost everyone serv

  • I use HTTPS Everywhere, but it sends me to an experimental search page for Google that lacks the standard tabs. I mostly want standard tabs, so this is annoying.

  • Encrypted connections can't be cached by a proxy, unless the proxy acts as a man-in-the-middle. While this is popular at many companies, I don't see a lot of support for your ISP doing it.

    SSL Everywhere, if successful, will be the death of caching. Is that a good thing?

  • Considering all of the compromised SSL certificates, you may not be any more private with this change than before.
  • Nice to have Wikipedia running on the Latvian version of the standard HTTP protocol (https). But then again, I am an Estonian who commented on the item 2 days late.
