

Google Chrome To Support Same-Site Cookies, Get Anti-Fingerprinting Protection (zdnet.com) 57

Google plans to add support for two new privacy and security features in Chrome, namely same-site cookies and anti-fingerprinting protection. From a report: The biggest change that Google plans to roll out is in regards to how it treats cookie files. These new controls will be based on a new IETF standard that Chrome and Mozilla developers have been working on for more than three years. This new IETF specification describes a new attribute that can be set inside HTTP headers. Called "SameSite," the attribute must be set by the website owner and should describe the situations in which a site's cookies can be loaded.
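Concretely, SameSite is just another attribute on an ordinary Set-Cookie response header. A typical header (cookie name and value illustrative) looks like:

```
Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=Lax
```

The attribute takes three values: Strict (the cookie is never sent on cross-site requests), Lax (sent only on top-level navigations that use safe methods such as GET), and None (sent everywhere, which Google's proposal pairs with a requirement that the Secure flag also be set).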

[...] Google engineers also announced a second major new privacy feature for Chrome. According to Google, the company plans to add support for blocking certain types of "user fingerprinting" techniques that are being abused by online advertisers. Google didn't go into details of what types of user fingerprinting techniques it was planning to block. It is worth mentioning that there are many, which range from scanning locally installed system fonts to abusing the HTML5 canvas element, and from measuring a user's device screen size to reading locally installed extensions.

This discussion has been archived. No new comments can be posted.

  • by DarkRookie2 ( 5551422 ) on Wednesday May 08, 2019 @04:13PM (#58560308)
    It is the antithesis of their business model.
    • Yeah, but pretending they do care and doing something in a 1/100th-assed way stalls others' work and, especially, legislation.

    • No it's not. (Score:3, Insightful)

      by Anonymous Coward

      Don't be dense. Google makes money from targeted advertising. These features limit their competitors' ability to get the demographic data, so that Google can become even more of a monopoly on an even more scarce resource: your data.

      In case that wasn't clear enough: Google still gets to know everything about you. It does not need cross-site cookies or browser fingerprinting to do so. Only Google's competitors in the information market rely on those features. Google is not doing this to give you a drop of privacy.

      • This. I came here to say pretty much the same thing: it's about competitive advantage as Chrome is just another Google product released to enhance Google's data quality. (I assume parent is at mod:0 because posted by an AC.)
    • This is obvious fake news.

      If Big Brother Google is involved in anything, then it is fully certain that thing is harmful to privacy, freedom, solidarity, and/or democracy.

    • by AmiMoJo ( 196126 )

      Privacy is core to Google's business model.

      Their advertising business is so valuable because of their ability to target ads well. If your data is not kept private, everyone gains that ability and their service is devalued.

      Thus, it is important for Google to protect your privacy.

  • So you don't hear incorrect speech.

  • About time (Score:5, Funny)

    by Ol Olsoc ( 1175323 ) on Wednesday May 08, 2019 @04:38PM (#58560416)
    Google should have been supporting Same-Sex cookies a long time ago!
  • by FudRucker ( 866063 ) on Wednesday May 08, 2019 @04:49PM (#58560472)
    It's a browser feature few users will have heard of, but forthcoming versions of Chrome, Safari and Opera are in the process of removing the ability to disable a long-ignored tracking feature called hyperlink auditing pings. From a report: This is a long-established HTML feature that's set as an attribute -- the ping variable -- which turns a link into a URL that can be tracked by website owners or advertisers to monitor what users are clicking on. When a user follows a link set up to work like this, an HTTP POST ping is sent to a second URL which records this interaction without revealing to the user that this has happened. It's only one of several ways users can be tracked, of course, but it's long bothered privacy experts, which is why third-party adblockers often include it on their block list by default. Until now, an even simpler way to block these pings has been through the browser itself, which in the case of Chrome, Safari and Opera is done by setting a flag (in Chrome you type chrome://flags and set hyperlink auditing to 'disabled'). Notice, however, that these browsers still allow hyperlink auditing by default, which means users would need to know about this setting to change that. It seems that very few do.
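    The ping attribute the summary describes is a single line of HTML (URLs here are hypothetical):

```html
<!-- When this link is followed, the browser also fires an HTTP POST
     to the URL in ping, with no visible effect for the user. -->
<a href="https://example.com/article"
   ping="https://tracker.example/collect">Read more</a>
```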

    from the article: Betas of Chrome 74 (which ships later this month) have dropped this flag, as has Opera which is built on the same Chromium engine and has shadowed the change in its developer builds.
    • Websites already control the links you click on their site: they can use JavaScript to track clicks, or just route all links through a redirect to track them (very common, and it obscures the link). Pings actually improve things by putting this under browser control, so you can have real links and a site that works without JS... yet "privacy experts" try to cast this as a bad thing.

    • by AmiMoJo ( 196126 )

      They are removing the flag because doing so actually improves privacy and performance for users.

      If hyperlink auditing is not available, site owners just use JavaScript to do the same thing. JavaScript means larger page downloads and more wasted CPU cycles, and unlike hyperlink auditing pings, the browser can't optimize it effectively, e.g. by sending the pings as async background tasks while the page you actually requested loads.

      Hyperlink auditing pings can be blocked by ad blockers, and in fact are even easier to block than the JavaScript equivalent.

  • by zenbi ( 3530707 ) <bryan@pipedot.org> on Wednesday May 08, 2019 @05:43PM (#58560732) Homepage
    Same site cookies? The domain attribute for cookies is already part of the spec [faqs.org] (Section 5.2.3), so why do they need another mechanism for the same thing?
    • Same site cookies? The domain attribute for cookies is already part of the spec (Section 5.2.3) why do they need another of the same?

      That attribute controls which domains the browser is allowed to transmit a cookie to. For example, a session cookie can be made accessible to subdomain.mywebapp.org.

      What this is about is different from that. Say you own a website with a login form that uses session cookies, and you happen to be logged into it. Now imagine you are on Google and do a search for your website. With this change, when you follow that hyperlink to your own website, it won't see you as logged in, because the browser will be prevented from transmitting the session cookie.

      • The proposal is for a default of "Lax" mode for cookies with no explicit attribute, which would not break in the way you describe, since Lax sends cookies on top-level navigation via safe HTTP methods (GET is safe). If someone sets their default mode to Strict then that is their problem (or your problem, if your website set Strict on the cookie).

        There are scenarios Lax would break (by design), e.g. cross-site form submission via POST, but following a link isn't one of them.
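        The Lax/Strict/None decision the two comments above are debating can be sketched roughly like this (a simplification of the draft's rules, not the exact algorithm):

```python
def cookie_sent(same_site, cross_site, top_level_nav, method):
    """Rough sketch of the proposed SameSite decision (simplified)."""
    if not cross_site:
        return True   # same-site requests always carry the cookie
    if same_site == "None":
        return True   # cookie explicitly opted in to cross-site use
    if same_site == "Strict":
        return False  # never sent on cross-site requests
    # "Lax" (the proposed default): sent only on top-level
    # navigations that use a safe HTTP method
    return top_level_nav and method in ("GET", "HEAD", "OPTIONS")
```

        So following a link (a cross-site, top-level GET) still carries the cookie under Lax, while a cross-site POST does not.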

  • by WaffleMonster ( 969671 ) on Wednesday May 08, 2019 @07:07PM (#58561144)

    SameSite has been supported in Chrome for years. In fact TFA even mentions this.

    What seems to be new and very much newsworthy is the attempt by Google to impose breaking global changes in existing behavior.

    It's May 8th, and a draft that no working group has adopted, and that is not even a day old, is being cited in news articles. Why? Let's look at that draft.

    Based on conversations at [HTTP-Workshop-2019] and elsewhere, I'd
    suggest that we have something like agreement on at least two
    principles:

    1. HTTP requests should not carry state along with cross-site
       requests by default (see Section 8.2 of [RFC6265bis]).

    2. HTTP requests should not carry state over non-secure channels
       (see Section 8.3 of [RFC6265bis], and [RFC7258]).

    This is breathtaking. Commentary made in the context of a completely different state management system is being invoked to justify *breaking changes* to an EXISTING mechanism in production use globally.

    The conclusions themselves are cracked pottery at best.

    1. So decrees king Google who doesn't have to answer to anyone.

    2. WTF? This is what the secure flag is for. People already have the ability to decide whether this should or should not occur.

    With those principles in mind, this document proposes two changes
    that seem possible to deploy in the near-term. User agents should:

    1. Treat the lack of an explicit "SameSite" attribute as
       "SameSite=Lax".

    I'm all for headers that impose additional constraints. This isn't that. It's breaking existing systems globally and unilaterally for no well-articulated reason.

    • CSRF is the reason, and it is a big, pressing problem. Cookies are a horrible security mess, and ultimately something will have to be broken; whether this particular change does enough good to justify the breakage is a more nuanced question.

      • CSRF is the reason

        No shit.

        and it is a big, pressing problem.

        Citation required. Regardless of how you characterize the scope of the problem, solutions that don't break anything are both well known and trivial to implement.

        Cookies are a horrible security mess

        No. I R A dev crowd who don't know what they are doing = "horrible security mess".

        , and ultimately something will have to be broken,

        Keyboards? Digitizers? Car windows? Supermarket PA systems?

        whether this particular change does enough good to justify the breakage is a more nuanced question.

        I'll let this stunning lack of objective basis speak for itself.

    • by AmiMoJo ( 196126 )

      It's breaking existing systems globally and unilaterally for no good articulated reason.

      The reason is spelt out quite clearly in the references the document cites: Security.

      Without the SameSite attribute, cookies are always sent when a site is loaded, which means that if you are, say, logged in to Slashdot, another site can send you a link to Slashdot that triggers some action, which will be executed when you click on it.

      By making all cookies have a default setting of "lax" this attack is blocked.

      It breaks almost nothing except for malicious links. It's only very, very broken sites that rely on this to work.
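      The attack in question is classic CSRF, and it can be sketched as a cross-site POST, the case that Lax actually blocks (domain names and endpoint here are hypothetical):

```html
<!-- Hosted on attacker.example. Auto-submits a POST to the victim
     site the moment the page loads. Under the old default, the
     victim's session cookie rides along and the action succeeds;
     under a SameSite=Lax default the cookie is withheld, so the
     request arrives unauthenticated. -->
<form action="https://victim.example/account/change-email" method="POST">
  <input type="hidden" name="email" value="attacker@attacker.example">
</form>
<script>document.forms[0].submit();</script>
```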

      • The reason is spelt out quite clearly in the references the document cites: Security.

        Without the SameSite attribute, cookies are always sent when a site is loaded, which means that if you are, say, logged in to Slashdot, another site can send you a link to Slashdot that triggers some action, which will be executed when you click on it.

        This has been a well-understood property of the web for decades, from the very beginning. This is why sites deploy measures to prevent this from happening.

        One of the easiest methods, for those unwisely electing to rely only on cookies, is simply to check the HTTP_REFERER header. Sites could also use the SameSite attribute if they want the crummy, half-assed protection SameSite offers and don't feel like implementing it themselves. None of this is rocket science.

        By making all cookies have a default setting of "lax" this attack is blocked.

        It breaks almost nothing except for malicious links. It's only very, very broken sites that rely on this to work.

        **BULLSHIT**

        I personally have a site that employs CSRF protections
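        The Referer check mentioned a few comments up can be sketched like this (the hostname is hypothetical; note that Referer can legitimately be absent, stripped by proxies or privacy tools, which is why real deployments usually pair this check with CSRF tokens):

```python
from urllib.parse import urlparse

ALLOWED_HOST = "mywebapp.example"  # hypothetical site being protected

def referer_ok(headers):
    """Allow a state-changing request only if its Referer points
    at our own host. Fails closed when the header is missing."""
    referer = headers.get("Referer")
    if not referer:
        return False
    return urlparse(referer).hostname == ALLOWED_HOST
```

        A same-origin form post passes the check; a request launched from another site, or one with the header stripped, is rejected.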

  • Comment removed based on user account deletion
