AI

'AI-Powered Remediation': GitHub Now Offers 'Copilot Autofix' Suggestions for Code Vulnerabilities (infoworld.com) 18

InfoWorld reports that Microsoft-owned GitHub "has unveiled Copilot Autofix, an AI-powered software vulnerability remediation service."

The feature became available Wednesday as part of the GitHub Advanced Security (or GHAS) service: "Copilot Autofix analyzes vulnerabilities in code, explains why they matter, and offers code suggestions that help developers fix vulnerabilities as fast as they are found," GitHub said in the announcement. GHAS customers on GitHub Enterprise Cloud already have Copilot Autofix included in their subscription. GitHub has enabled Copilot Autofix by default for these customers in their GHAS code scanning settings.

Beginning in September, Copilot Autofix will be offered for free in pull requests to open source projects.

During the public beta, which began in March, GitHub found that developers using Copilot Autofix were fixing code vulnerabilities more than three times faster than those doing it manually, demonstrating how AI agents such as Copilot Autofix can radically simplify and accelerate software development.

"Since implementing Copilot Autofix, we've observed a 60% reduction in the time spent on security-related code reviews," says one principal engineer quoted in GitHub's announcement, "and a 25% increase in overall development productivity."

The announcement also notes that Copilot Autofix "leverages the CodeQL engine, GPT-4o, and a combination of heuristics and GitHub Copilot APIs." Code scanning tools detect vulnerabilities, but they don't address the fundamental problem: remediation takes security expertise and time, two valuable resources in critically short supply. In other words, finding vulnerabilities isn't the problem. Fixing them is...

Developers can keep new vulnerabilities out of their code with Copilot Autofix in the pull request, and now also pay down the backlog of security debt by generating fixes for existing vulnerabilities... Fixes can be generated for dozens of classes of code vulnerabilities, such as SQL injection and cross-site scripting, and developers can dismiss, edit, or commit the suggested fixes in their pull request.... For developers who aren't necessarily security experts, Copilot Autofix is like having the expertise of your security team at your fingertips while you review code...

As the global home of the open source community, GitHub is uniquely positioned to help maintainers detect and remediate vulnerabilities so that open source software is safer and more reliable for everyone. We firmly believe that it's highly important to be both a responsible consumer of open source software and contributor back to it, which is why open source maintainers can already take advantage of GitHub's code scanning, secret scanning, dependency management, and private vulnerability reporting tools at no cost. Starting in September, we're thrilled to add Copilot Autofix in pull requests to this list and offer it for free to all open source projects...

While responsibility for software security continues to rest on the shoulders of developers, we believe that AI agents can help relieve much of the burden.... With Copilot Autofix, we are one step closer to our vision where a vulnerability found means a vulnerability fixed.
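
The announcement names SQL injection among the vulnerability classes Autofix can handle. As a minimal illustration (generic code, not GitHub's actual output), here is what that class of bug and its standard remediation look like in Python:

    import sqlite3

    conn = sqlite3.connect("app.db")

    # Vulnerable: user input is interpolated into the SQL string, so input
    # like "x' OR '1'='1" rewrites the query. Code scanning flags this pattern.
    def find_user_unsafe(username):
        query = f"SELECT id, email FROM users WHERE name = '{username}'"
        return conn.execute(query).fetchall()

    # The standard fix, and the shape of a typical autofix suggestion: a
    # parameterized query that keeps user input as data, never as SQL.
    def find_user_safe(username):
        query = "SELECT id, email FROM users WHERE name = ?"
        return conn.execute(query, (username,)).fetchall()

Per the announcement, the suggestion arrives in the pull request alongside an explanation of why the original code is exploitable, where it can be committed, edited, or dismissed.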

Social Networks

Preparing to Monetize, Threads Launches New Tools for Users (axios.com) 17

"We're testing a few new ways to plan and manage your presence on Threads," announced top Threads/Instagram executive Adam Mosseri, promising their 200 million-plus users "enhanced insights to help you better understand your followers and how posts perform, and the ability to save multiple drafts with scheduling coming soon."

Axios reports: Helping creators avoid burnout has become a growing focus for Meta CEO Mark Zuckerberg, who said in July that the company's new generative AI tools can alleviate certain tasks like communicating with followers. Thursday's announcement was positioned as helping both businesses and creators — suggesting that Meta is ramping up plans to start monetizing Threads, which could be as early as this year.

Transportation

US Presses the 'Reset Button' On Technology That Lets Cars Talk To Each Other (npr.org) 95

An anonymous reader quotes a report from NPR: For years, safety advocates have touted the potential of technology that allows vehicles to communicate wirelessly. So far, the rollout has been slow and uneven. Now the U.S. Department of Transportation is releasing a roadmap it hopes will speed up deployment of that technology -- and save thousands of lives in the process. "This is proven technology that works," Shailen Bhatt, head of the Federal Highway Administration, said at an event Friday to mark the release of the deployment plan (PDF) for vehicle-to-everything, or V2X, technology across U.S. roads and highways. V2X allows cars and trucks to exchange location information with each other, and potentially with cyclists and pedestrians, as well as with the roadway infrastructure itself. Users could send and receive frequent messages, continuously sharing information about speed, position, and road conditions -- even in situations with poor visibility, including around corners or in dense fog or heavy rain. [...]

Despite enthusiasm from safety advocates and federal regulators, the technology has faced a bumpy rollout. During the Obama administration, the National Highway Traffic Safety Administration proposed making the technology mandatory on cars and light trucks. But the agency later dropped that idea during the Trump administration. The deployment of V2X has been "hampered by regulatory uncertainty," said John Bozzella, president and CEO of the Alliance for Automotive Innovation, a trade group that represents automakers. But he's optimistic that the new plan will help. "This is the reset button," Bozzella said at Friday's announcement. "This deployment plan is a big deal. It is a crucial piece of this V2X puzzle." The plan lays out some goals and targets for the new technology. In the short term, the plan aims to have V2X infrastructure in place on 20% of the National Highway System by 2028, and for 25% of the nation's largest metro areas to have V2X enabled at signalized intersections. V2X technology still faces some daunting questions, including how to pay for the rollout of critical infrastructure and how to protect connected vehicles from cyberattack. But safety advocates say it's past time to find the answers.
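
For a sense of the payloads involved: V2X safety broadcasts are small, frequently repeated messages. The sketch below is loosely modeled on the SAE J2735 Basic Safety Message, but the field names, JSON encoding, and broadcast rate are illustrative rather than normative:

    from dataclasses import dataclass, asdict
    import json, time

    @dataclass
    class BasicSafetyMessage:
        # Illustrative fields, loosely following the SAE J2735 BSM core data.
        vehicle_id: str     # temporary, rotating identifier (for privacy)
        latitude: float     # degrees
        longitude: float    # degrees
        speed_mps: float    # meters per second
        heading_deg: float  # clockwise from north
        brake_applied: bool
        timestamp_ms: int

    def encode(msg: BasicSafetyMessage) -> bytes:
        # Real deployments use compact ASN.1 encodings over dedicated radio
        # links (DSRC or C-V2X); JSON here just keeps the payload readable.
        return json.dumps(asdict(msg)).encode()

    # Messages like this are rebroadcast several times per second, so nearby
    # vehicles keep a fresh picture of each other's position and speed,
    # regardless of fog, rain, or blind corners.
    bsm = BasicSafetyMessage("tmp-4f2a", 40.7128, -74.0060, 13.4, 92.0,
                             False, int(time.time() * 1000))
    print(encode(bsm))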

The Almighty Buck

US Fines T-Mobile $60 Million, Its Largest Penalty Ever, Over Unauthorized Data Access (reuters.com) 12

The Committee on Foreign Investment in the United States (CFIUS) fined T-Mobile $60 million, the committee's largest penalty ever, for failing to prevent and report unauthorized access to sensitive data, in violation of a mitigation agreement tied to its 2020 merger with Sprint. "The size of the fine, and CFIUS's unprecedented decision to make it public, show the committee is taking a more muscular approach to enforcement as it seeks to deter future violations," reports Reuters. From the report: T-Mobile said in a statement that it experienced technical issues during its post-merger integration with Sprint that affected "information shared from a small number of law enforcement information requests." It stressed that the data never left the law enforcement community, was reported "in a timely manner" and was "quickly addressed." T-Mobile's failure to report the incidents promptly delayed CFIUS' efforts to investigate and mitigate any potential harm to U.S. national security, U.S. officials added, without providing further details. "The $60 million penalty announcement highlights the committee's commitment to ramping up CFIUS enforcement by holding companies accountable when they fail to comply with their obligations," one of the U.S. officials said, adding that transparency around enforcement actions incentivizes other companies to comply with their obligations.

The Almighty Buck

Smart Sous Vide Cooker To Start Charging Monthly Fee For 10-Year-Old Companion App (arstechnica.com) 122

An anonymous reader quotes a report from Ars Technica: Anova, a company that sells smart sous vide cookers, is getting backlash from customers after announcing that it will soon charge a subscription fee for the device's companion app. Anova was founded in 2013 and sells sous vide immersion circulators. Its current third-generation Precision Cooker 3.0 has an MSRP of $200. Anova also sells a $149 model and a $400 version that targets professionals. It debuted the free Anova Culinary App in 2014. In a blog post on Thursday, Anova CEO and cofounder Stephen Svajian announced that starting on August 21, people who sign up to use the Anova Culinary App with the cooking devices will have to pay $2 per month, or $10 per year. The app does various things depending on the paired cooker, but it typically offers sous vide cooking guides, cooking notifications, and the ability to view, save, bookmark, and share recipes. The subscription fee will only apply to people who make an account after August 21. Those who downloaded the app and made an account before August 21 won't have to pay. But everyone will have to make an account; some people have been using the app without one until now.

"You helped us build Anova, and our intent is that you will be grandfathered in forever," Svajian wrote. According to Svajian, the subscription fees are necessary so Anova can "continue delivering the exceptional service and innovative recipes" and "maintain and enhance the app, ensuring it remains a valuable resource." As Digital Trends pointed out, the announcement follows an Anova statement saying it will no longer let users remotely control their kitchen gadgets via Bluetooth starting on September 28, 2025. This means that remote control via the app will only be possible for models offering and using Wi-Fi connectivity. Owners of affected devices will no longer be able to access their device via the Anova app, get notifications, or use status monitoring. Users will still be able to manually set the time, temperature, and timer via the device itself.

Cellphones

T-Mobile Shutting Down 2G Network Beginning Next Month (tmo.report) 28

"T-Mobile will be shutting down their 2G network beginning next month, making older phones obsolete," writes Slashdot reader Dustin Destree. From the Mobile Report: Most phones today use 4G and 5G, and T-Mobile's 2G service somehow managed to outlive the company's 3G service, which was killed off in 2022. Nonetheless, after postponing a previous shutdown date of April 2nd, we seem to finally have a date for T-Mobile sunsetting its 2G service, and it's pretty soon. T-Mobile has added a date for when its 2G service's capacity and coverage is "expected to change." The service should begin shutdown on September 1st, 2024. The date was quietly added without a major announcement, and it was added sometime after August 5th, as a former Google cache of the page (which has now also been updated) previously showed.

Ubuntu

Ubuntu Will Start Shipping With the Latest Upstream Linux Kernel - Even Release Candidates (omgubuntu.co.uk) 31

Here's a question from the blog OMG Ubuntu. "Ever get miffed reading about a major new Ubuntu release only to learn it doesn't come with the newest Linux kernel?

"Well, that'll soon be a thing of the past." Canonical's announced a big shift in kernel selection process for future Ubuntu release, an "aggressive kernel version commitment policy" pivot that means it will ship the latest upstream kernel code in development at the time of a new Ubuntu release.

Yes, even if that upstream kernel hasn't yet seen a formal stable release (and received the requisite newspaper-graphic-topped rundown on this blog). Which is a huge change. Currently, new Ubuntu releases include the most recent stable Linux kernel release at the time of the kernel freeze milestone in the Ubuntu development cycle.

Here's the official announcement by Canonical's Brett Grandbois. "Ubuntu will now ship the absolute latest available version of the upstream Linux kernel at the specified Ubuntu release freeze date, even if upstream is still in Release Candidate status..." It is actually expected that Late Releases will be the exception rather than the norm and in most releases these guidelines will not be necessary as the upstream kernel will release with enough time for the Ubuntu kernel to stabilize. However, adopting a more aggressive kernel version commitment policy does require us to be prepared for a possible Late Release situation and therefore informing the community on what they can expect.

First Person Shooters (Games)

DOOM and DOOM 2 Getting New Enhanced Versions (ign.com) 63

QuakeCon 2024 kicks off today with the announcement of enhanced remasters of the first two Doom games, which will feature online cross-platform deathmatch, co-op support for up to 16 players, upgraded visuals, and additional content including a new episode for Doom 2. The compilation, titled DOOM + DOOM 2, runs on the KEX Engine and will include new maps, a newly-updated soundtrack, and support for 4K resolution. IGN reports: While unconfirmed, it appears that console owners who already own DOOM or DOOM 2 will get the upgrade for free. It's currently available for purchase on Xbox Series X/S for $9.99, though it has been pulled from the Steam store. When it's released, DOOM + DOOM 2 will be perhaps the most comprehensive version of the venerable shooters to date. You can watch the trailer here.

Television

Netflix To Hike Price Again By December, Jefferies Says 109

In a note to clients, seen by Slashdot, brokerage house Jefferies writes: Netflix's last price hike on the standard plan was in Jan 2022, its ad-supported plan remains the cheapest (among major players) in the industry, and its move into live sports increases pricing power - for these three reasons we suspect a price hike in Q4 or December of this year could be coming on the standard plan.

As stated in the Q4 2023 letter (following the announcement of WWE Raw coming in 2025): "As we invest in and improve Netflix, we'll occasionally ask our members to pay a little extra to reflect those improvements, which in turn helps drive the positive flywheel of additional investment." We believe Netflix has been positioning itself throughout this year for a year-end price hike. December and 2025 will bring major content releases supporting a price increase, including the Christmas NFL game, Squid Game 2 on Dec. 26th (season 1 is the #1 most-watched NFLX show of all time), WWE Raw starting Jan 2025, and Stranger Things 5 coming in 2025 (seasons 3 and 4 are in the top 10 of all time).

The Almighty Buck

iPhone Driver's License Support Coming Soon To California (macrumors.com) 60

iPhone and Apple Watch users in California will soon be able to add their digital ID and driver's license to the Wallet app, as revealed by new landing pages on the state DMV website. This feature follows a slow rollout since its announcement, with only five states currently supporting it. MacRumors reports: "Now you can add your California driver's license or state ID to Apple Wallet on iPhone and Apple Watch so you can present it easily and securely in person and in app," reads the landing page, which contains broken links and placeholder images, and is still missing a proper website security certificate. The webpages were discovered on Sunday by Jimmy Obomsawin, after someone added a link to the landing pages in an Apple Wallet Wikipedia entry last Wednesday.

Privacy

NFL to Roll Out Facial Authentication Software to All Stadiums, League-Wide (therecord.media) 72

America's National Football League "is the latest organization to turn to facial authentication to bolster event security," reports the Record, citing a new announcement this week: All 32 NFL stadiums will start using the technology this season, after the league signed a contract with a company that uses facial scans to verify the identity of people entering event venues and other secure spaces.

The facial authentication platform, which counts the Cleveland Browns' owners as investors, will be used to "streamline and secure" entry for thousands of credentialed media, officials, staff and guests so they can easily access restricted areas such as press boxes and locker rooms, Jeff Boehm, the chief operating officer of Wicket, said in a LinkedIn post Monday. "Credential holders simply take a selfie before they come, and then Wicket verifies their identity and checks their credentials with Accredit (a credentialing platform) as they walk through security checkpoints," Boehm added.

Wicket technology was deployed in a handful of NFL stadiums last year as part of a pilot program. Other stadiums will start rolling it out beginning on Aug. 8, when the pre-season kicks off. Some teams also have extended their use of the technology to scan the faces of ticket holders. The Cleveland Browns, Atlanta Falcons and New York Mets all have used the company's facial authentication software to authenticate fans with tickets, according to Stadium Tech Report. "Fans come look at the tablet and, instantly, the tablet recognizes the fan," Brandon Covert, the vice president of information technology for the Cleveland Browns, said in a testimonial appearing on Wicket's website. "It's almost a half-second stop. It's not even a stop — more of a pause."
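
Wicket hasn't published implementation details, but the flow Boehm describes (enroll a selfie ahead of time, match a fresh capture against it at the gate, then check credentials) is a standard face-authentication pattern. A generic sketch, with hypothetical names, records, and threshold:

    import numpy as np

    enrolled = {}                                # person_id -> face embedding
    credentials = {"media-0042": {"press_box"}}  # hypothetical Accredit-style records

    def embed(face_image) -> np.ndarray:
        # Stand-in for a real face-embedding model, which maps a photo to a
        # fixed-length vector; normalizing lets a dot product act as similarity.
        v = np.asarray(face_image, dtype=float).ravel()
        return v / np.linalg.norm(v)

    def enroll(person_id, selfie):
        # "Credential holders simply take a selfie before they come..."
        enrolled[person_id] = embed(selfie)

    def verify_at_gate(person_id, capture, area, threshold=0.8) -> bool:
        if person_id not in enrolled:
            return False
        similarity = float(np.dot(enrolled[person_id], embed(capture)))
        # Identity check first, then the credential lookup (Accredit's role).
        return similarity >= threshold and area in credentials.get(person_id, set())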

"The Browns also use Wicket to verify the ages of fans purchasing alcohol at concession stands, according to Wicket's LinkedIn page," the article points out.

And a July report from Privacy International found that 25 of the top 100 soccer stadiums in the world are already using facial recognition technology.

Thanks to long-time Slashdot reader schwit1 for sharing the news.

Programming

Go Tech Lead Russ Cox Steps Down to Focus on AI-Powered Open-Source Contributor Bot (google.com) 12

Thursday Go's long-time tech lead Russ Cox made an announcement: Starting September 1, Austin Clements will be taking over as the tech lead of Go: both the Go team at Google and the overall Go project. Austin is currently the tech lead for what we sometimes call the "Go core", which encompasses compiler toolchain, runtime, and releases. Cherry Mui will be stepping up to lead those areas.

I am not leaving the Go project, but I think the time is right for a change... I will be shifting my focus to work more on Gaby [or "Go AI bot," an open-source contributor agent] and Oscar [an open-source contributor agent architecture], trying to make useful contributions in the Go issue tracker to help all of you work more productively. I am hopeful that work on Oscar will uncover ways to help open source maintainers that will be adopted by other projects, just like some of Go's best ideas have been adopted by other projects. At the highest level, my goals for Oscar are to build something useful, learn something new, and chart a path for other projects. These are the same broad goals I've always had for our work on Go, so in that sense Oscar feels like a natural continuation.

The post notes that new tech lead Austin Clements "has been working on Go at Google since 2014" (and Mui since 2016). "Their judgment is superb and their knowledge of Go and the systems it runs on both broad and deep. When I have general design questions or need to better understand details of the compiler, linker, or runtime, I turn to them." It's important to remember that tech lead — like any position of leadership — is a service role, not an honorary title. I have been leading the Go project for over 12 years, serving all of you, and trying to create the right conditions for all of you to do your best work. Large projects like Go absolutely benefit from stable leadership, but they can also benefit from leadership changes. New leaders bring new strengths and fresh perspectives. For Go, I think 12+ years of one leader is enough stability; it's time for someone new to serve in this role.

In particular, I don't believe that the "BDFL" (benevolent dictator for life) model is healthy for a person or a project. It doesn't create space for new leaders. It's a single point of failure. It doesn't give the project room to grow. I think Python benefited greatly from Guido stepping down in 2018 and letting other people lead, and I've had in the back of my mind for many years that we should have a Go leadership change eventually....

I am going to consciously step back from decision making and create space for Austin and the others to step forward, but I am not disappearing. I will still be available to talk about Go designs, review CLs, answer obscure history questions, and generally help and support you all in whatever way I can. I will still file issues and send CLs from time to time, I have been working on a few potential new standard libraries, I will still advocate for Go across the industry, and I will be speaking about Go at GoLab in Italy in November...

I am incredibly proud of the work we have all accomplished together, and I am confident in the leaders both on the Go team at Google and in the Go community. You are all doing remarkable work, and I know you will continue to do that.

Power

Could AI Speed Up the Design of Nuclear Reactors? (byu.edu) 156

A professor at Brigham Young University "has figured out a way to shave critical years off the complicated design and licensing processes for modern nuclear reactors," according to an announcement from the university.

"AI is teaming up with nuclear power." The typical time frame and cost to license a new nuclear reactor design in the United States is roughly 20 years and $1 billion. To then build that reactor requires an additional five years and between $5 and $30 billion. By using AI in the time-consuming computational design process, [chemical engineering professor Matt] Memmott estimates a decade or more could be cut off the overall timeline, saving millions and millions of dollars in the process — which should prove critical given the nation's looming energy needs.... "Being able to reduce the time and cost to produce and license nuclear reactors will make that power cheaper and a more viable option for environmentally friendly power to meet the future demand...."

Engineers deal with elements from neutrons on the quantum scale all the way up to coolant flow and heat transfer on the macro scale. [Memmott] also said there are multiple layers of physics that are "tightly coupled" in that process: the movement of neutrons is tightly coupled to the heat transfer which is tightly coupled to materials which is tightly coupled to the corrosion which is coupled to the coolant flow. "A lot of these reactor design problems are so massive and involve so much data that it takes months of teams of people working together to resolve the issues," he said... Memmott is finding that AI can reduce that heavy time burden and lead to more power production, not only meeting rising demand but also keeping power costs down for general consumers...

Technically speaking, Memmott's research proves the concept of replacing a portion of the required thermal hydraulic and neutronics simulations with a trained machine learning model to predict temperature profiles based on geometric reactor parameters that are variable, and then optimizing those parameters. The result would create an optimal nuclear reactor design at a fraction of the computational expense required by traditional design methods. For his research, he and BYU colleagues built a dozen machine learning algorithms to examine their ability to process the simulated data needed in designing a reactor. They identified the top three algorithms, then refined the parameters until they found one that worked really well and could handle a preliminary data set as a proof of concept. It worked (and they published a paper on it) so they took the model and (for a second paper) put it to the test on a very difficult nuclear design problem: optimal nuclear shield design.
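
The announcement doesn't reproduce the published models, but the general pattern is straightforward: run a limited batch of expensive simulations, fit a fast surrogate model on the results, then search the design space against the surrogate. A toy sketch in which a made-up quadratic function stands in for a real thermal-hydraulics code:

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)

    # Stand-in for an expensive simulation: peak temperature as a function of
    # two geometric parameters (say, channel width and shield thickness).
    def simulate_peak_temp(x):
        w, t = x
        return 600 + 40 * (w - 1.2) ** 2 + 25 * (t - 0.8) ** 2 + rng.normal(0, 1)

    # 1. Run the simulator a limited number of times to build training data.
    X = rng.uniform([0.5, 0.3], [2.0, 1.5], size=(200, 2))
    y = np.array([simulate_peak_temp(x) for x in X])

    # 2. Fit a surrogate that predicts temperature directly from geometry.
    surrogate = GradientBoostingRegressor().fit(X, y)

    # 3. Search the design space against the cheap surrogate: scoring 100,000
    #    candidate geometries is trivial here, and would be impractical with
    #    the real simulation.
    candidates = rng.uniform([0.5, 0.3], [2.0, 1.5], size=(100_000, 2))
    predictions = surrogate.predict(candidates)
    best = candidates[np.argmin(predictions)]
    print("best geometry:", best, "predicted peak temp:", predictions.min())

The real problem adds coupled neutronics and far more parameters, but the division of labor is the same: the simulator generates training data once, and the cheap learned model carries the optimization.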

The resulting papers, recently published in academic journal Nuclear Engineering and Design, showed that their refined model can geometrically optimize the design elements much faster than the traditional method.

In two days, Memmott's AI algorithm determined an optimal nuclear-reactor shield design that had taken a real-world molten salt reactor company six months to produce. "Of course, humans still ultimately make the final design decisions and carry out all the safety assessments," Memmott says in the announcement, "but it saves a significant amount of time at the front end....

"Our demand for electricity is going to skyrocket in years to come and we need to figure out how to produce additional power quickly. The only baseload power we can make in the Gigawatt quantities needed that is completely emissions free is nuclear power."

Thanks to long-time Slashdot reader schwit1 for sharing the article.

Medicine

US Prepares For Bird Flu Pandemic With $176 Million Moderna Vaccine Deal 184

An anonymous reader quotes a report from Ars Technica: The US government will pay Moderna $176 million to develop an mRNA vaccine against a pandemic influenza -- an award given as the highly pathogenic bird flu virus H5N1 continues to spread widely among US dairy cattle. The funding flows through BARDA, the Biomedical Advanced Research and Development Authority, as part of a new Rapid Response Partnership Vehicle (RRPV) Consortium. The program is intended to set up partnerships with industry to help the country better prepare for pandemic threats and develop medical countermeasures, the Department of Health and Human Services said in a press announcement Tuesday.

In its own announcement on Tuesday, Moderna noted that it began a Phase 1/2 trial of a pandemic influenza virus vaccine last year, which included versions targeting H5 and H7 varieties of bird flu viruses. The company said it expects to release the results of that trial this year and that those results will direct the design of a Phase 3 trial, anticipated to begin in 2025. The funding deal will support late-stage development of a "pre-pandemic vaccine against H5 influenza virus," Moderna said. But, the deal also includes options for additional vaccine development in case other public health threats arise.

US health officials have said previously that they were in talks with Moderna and Pfizer about the development of a pandemic bird flu vaccine. The future vaccine will be in addition to standard protein-based bird flu vaccines that are already developed. In recent weeks, the health department has said it is working to manufacture 4.8 million vials of H5 influenza vaccine in the coming months. The plans come three months into the H5N1 dairy outbreak, which is very far from the initial hopes of containment. [...] The more the virus expands its footprint across US dairy farms, adapts to its newfound mammalian host, and comes in contact with humans, the more chances it has to leap to humans and gain the ability to spread among us.
"The award made today is part of our longstanding commitment to strengthen our preparedness for pandemic influenza," said Dawn O'Connell, assistant secretary for Preparedness and Response. "Adding this technology to our pandemic flu toolkit enhances our ability to be nimble and quick against the circulating strains and their potential variants."

In a separate article, Ars Technica reports on a small study in Texas that suggests human cases are going undetected on dairy farms where the H5N1 virus has spread in cows.

AI

Perplexity AI Will Share Revenue With Publishers After Plagiarism Accusations (cnbc.com) 11

An anonymous reader quotes a report from CNBC: Perplexity AI on Tuesday debuted a revenue-sharing model for publishers after more than a month of plagiarism accusations. Media outlets and content platforms including Fortune, Time, Entrepreneur, The Texas Tribune, Der Spiegel and WordPress.com are the first to join the company's "Publishers Program." The announcement follows an onslaught of controversy in June, when Forbes said it found a plagiarized version of its paywalled original reporting within Perplexity AI's Pages tool, with no reference to the media outlet besides a small "F" logo at the bottom of the page. Weeks later, Wired said it also found evidence of Perplexity plagiarizing Wired stories, and reported that an IP address "almost certainly linked to Perplexity and not listed in its public IP range" visited its parent company's websites more than 800 times in a three-month span.

Under the new partner program, any time a user asks a question and Perplexity generates advertising revenue from citing one of the publisher's articles in its answer, Perplexity will share a flat percentage of that revenue. That percentage applies on a per-article basis, Dmitry Shevelenko, Perplexity's chief business officer, told CNBC in an interview -- meaning that if three articles from one publisher were used in one answer, the partner would receive "triple the revenue share." Shevelenko confirmed that the flat rate is a double-digit percentage but declined to provide specifics. Shevelenko told CNBC that more than a dozen publishers, including "major newspaper dailies and companies that own them," had reached out with interest less than two hours after the program debuted. The company's goal, he said, is to have 30 publishers enrolled by the end of the year, and Perplexity is looking to partner with some of the publishers' ad sales teams so they can sell ads "against all Perplexity inventory."
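
Perplexity hasn't disclosed the actual rate, so the figure below is a placeholder; the sketch just captures the per-article arithmetic Shevelenko describes:

    # Hypothetical rate: Perplexity says only that it's a flat double-digit percentage.
    REVENUE_SHARE_RATE = 0.15

    def publisher_payout(ad_revenue, articles_cited):
        # The share applies per cited article, so one answer citing three of a
        # publisher's articles pays out three shares.
        return ad_revenue * REVENUE_SHARE_RATE * articles_cited

    print(publisher_payout(1.00, 1))  # one citation: 0.15 of the ad dollar
    print(publisher_payout(1.00, 3))  # three citations: "triple the revenue share"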

"When Perplexity earns revenue from an interaction where a publisher's content is referenced, that publisher will also earn a share," Perplexity wrote in a blog post, adding that the company will offer publishers API credits and also work with ScalePost.ai to provide analytics to provide "deeper insights into how Perplexity cites their content." Shevelenko told CNBC that Perplexity began engaging with publishers in January and solidified ideas for how its revenue-sharing program would work later in the first quarter of 2024. He said five Perplexity employees were dedicated to working on the program. "Some of it grew out of conversations we were having with publishers about integrating Perplexity APIs and technology into their products," Shevelenko said.

Google

W3C Slams Google U-turn on Third-Party Cookie Removal (w3.org) 26

The World Wide Web Consortium (W3C) has expressed disappointment with Google's decision to retain third-party cookies, stating it undermines collaborative efforts. Google's reversal follows a five-year initiative to develop privacy-focused ad technology. While some advertising industry representatives welcomed the move, the W3C's criticism highlights the ongoing debate over online privacy and advertising practices. W3C writes: Third-party cookies are not good for the web. They enable tracking, which involves following your activity across multiple websites. They can be helpful for use cases like login and single sign-on, or putting shopping choices into a cart -- but they can also be used to invisibly track your browsing activity across sites for surveillance or ad-targeting purposes. This hidden personal data collection hurts everyone's privacy.

We aren't the only ones who are worried. The updated RFC that defines cookies says that third-party cookies have "inherent privacy issues" and that therefore web "resources cannot rely upon third-party cookies being treated consistently by user agents for the foreseeable future." We agree. Furthermore, tracking and subsequent data collection and brokerage can support micro-targeting of political messages, which can have a detrimental impact on society, as identified by Privacy International and other organizations. Regulatory authorities, such as the UK's Information Commissioner's Office, have also called for the blocking of third-party cookies.

The job of the TAG as stewards of the architecture of the web has us looking at the big picture (the whole web platform) and the details (proposed features and specs). We try to provide guidance to spec authors so that their new technologies fill holes that need to be filled, don't conflict with other parts of the web, and don't set us up for avoidable trouble in the future. We've been working with Chrome's Privacy Sandbox team (as well as others in the W3C community) for several years, trying to help them create better approaches for the things that third-party cookies do. While we haven't always agreed with the Privacy Sandbox team, we have made substantial progress together. This announcement came out of the blue, and undermines a lot of the work we've done together to make the web work without third-party cookies.

The unfortunate climb-down will also have secondary effects, as it is likely to delay cross-browser work on effective alternatives to third-party cookies. We fear it will have an overall detrimental impact on the cause of improving privacy on the web. We sincerely hope that Google reverses this decision and re-commits to a path towards removal of third-party cookies.
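
To make the mechanism concrete: a third-party cookie is set by a domain other than the one in the address bar, usually via an embedded ad or tracking pixel, and the same cookie is sent back from every site that embeds that domain. A minimal stand-in ad server in Flask, with illustrative domain, route, and cookie names:

    import uuid
    from flask import Flask, request, make_response

    app = Flask(__name__)   # imagine this serving from ads.example
    profiles = {}           # tracking_id -> list of sites seen

    @app.route("/pixel")
    def pixel():
        # The Referer header reveals which embedding site the user is on.
        site = request.headers.get("Referer", "unknown")
        tid = request.cookies.get("tid") or str(uuid.uuid4())
        profiles.setdefault(tid, []).append(site)  # a cross-site browsing history

        resp = make_response(b"", 204)
        # SameSite=None is what makes this cookie usable in a third-party
        # context: browsers will send it on requests triggered by other sites.
        resp.set_cookie("tid", tid, samesite="None", secure=True,
                        max_age=31536000)
        return resp

Blocking third-party cookies breaks exactly this pattern: the browser stops returning the tid cookie in cross-site contexts, so the profile can no longer be stitched together.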

Open Source

Mike McQuaid on 15 Years of Homebrew and Protecting Open-Source Maintainers (thenextweb.com) 37

Despite multiple methods available across major operating systems for installing and updating applications, there remains "no real clear answer to 'which is best,'" reports The Next Web. Each system faces unique challenges such as outdated packages, high fees, and policy restrictions.

Enter Homebrew.

"Initially created as an option for developers to keep the dependencies they often need for developing, testing, and running their work, Homebrew has grown to be so much more in its 15-year history." Created in 2009, Homebrew has become a leading solution for macOS, integrating with MDM tools through its enterprise-focused extension, Workbrew, to balance user freedom with corporate security needs, while maintaining its open-source roots under the guidance of Mike McQuaid. In an interview with The Next Web's Chris Chinchilla, project leader Mike McQuaid talks about the challenges and responsibilities of maintaining one of the world's largest open-source projects: As with anything that attracts plenty of use and attention, Homebrew also attracts a lot of mixed and extreme opinions, and processing and filtering those requires a tough outlook, something that Mike has spoken about in numerous interviews and at conferences. "As a large project, you get a lot of hate from people. Either people are just frustrated because they hit a bug or because you changed something, and they didn't read the release notes, and now something's broken," Mike says when I ask him about how he copes with the constant influx of communication. "There are a lot of entitled, noisy users in open source who contribute very little and like to shout at people and make them feel bad. One of my strengths is that I have very little time for those people, and I just insta-block them or close their issues."

More crucially, an open-source project is often managed and maintained by a group of people. Homebrew has several dozen maintainers and nearly one thousand total contributors. Mike explains that all of these people also deserve to be treated with respect by users, "I'm also super protective of my maintainers, and I don't want them to be treated that way either." But despite these features and its widespread use, one area Homebrew has always lacked is the ability to work well with teams of users. This is where Workbrew, a company Mike founded with two other Homebrew maintainers, steps in. [...] Workbrew ties together various Homebrew features with custom glue to create a workflow for setting up and maintaining Mac machines. It adds new features that core Homebrew maintainers had no interest in adding, such as admin and reporting dashboards for a computing fleet, while bringing more general improvements to the core project.

Bearing in mind Mike's motivation to keep Homebrew in the "traditional open source" model, I asked him how he intended to keep the needs of the project and the business separated and satisfied. "We've seen a lot of churn in the last few years from companies that made licensing decisions five or ten years ago, which have now changed quite dramatically and have generated quite a lot of community backlash," Mike said. "I'm very sensitive to that, and I am a little bit of an open-source purist in that I still consider the open-source initiative's definition of open source to be what open source means. If you don't comply with that, then you can be another thing, but I think you're probably not open source."

And regarding keeping his and his co-founder's dual roles separated, Mike states, "I'm the CTO and co-founder of Workbrew, and I'm the project leader of Homebrew. The project leader with Homebrew is an elected position." Every year, the maintainers and the community elect a candidate. "But then, with the Homebrew maintainers working with us on Workbrew, one of the things I say is that when we're working on Workbrew, I'm your boss now, but when we work on Homebrew, I'm not your boss," Mike adds. "If you think I'm saying something and it's a bad idea, you tell me it's a bad idea, right?" The company is keeping its early progress in a private beta for now, but you can expect an announcement soon. As for what's happening for Homebrew? Well, in the best "open source" way, that's up to the community and always will be.

United Kingdom

UK Plans Wind Energy Expansion with New Government-Owned Energy Company (bnnbloomberg.ca) 32

The U.K. government "will substantially increase offshore wind investment in the next five years," writes long-time Slashdot reader shilly — "in partnership with the Crown Estate (a public corporation that owns land including the coastal seabed on behalf of the monarch)." It will do this via its new state-owned energy generation [and investment] company, Great British Energy. The new approach includes ensuring grid connections are in place, and is in tandem with changes to the UK's planning regime that should reduce the ability of NIMBY groups to prevent infrastructure build-outs. Since [the Labour Party] came to power 20 days ago, the government has also approved three new solar farms and reversed a ban on onshore wind.

Labour Prime Minister Keir Starmer said in a speech Thursday that "I don't just want to be in the race for clean energy; I want us to win the race for clean energy," according to an article by BNN Bloomberg: Thursday's announcement marks the first concrete step by the government to use Great British Energy in its quest for a zero-carbon electric grid by 2030. The collaboration with the Crown Estate, owners of the UK's seabed, means the public sector will get involved in projects earlier and may attract more private funding... Great British Energy is receiving £8.3 billion of taxpayer money to own and operate assets in collaboration with the private sector.

The article points out that "By allowing borrowing, the government believes 20-30 gigawatts of new offshore wind seabed leases can be secured by 2030."

As Prime Minister Keir Starmer said in his speech, "We've got the potential, we've got the ports, we've got the people, the skills."

NASA

Proposed NASA Budget Cuts Would End Chandra X-Ray Observatory (spacenews.com) 81

A NASA committee determined that the Chandra X-ray Observatory would have to cease operations under the proposed budget cuts in NASA's 2025 budget. The committee reviewed various options but found that only shutting down Chandra fit within the proposed budget, although alternatives could keep the observatory running with limited capabilities. SpaceNews reports: NASA established the Operations Paradigm Change Review (OPCR) committee this spring to look at ways of reducing the costs of operating Chandra and the Hubble Space Telescope as part of broader efforts to deal with a billion-dollar shortfall in agency science funding. The fiscal year 2025 budget proposal included a 40% cut in Chandra's budget, with further reductions through 2029, while cutting Hubble's budget by 10% in 2025. Astronomers strongly opposed the proposed cuts, particularly for Chandra. They argued that the reductions would effectively shut down the telescope, a conclusion backed by Patrick Slane, director of the Chandra X-Ray Center, in an open letter shortly after the release of the budget proposal.

The OPCR concurred. "The committee agreed that the continuation of a scientifically viable Chandra mission is not possible within the funding guidance," said Rob Kennicutt, an astronomer from the University of Arizona and Texas A&M University who served on the review committee, in a July 23 presentation at a meeting of the Astrophysics Advisory Committee, or APAC. "This is a serious threat to the observatory." Shutting down Chandra was one of four options presented to the OPCR by the Chandra team and the only one, he said, that fit within NASA's proposed budget profile. Three others would keep Chandra going with reduced capabilities and with budgets higher than what NASA proposed but below current levels. "We think it's possible to run Chandra for less money" than today, he said, "but more than what they were given."

Microsoft

World of Warcraft Developers Form Blizzard's Largest and Most Inclusive Union (theverge.com) 37

Ash Parrish reports via The Verge: More than 500 developers at Blizzard Entertainment who work on World of Warcraft have voted to form a union. The World of Warcraft GameMakers Guild, formed with the assistance of the Communication Workers of America (CWA), is composed of employees across every department, including designers, engineers, artists, producers, and more. Together, they have formed the largest wall-to-wall union -- or a union inclusive of multiple departments and disciplines -- at Microsoft. This news comes less than a week after the formation of the Bethesda Game Studios union, which, at the time of the announcement, was itself the largest wall-to-wall Microsoft union. [...]

The World of Warcraft GameMakers Guild is made up of over 500 members across Blizzard offices in California and Massachusetts. Despite its size -- it is the second largest union at Microsoft overall behind Activision's 600-member QA union -- [Paul Cox, senior quest designer and Blizzard veteran] said that Microsoft's labor neutrality agreement helped get the organizing ball rolling.

In a statement to The Verge, Microsoft spokesperson Delaney Simmons said, "We continue to support our employees' right to choose how they are represented in the workplace, and we will engage in good faith negotiations with the CWA as we work towards a collective bargaining agreement."
