Robotics

Nvidia's New 'Robot Brain' Goes On Sale (cnbc.com) 33

Nvidia has launched its Jetson AGX Thor robotics chip module, a $3,499 "robot brain" developer kit that starts shipping next month. CNBC reports: After a company uses the developer kit to prototype their robot, Nvidia will sell Thor T5000 modules that can be installed in production-ready robots. If a company needs more than 1,000 Thor chips, Nvidia will charge $2,999 per module. CEO Jensen Huang has said robotics is the company's largest growth opportunity outside of artificial intelligence, which has led to Nvidia's overall sales more than tripling in the past two years. "We do not build robots, we do not build cars, but we enable the whole industry with our infrastructure computers and the associated software," said Deepu Talla, Nvidia's vice president of robotics and edge AI, on a call with reporters Friday.

The Jetson Thor chips are based on a Blackwell graphics processor, which is Nvidia's current generation of technology used in its AI chips, as well as its chips for computer games. Nvidia said that its Jetson Thor chips are 7.5 times faster than its previous generation. That allows them to run generative AI models, including large language models and visual models that can interpret the world around them, which is essential for humanoid robots, Nvidia said. The Jetson Thor chips are equipped with 128GB of memory, which is essential for big AI models. [...] The company said its Jetson Thor chips can be used for self-driving cars as well, especially those from Chinese brands. Nvidia calls its car chips Drive AGX, and while they are similar to its robotics chips, they run an operating system called Drive OS that's been tuned for automotive purposes.

Operating Systems

Linux 6.16 Brings Faster File Systems, Improved Confidential Memory Support, and More Rust Support (zdnet.com) 50

ZDNet's Steven Vaughan-Nichols shares his list of "what's new and improved" in the latest Linux 6.16 kernel. An anonymous reader shares an excerpt from the report: First, the Rust language continues to become better integrated into the kernel. At the top of my list is that the kernel now boasts Rust bindings for the driver core and PCI device subsystem. This approach will make it easier to add new Rust-based hardware drivers to Linux. Additionally, new Rust abstractions have been integrated into the Direct Rendering Manager (DRM), particularly for ioctl handling, file/GEM memory management, and driver/device infrastructure for major GPU vendors, such as AMD, Nvidia, and Intel. These changes should reduce vulnerabilities and optimize graphics performance. This will make gamers and AI/ML developers happier.
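For a flavor of what kernel-side Rust looks like, here is a minimal module skeleton in the style of the existing Rust-for-Linux samples. This is a sketch only: the module and type names are illustrative, it builds against the in-tree `kernel` crate (not anything from crates.io), and real driver-core or PCI bindings involve considerably more plumbing.

```rust
// Minimal sketch of a Rust kernel module, patterned on the in-tree
// samples/rust examples; builds only inside a Rust-enabled kernel tree.
use kernel::prelude::*;

module! {
    type: HelloModule,                    // illustrative type name
    name: "hello_module",
    author: "Example Author",
    description: "Minimal Rust kernel module sketch",
    license: "GPL",
}

struct HelloModule;

impl kernel::Module for HelloModule {
    fn init(_module: &'static ThisModule) -> Result<Self> {
        pr_info!("hello_module loaded\n");
        Ok(HelloModule)
    }
}

impl Drop for HelloModule {
    fn drop(&mut self) {
        pr_info!("hello_module unloaded\n");
    }
}
```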

Linux 6.16 also brings general improvements to Rust crate support. (A crate is Rust's unit of packaging and compilation.) This will make it easier to build, maintain, and integrate Rust kernel modules into the kernel. For those of you who still love C, don't worry. The vast majority of kernel code remains in C, and Rust is unlikely to replace C soon. In a decade, we may be telling another story. Beyond Rust, this latest release also comes with several major file system improvements. For starters, the XFS filesystem now supports large atomic writes: large multi-block write operations either update all blocks or none of them. This enhances data integrity and prevents torn writes. This move is significant for companies that use XFS for databases and large-scale storage.
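At the system-call level, atomic writes are requested with pwritev2(2) and the RWF_ATOMIC flag. A hedged sketch in Rust via the libc crate: the path is hypothetical, the RWF_ATOMIC constant is defined by hand since the libc crate may not export it yet (value from include/uapi/linux/fs.h), and the call only succeeds on a kernel/filesystem/device combination that actually supports atomic writes.

```rust
// Sketch: requesting an all-or-nothing write with pwritev2(2) + RWF_ATOMIC.
// Unsupported setups fail with EINVAL or EOPNOTSUPP.
use std::fs::OpenOptions;
use std::os::unix::fs::OpenOptionsExt;
use std::os::unix::io::AsRawFd;

// Assumption: not yet exported by the libc crate; value from the Linux uapi.
const RWF_ATOMIC: libc::c_int = 0x0000_0040;

fn main() -> std::io::Result<()> {
    // Atomic writes require O_DIRECT, which requires block-aligned buffers.
    let file = OpenOptions::new()
        .write(true)
        .custom_flags(libc::O_DIRECT)
        .open("/mnt/xfs/dbfile")?; // hypothetical path on an XFS mount

    let len: usize = 16 * 1024; // must fit the FS-advertised atomic limits
    let mut buf: *mut libc::c_void = std::ptr::null_mut();
    unsafe {
        libc::posix_memalign(&mut buf, 4096, len);
        libc::memset(buf, 0, len);
    }

    let iov = libc::iovec { iov_base: buf, iov_len: len };

    // Either every block of this write is applied, or none are: no torn
    // write is ever visible after a crash.
    let n = unsafe { libc::pwritev2(file.as_raw_fd(), &iov, 1, 0, RWF_ATOMIC) };
    unsafe { libc::free(buf) };
    if n < 0 {
        return Err(std::io::Error::last_os_error());
    }
    Ok(())
}
```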

Perhaps the most popular Linux file system, Ext4, is also getting many improvements. These boosts include faster commit paths, large folio support, and atomic multi-fsblock writes for bigalloc filesystems. What these improvements mean, if you're not a file-system nerd, is that we should see speedups of up to 37% for sequential I/O workloads. If your Linux laptop doubles as a music player, another nice new feature is that you can now stream your audio over USB even while the rest of your system is asleep. That capability's been available in Android for a while, but now it's part of mainline Linux.

If security is a top priority for you, the 6.16 kernel now supports Intel Trusted Execution Technology (TXT) and Intel Trust Domain Extensions (TDX). This addition, along with Linux's improved support for AMD Secure Encrypted Virtualization-Secure Nested Paging (SEV-SNP), enables you to encrypt your software's memory in what's known as confidential computing. This feature improves cloud security by encrypting a user's virtual machine memory, meaning someone who compromises the cloud host still can't read your data.

Linux 6.16 also delivers several chip-related upgrades. It introduces support for Intel's Advanced Performance Extensions (APX), doubling x86 general-purpose registers from 16 to 32 and boosting performance on next-gen CPUs like Lunar Lake and Granite Rapids Xeon. Additionally, the new CONFIG_X86_NATIVE_CPU option allows users to build processor-optimized kernels for greater efficiency.

Support for Nvidia's AI-focused Blackwell GPUs has also been improved, and updates to TCP/IP with DMABUF help offload networking tasks to GPUs and accelerators. While these changes may go unnoticed by everyday users, high-performance systems will see gains and OpenVPN users may finally experience speeds that challenge WireGuard.

Firefox

Mozilla Ships WebGPU in Firefox 141, Catching Up To Chrome's 2023 Launch (wordpress.com) 20

Mozilla will ship WebGPU support in Firefox 141 when the browser launches July 22, bringing graphics processing capabilities that Chrome users have had since 2023. The initial release supports Windows only, with Mac, Linux, and Android planned for the coming months.

WebGPU provides web content direct access to graphics processors for high-performance computation and rendering in games and complex 3D applications. Chrome gained WebGPU support with version 113 in 2023, while Safari 26 is expected to add the feature this fall. Firefox's implementation uses the WGPU Rust crate, which translates web requests into native commands for Direct3D 12, Metal, or Vulkan.
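That WGPU crate is usable directly from Rust, which gives a feel for the backend-selection step Firefox now performs. A minimal sketch, assuming the wgpu and pollster crates and a wgpu release in which request_adapter returns an Option (the API shifts between versions):

```rust
// Probe the same WGPU crate Firefox's WebGPU support is built on.
fn main() {
    // An Instance picks a native backend: Direct3D 12, Metal, or Vulkan.
    let instance = wgpu::Instance::default();

    let adapter = pollster::block_on(
        instance.request_adapter(&wgpu::RequestAdapterOptions::default()),
    )
    .expect("no compatible GPU adapter found");

    // e.g. "NVIDIA GeForce ... via Vulkan" -- the same translation-layer
    // decision Firefox makes when a page calls navigator.gpu.
    let info = adapter.get_info();
    println!("{} via {:?}", info.name, info.backend);
}
```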

Software

Blender 4.5 LTS Released (nerds.xyz) 11

BrianFagioli shares a report from NERDS.xyz: Blender 4.5 has arrived and it's a long-term support release. That means users get two full years of updates and bug fixes, making it a smart choice for anyone looking for stability in serious projects. Whether you're a solo artist or part of a studio pipeline, this version is built to last. Here's a list of key features and changes in this release:

- Vulkan backend replaces OpenGL (faster, smoother UI)
- Adaptive subdivision up to 14x faster with multithreading
- New Geometry Nodes: Camera Info, Instance Bounds
- GPU-accelerated compositor nodes with standardized inputs
- New Boolean solver: Manifold (cleaner, faster mesh operations)
- UV maps visible in Object Mode + improved selection behavior
- Grease Pencil render pass and Geometry Nodes integration
- Improved file import support: PLY, OBJ, STL, CSV, VDB
- Deprecations: Collada, Big Endian, legacy .blend, Intel Mac support
- Cycles OptiX now requires NVIDIA driver v535+
- New shader variants for add-on developers (POLYLINES_*, POINT_*)
- ~500 bug fixes across all major systems

Graphics

Blender Studio Releases Free New Game 'Dogwalk' to Showcase Its Open Source Godot Game Engine (notebookcheck.net) 25

"Steam quietly welcomed another indie game this week, but this one is distinctly different for a lot of reasons," writes Notebookcheck: Dogwalk, which debuted on July 11, is the kind of short, gentle experience that almost forces you to smile. Developed by Blender Studio, the game introduces players to a gorgeous winter landscape. You play as a cute, fluffy dog, with a small child in tow...

What's particularly interesting here is that Dogwalk is more than just another charming indie project. It's Blender Studio's showcase for what's possible using fully open-source tools. The entire project — assets, animations, and code — is made with Blender and the popular Godot Game Engine. Unlike industry giants such as Unity or Unreal, Godot is completely open source, meaning it doesn't require developers to pay royalties or follow strict licensing agreements. This should make it great for small studios and independent creators, as it lowers the entry barrier to game creation.

Dogwalk is 100% free, which fits neatly into its open-source philosophy.

Earth

More Than Half of Carbon Credit Auditors Have Signed Off on 'Overclaimed' Benefits (science.org) 55

Can carbon-reducing projects "offset" a company's emissions? "The reality has been less encouraging," according to a Science magazine editorial by Cary Coglianese, a law/political science professor at the University of Pennsylvania, and Cynthia Giles, a former senior advisor at the U.S. Environmental Protection Agency. In a new paper published Wednesday, they found that more than half of all currently certified carbon auditors signed off on projects later found to be "overclaiming" carbon benefits.

Their conclusion? "Criticism should be directed not at individual auditors as much as the structure of the system that fosters these outcomes." Most carbon offset projects that have been closely scrutinized — including projects for forest protection, renewable energy, and methane-reducing methods of rice cultivation — have greatly exaggerated their climate benefits. More than 80% of issued credits might not reflect real emission reductions. This has alarmed potential offset purchasers and stalled carbon offset markets.

Efforts to resuscitate the beleaguered offset market tout third-party auditing as "essential" to ensuring credit integrity. That reliance is misplaced... [E]xtensive research from many contexts shows that auditors selected and paid by audited organizations often produce results skewed toward those entities' interests. A field experiment in India, for example, found that air and water pollution auditors who were randomly assigned and paid from a central fund reported emissions at levels 50 to 70% higher than auditors selected and paid by audited firms. Auditors — like all people — are subject to a well-established and largely unconscious cognitive phenomenon of self-serving bias, causing them to interpret evidence in favor of their clients...

[A]uditors have been required all along and have failed to prevent substantial credit overclaiming. It is rarely acknowledged that all of the credit overclaiming projects that have stirred so much controversy were ratified by third-party auditors under the same auditor selection and payment system that offset advocates rely on today... Auditors are unlikely to stay in business if they disapprove credits at the high rates that research suggests would be appropriate today...

Given the high planetary stakes in carbon policy choices being made now, it is past time to recognize that third-party auditors selected and paid by the audited organizations are not the bulwark for credit integrity they are claimed to be.

Bug

NVIDIA Warns Its High-End GPUs May Be Vulnerable to Rowhammer Attacks (nerds.xyz) 15

Slashdot reader BrianFagioli shared this report from Nerds.xyz: NVIDIA just put out a new security notice, and if you're running one of its powerful GPUs, you might want to pay attention. Researchers from the University of Toronto have shown that Rowhammer attacks, which are already known to affect regular DRAM, can now target GDDR6 memory on NVIDIA's high-end GPUs when ECC [error correction code] is not enabled.

They pulled this off using an A6000 card, and it worked because system-level ECC was turned off. Once it was switched on, the attack no longer worked. That tells you everything you need to know. ECC matters.

Rowhammer has been around for years. It's one of those weird memory bugs where repeatedly accessing one row in RAM can cause bits to flip in another row. Until now, this was mostly a CPU memory problem. But this research shows it can also be a GPU problem, and that should make data center admins and workstation users pause for a second.
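The classic CPU-side access pattern is simple enough to sketch. The code below is for illustration only: it shows traditional DRAM hammering from the CPU, not the GDDR6/GPU technique from the Toronto research, and the addresses are placeholders rather than real aggressor rows.

```rust
// Illustration of the classic Rowhammer access pattern: repeatedly read two
// "aggressor" addresses while flushing them from cache, forcing DRAM row
// activations. On vulnerable, non-ECC DRAM this can flip bits in a
// neighboring "victim" row; ECC exists to catch exactly these flips.
#[cfg(target_arch = "x86_64")]
unsafe fn hammer(aggr_a: *const u8, aggr_b: *const u8, iterations: u64) {
    use core::arch::x86_64::_mm_clflush;
    for _ in 0..iterations {
        // Volatile reads so the compiler can't optimize the loop away.
        let _ = core::ptr::read_volatile(aggr_a);
        let _ = core::ptr::read_volatile(aggr_b);
        // Flush both lines so every iteration hits DRAM, not the cache.
        _mm_clflush(aggr_a);
        _mm_clflush(aggr_b);
    }
}

fn main() {
    // Demo buffer; a real attack needs physically adjacent DRAM rows, which
    // safe user space can't choose -- this just exercises the loop.
    let buf = vec![0u8; 2 * 1024 * 1024];
    let a = &buf[0] as *const u8;
    let b = &buf[1024 * 1024] as *const u8;
    #[cfg(target_arch = "x86_64")]
    unsafe { hammer(a, b, 1_000_000) };
}
```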

NVIDIA is not sounding an alarm so much as reminding everyone that protections are already in place, but only if you're using the hardware properly. The company recommends enabling ECC if your GPU supports it. That includes cards in the Blackwell, Hopper, Ada, and Ampere lines, along with others used in DGX, HGX, and Jetson systems. It also includes popular workstation cards like the RTX A6000.

There's also built-in On-Die ECC in certain newer memory types like GDDR7 and HBM3. If you're lucky enough to be using a card that has it, you're automatically protected to some extent, because OD-ECC can't be turned off. It's always working in the background. But let's be real. A lot of people skip ECC because it can impact performance or because they're running a setup that doesn't make it obvious whether ECC is on or off. If you're not sure where you stand, it's time to check. NVIDIA suggests using tools like nvidia-smi or, if you're in a managed enterprise setup, working with your system's BMC or Redfish APIs to verify settings.
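A quick way to script that check is to shell out to nvidia-smi and read the reported ECC mode; `nvidia-smi -q -d ECC` prints the current and pending state per GPU. A rough sketch in Rust, with deliberately loose parsing:

```rust
// ECC sanity check via the nvidia-smi CLI that NVIDIA points admins at.
use std::process::Command;

fn main() -> std::io::Result<()> {
    let out = Command::new("nvidia-smi").args(["-q", "-d", "ECC"]).output()?;
    let text = String::from_utf8_lossy(&out.stdout);
    for line in text.lines() {
        let l = line.trim();
        // The ECC Mode section reports lines like "Current : Enabled".
        if l.starts_with("Current") || l.starts_with("Pending") {
            println!("{l}");
        }
    }
    Ok(())
}
```

Where the hardware supports it, ECC can be toggled with `nvidia-smi -e 1`; the change takes effect after a reboot.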

Businesses

Nvidia Hits $4 Trillion Market Cap, First Company To Do So (cnbc.com) 25

Nvidia shares jumped more than 2% on Wednesday, topping a $4 trillion market cap for the first time as investors scooped up stock in the tech giant building the hardware for the generative AI boom. From a report: The chipmaker is the first company to ever achieve this market value. Nvidia is the world's most valuable company, surpassing Microsoft and Apple, both of which hit the $3 trillion mark before Nvidia. Microsoft is also one of Nvidia's biggest and most important customers. The California-based company, which was founded in 1993, first passed the $2 trillion mark in February 2024, and surpassed $3 trillion in June.

Nvidia has profited heavily off of growing demand for artificial intelligence hardware and chips since the launch of ChatGPT in late 2022. The company has positioned itself as the decisive leader in creating the graphics processing units that power large language models.

Graphics

Graphics Artists In China Push Back On AI and Its Averaging Effect (theverge.com) 33

Graphic artists in China are pushing back against AI image generators, which they say "profoundly shifts clients' perception of their work, specifically in terms of how much that work costs and how much time it takes to produce," reports The Verge. "Freelance artists or designers working in industries with clients that invest in stylized, eye-catching graphics, like advertising, are particularly at risk." From the report: Long before AI image generators became popular, graphic designers at major tech companies and in-house designers for large corporate clients were often instructed by managers to crib aesthetics from competitors or from social media, according to one employee at a major online shopping platform in China, who asked to remain anonymous for fear of retaliation from their employer. Where a human would need to understand and reverse engineer a distinctive style to recreate it, AI image generators simply create randomized mutations of it. Often, the results will look like obvious copies and include errors, but other graphic designers can then edit them into a final product.

"I think it'd be easier to replace me if I didn't embrace [AI]," the shopping platform employee says. Early on, as tools like Stable Diffusion and Midjourney became more popular, their colleagues who spoke English well were selected to study AI image generators to increase in-house expertise on how to write successful prompts and identify what types of tasks AI was useful for. Ultimately, it was useful for copying styles from popular artists that, in the past, would take more time to study. "I think it forces both designers and clients to rethink the value of designers," Jia says. "Is it just about producing a design? Or is it about consultation, creativity, strategy, direction, and aesthetic?" [...]

Across the board, though, artists and designers say that AI hype has negatively impacted clients' view of their work's value. Now, clients expect a graphic designer to produce work on a shorter timeframe and for less money, which also has its own averaging impact, lowering the ceiling for what designers can deliver. As clients lower budgets and squish timelines, the quality of the designers' output decreases. "There is now a significant misperception about the workload of designers," [says Erbing, a graphic designer in Beijing who has worked with several ad agencies and asked to be called by his nickname]. "Some clients think that since AI must have improved efficiency, they can halve their budget." But this perception runs contrary to what designers spend the majority of their time doing, which is not necessarily just making any image, Erbing says.

Software

Blender 5.0 Introducing HDR Support On Linux With Vulkan + Wayland (phoronix.com) 24

Michael Larabel writes via Phoronix: The upcoming Blender 5.0 3D modeling software application is introducing High Dynamic Range (HDR) display support on Linux when making use of Wayland -- no X11 support for HDR -- and Vulkan graphics acceleration. HDR support for Blender 5.0 on Linux is currently considered experimental. Enabling it requires having a High Dynamic Range display (of course), running on a Wayland desktop, enabling Vulkan API acceleration rather than OpenGL, and turning on the experimental feature. Additional details can be found via this Blender DevTalk thread.

Windows

Games Run Faster On SteamOS Than Windows 11, Ars Testing Finds (arstechnica.com) 102

An anonymous reader quotes a report from Ars Technica: Nearly a decade ago, Ars testing found that Valve's "Steam Machines"-era version of SteamOS performed significantly worse than Windows when SteamOS's Linux game ports were tested on the same hardware as their Windows counterparts. Today, though, Ars testing on the Lenovo Legion Go S finds recent games generally run at higher frame rates on SteamOS 3.7 than on Windows 11. [...]

As you can see in the included charts, SteamOS showed noticeable frame rate improvements in four of the five games tested. Only Borderlands 3 showed comparable performance across both operating systems, with Windows eking out ever-so-slightly higher frame rates in that game's benchmarks. For the other four tested games, the stock Lenovo Windows drivers were sometimes significantly worse than those included with SteamOS. When playing Returnal at "High" graphics presets and 1920x1200 resolution, for instance, changing from Lenovo's Windows drivers to SteamOS meant the difference between a hard-to-take 18 FPS average and a downright decent 33 FPS average. Sideloading the updated Asus drivers showed a noticeable improvement in Windows performance across all tested games and even brought Homeworld 3's "Low" graphics benchmark test to practical parity with SteamOS. In all other cases, though, even these updated drivers resulted in benchmark frame rates anywhere from 8 percent to 36 percent lower than those same benchmarks on SteamOS.

These results might seem a bit counterintuitive, considering that games running on SteamOS must go through a Proton translation layer for every native Windows instruction in a game's code. But Valve has put in consistent work over the years to make Proton as efficient and cross-compatible as possible; not to mention its continued work on Linux's Mesa graphics drivers seems to be paying dividends for SteamOS graphics performance. Running SteamOS also means eliminating a lot of operating system overhead that the more generalist Windows uses by default. Microsoft seems aware of this issue for gamers and has recently announced that the upcoming "Xbox Experience for Handheld" will "minimize background activity and defer non-essential tasks" to allow for "more [and] higher framerates" in games.

Ubuntu

Ubuntu To Disable Intel Graphics Security Mitigations To Boost GPU Performance By Up To 20% (arstechnica.com) 15

Disabling Intel graphics security mitigations in GPU compute stacks for OpenCL and Level Zero can yield a performance boost of up to 20%, prompting Ubuntu's Canonical and Intel to disable these mitigations in future Ubuntu packages. Phoronix's Michael Larabel reports: Intel does allow building their GPU compute stack without these mitigations by using the "NEO_DISABLE_MITIGATIONS" build option, and that is what Canonical is now looking to set for Ubuntu packages to avoid the significant performance impact. This work will likely all be addressed in time for Ubuntu 25.10. The NEO_DISABLE_MITIGATIONS option applies only to compiling the Intel Compute Runtime stack and doesn't impact the Linux kernel security mitigations or anything else outside of Intel's "NEO" GPU compute stack. Both Intel and Canonical are in agreement with this move, and it turns out that even Intel's GitHub binary packages for their Compute Runtime for OpenCL and Level Zero ship with the mitigations disabled due to the performance impact. This Ubuntu Launchpad bug report for the Intel Compute Runtime notes some of the key takeaways. There is also a PPA where Ubuntu developers are currently testing their Compute Runtime builds with NEO_DISABLE_MITIGATIONS set, disabling the mitigations.

KDE

KDE Plasma 6.4 Released (kde.org) 29

Longtime Slashdot reader jrepin writes: Plasma is a popular desktop (and mobile) environment for GNU/Linux and other UNIX-like operating systems. Among other things, it also powers the desktop mode of the Steam Deck gaming handheld. The KDE community today announced the latest release: Plasma 6.4. This fresh new release improves on nearly every front, with progress being made in accessibility, color rendering, tablet support, window management, and more.

Plasma already offered virtual desktops and customizable tiles to help organize your windows and activities, and now it lets you choose a different configuration of tiles on each virtual desktop. The Wayland session brings some new accessibility features: you can now move the pointer using your keyboard's number pad keys, or use a three-finger touchpad pinch gesture to zoom in or out.

Plasma's file transfer notification now shows a speed graph, giving you a more visual idea of how fast the transfer is going and how long it will take to complete. When any application is in full-screen mode, Plasma will now enter Do Not Disturb mode and only show urgent notifications. When you exit full-screen mode, you'll see a summary of any notifications you missed.

Now, when an application tries to access the microphone and finds it muted, a notification will pop up. A new feature in the Application Launcher widget will place a green New! tag next to newly installed apps, so you can easily find where something you just installed lives in the menu.

The Display and Monitor page in System Settings comes with a brand new HDR calibration wizard. Support for Extended Dynamic Range (a different kind of HDR) and P010 video color format has also been added. System Monitor now supports usage monitoring for AMD and Intel graphics cards -- it can even show the GPU usage on a per-process basis.

Spectacle, the built-in app for taking screenshots and screen recordings, has a much-improved design and more streamlined functionality. The background of the desktop or window now darkens when an authentication dialog shows up, helping you locate and focus on the window asking for your password.

There's a brand-new Animations page in System Settings that groups all the settings for purely visual animated effects into one place, making them easier to find and configure. Aurorae, a newly added SVG vector graphics theme engine, enhances KWin window decorations.

You can read more about these and many other features in the Plasma 6.4 announcement and complete changelog.

Nintendo

Nintendo Switch 2 Has Record-Breaking Launch, Selling Over 3 Million Units (barrons.com) 48

TweakTown writes that the Switch 2 "has reportedly beaten the record for the most-sold console within 24 hours and is on track to shatter the two-month record," selling over 3 million units and tripling the PlayStation 4's previous launch day sales.

So Nintendo's first console in 8 years becomes "one of the most successful hardware releases of all time," writes Barron's, raising hopes for the future: [2017's original Switch] ultimately sold more than 152 million units... Switch 2's big advantage is its backward compatibility, allowing it to play current-generation Switch games and giving gamers solace that their large investments in software are intact... Many older Switch games also play better on the Switch 2, taking advantage of the extra horsepower.

Bloomberg writes that its bigger screen and faster chip "live up to the hype": "Despite the hype and a $150 increase over the launch price for the original, the second-generation system manages to impress with faster performance, improved graphics, more comfortable ergonomics and enough tweaks throughout to make this feel like a distinctly new machine... This time, it's capable of outputting 4K resolution and more impactful HDR video to your TV screen... It's a bigger, faster, more polished version of a wildly successful gadget."

The "buzzy launch drew long lines" at retailers like Walmart, Target, Best Buy, and GameStop, according to the article. (See the photos from AOL.com and USA Today.) "The era of spending hours waiting in line for the latest iPhone is long gone, but the debut of a new video game console is still a rare enough event that Nintendo fans didn't think twice about driving to retailers in the middle of the night to secure a Switch 2."

The Verge also opines that "the Switch 2's eShop is much better," calling it "way faster... with much less lag browsing through sections and loading up game pages."

Or, as Barron's puts it, "Ultimately, Nintendo is winning because it has a different strategy than its competition, the Sony PlayStation and Microsoft Xbox. Instead of trying to appeal to tech snobs like me, who are obsessed with graphics resolution and hardware statistics like teraflops, Nintendo focuses on joy and fun."

Hardware

Polish Engineer Creates Postage Stamp-Sized 1980s Atari Computer (arstechnica.com) 32

Ars Technica's Benj Edwards reports: In 1979, Atari released the Atari 400 and 800, groundbreaking home computers that included custom graphics and sound chips, four joystick ports, and the ability to run the most advanced home video games of their era. These machines, which retailed for $549 and $999, respectively, represented a leap in consumer-friendly personal computing, with their modular design and serial I/O bus that presaged USB. Now, 46 years later, a hobbyist has shrunk down the system hardware to a size that would have seemed like science fiction in the 1970s.

Polish engineer Piotr "Osa" Ostapowicz recently unveiled "Atarino," which may be the world's smallest 8-bit Atari computer re-creation, according to retro computing site Atariteca. The entire system -- processor, graphics chips, sound hardware, and memory controllers -- fits on a module measuring just 2x1.5 centimeters (about 0.79x0.59 inches), which is roughly the size of a postage stamp.

Ostapowicz's creation reimplements the classic Atari XL/XE architecture using modern FPGA (field-programmable gate array) technology. Unlike software emulators that simulate old hardware on a complete computer system of another architecture (and modern recreations that run them, like the Atari 400 Mini console), Atarino reproduces the original Atari components faithfully at the logic level, allowing it to run vintage software while maintaining compatibility with original peripherals. [...] The project, which began over a decade ago and was first publicly demonstrated in December 2023, packs a 6502C processor, ANTIC and GTIA graphics chips, a POKEY sound chip, and memory controllers onto a single Lattice UP5K FPGA chip. Despite its tiny size, the system can run at clock speeds up to 31 MHz -- far faster than the original hardware's 1.79 MHz.

While the Atarino can run vintage software and work with the original peripherals, it brings several key improvements -- including a modernized 6502 core with added instructions, a more efficient memory architecture, enhanced video output via VGA and HDMI, extended graphics modes, refined sound chip emulation, modular hardware design, support for modern connectivity like Wi-Fi and Ethernet, and compatibility with contemporary development tools like CC65 and Visual Studio Code.

Ostapowicz "plans to release complete kits with documentation, inviting the retrocomputing community to experiment with the hardware," adds Edwards.

Open Source

SerenityOS Creator Is Building an Independent, Standards-First Browser Called 'Ladybird' (thenewstack.io) 40

A year ago, the original creator of SerenityOS posted that "for the past two years, I've been almost entirely focused on Ladybird, a new web browser that started as a simple HTML viewer for SerenityOS." So it became a stand-alone project that "aims to render the modern web with good performance, stability and security." And they're also building a new web engine.

"We are building a brand-new browser from scratch, backed by a non-profit..." says Ladybird's official web site, adding that they're driven "by a web standards first approach." They promise it will be truly independent, with "no code from other browsers" (and no "default search engine" deals).

"We are targeting Summer 2026 for a first Alpha version on Linux and macOS. This will be aimed at developers and early adopters." More from the Ladybird FAQ: We currently have 7 paid full-time engineers working on Ladybird. There is also a large community of volunteer contributors... The focus of the Ladybird project is to build a new browser engine from the ground up. We don't use code from Blink, WebKit, Gecko, or any other browser engine...

For historical reasons, the browser uses various libraries from the SerenityOS project, which has a strong culture of writing everything from scratch. Now that Ladybird has forked from SerenityOS, it is no longer bound by this culture, and we will be making use of 3rd party libraries for common functionality (e.g., image/audio/video formats, encryption, graphics, etc.) We are already using some of the same 3rd party libraries that other browsers use, but we will never adopt another browser engine instead of building our own...

We don't have anyone actively working on Windows support, and there are considerable changes required to make it work well outside a Unix-like environment. We would like to do Windows eventually, but it's not a priority at the moment.

"Ladybird's founder Andreas Kling has a solid background in WebKit-based C++ development with both Apple and Nokia,," writes software developer/author David Eastman: "You are likely reading this on a browser that is slightly faster because of my work," he wrote on his blog's introduction page. After leaving Apple, clearly burnt out, Kling found himself in need of something to healthily occupy his time. He could have chosen to learn needlepoint, but instead he opted to build his own operating system, called Serenity. Ladybird is a web project spin-off from this, to which Kling now devotes his time...

[B]eyond the extensive open source politics, the main reason for supporting other independent browser projects is to maintain diverse alternatives — to prevent the web platform from being entirely captured by one company. This is where Ladybird comes in. It doesn't have any commercial foundation and it doesn't seem to be waiting to grab a commercial opportunity. It has a range of sponsors, some of which might be strategic (for example, Shopify), but most are goodwill or alignment-led. If you sponsor Ladybird, it will put your logo on its webpage and say thank you. That's it. This might seem uncontroversial, but other nonprofit organisations also give board seats to high-paying sponsors. Ladybird explicitly refuses to do this...

The Acid3 Browser test (which has nothing whatsoever to do with ACID compliance in databases) is an old method of checking compliance with web standards, but vendors can still check how their products do against a battery of tests. They check compliance for the DOM2, CSS3, HTML4 and the other standards that make sure that webpages work in a predictable way. If I point my Chrome browser on my MacBook to http://acid3.acidtests.org/, it gets 94/100. Safari does a bit better, getting to 97/100. Ladybird reportedly passes all 100 tests.

"All the code is hosted on GitHub," says the Ladybird home page. "Clone it, build it, and join our Discord if you want to collaborate on it!"

Operating Systems

Valve Adds SteamOS Support For Its Steam Deck Rivals (polygon.com) 24

Valve's SteamOS 3.7.8 update brings official support for AMD-powered handhelds like Lenovo's Legion Go and Asus' ROG Ally, along with a new "Steam OS Compatible" library tab and key bug fixes. Other features include a battery charge limit, updated graphics drivers, and a shift to Plasma 6.2.5. Polygon reports: Valve outlines two requirements for the third-party devices not explicitly named in the update to run SteamOS on the handheld: they must be AMD-powered and have an NVMe SSD. Specific instructions for installing the operating system have been updated and listed here.

Before this huge update, players had to use an alternative like Bazzite to achieve a similar SteamOS experience on their devices. The new update also piggybacks off of Valve expanding the Steam Deck Verified categorization system to "any device running SteamOS that's not a Steam Deck" in mid-May. To make matters sweeter, a SteamOS-powered version of the Lenovo Legion Go S is scheduled to release on May 25.

You can learn more about SteamOS 3.7.8 here.

Graphics

Nvidia's RTX 5060 Review Debacle Should Be a Wake-Up Call (theverge.com) 67

Nvidia is facing backlash for allegedly manipulating the review process of its GeForce RTX 5060 GPU by withholding drivers, selectively granting early access to favorable reviewers, and pressuring media to present the card in a positive light. As The Verge's Sean Hollister writes, the debacle "should be a wake-up call for gamers and reviewers." Here's an excerpt from the report: Nvidia has gone too far. This week, the company reportedly attempted to delay, derail, and manipulate reviews of its $299 GeForce RTX 5060 graphics card, which would normally be its bestselling GPU of the generation. Nvidia has repeatedly and publicly said the budget 60-series cards are its most popular, and this year it reportedly tried to ensure it by withholding access and pressuring reviewers to paint them in the best light possible.

Nvidia might have wanted to prevent a repeat of 2023, when it launched this card's predecessor. Those reviews were harsh. The 4060 was called a "slap in the face to gamers" and a "wet fart of a GPU." I had guessed the 5060 was headed for the same fate after seeing how reviewers handled the 5080, which similarly showcased how little Nvidia's hardware has improved year over year and relies on software to make up the gaps. But Nvidia had other plans. Here are the tactics that Nvidia reportedly just used to throw us off the 5060's true scent, as individually described by GamersNexus, VideoCardz, Hardware Unboxed, GameStar.de, Digital Foundry, and more:

- Nvidia decided to launch its RTX 5060 on May 19th, when most reviewers would be at Computex in Taipei, Taiwan, rather than at their test beds at home.
- Even if reviewers already had a GPU in hand before then, Nvidia cut off most reviewers' ability to test the RTX 5060 before May 19th by refusing to provide drivers until the card went on sale. (Gaming GPUs don't really work without them.)
- And yet Nvidia allowed specific, cherry-picked reviewers to have early drivers anyhow if they agreed to a borderline unethical deal: they could only test five specific games, at 1080p resolution, with fixed graphics settings, against two weaker GPUs (the 3060 and 2060 Super) where the new card would be sure to win.
- In some cases, Nvidia threatened to withhold future access unless reviewers published apples-to-oranges benchmark charts showing how the RTX 5060's "fake frames" MFG tech can produce more frames than earlier GPUs without it.

Some reviewers apparently took Nvidia up on that proposition, leading to day-one "previews" where the charts looked positively stacked in the 5060's favor [...]. But the reality, according to reviews that have since hit the web, is that the RTX 5060 often fails to beat a four-year-old RTX 3060 Ti, frequently fails to beat a four-year-old 3070, and can sometimes get upstaged by Intel's cheaper $250 B580. And yet, the 5060's lackluster improvements are overshadowed by a juicier story: inexplicably, Nvidia decided to threaten GamersNexus' future access over its GPU coverage. Yes, the same GamersNexus that's developed a staunch reputation for defending consumers from predatory behavior, and just last month published a report on "GPU shrinkflation" that accused Nvidia of misleading marketing. Bad move! [...]

Nvidia is within its rights to withhold access, of course. Nvidia doesn't have to send out graphics cards or grant interviews. It'll only do it if it's good for business. But the unspoken covenant of product reviews is that the press, as a whole, gets a chance to warn the public if a movie, video game, or GPU is not worth their money. It works both ways: the media also gets the chance to warn that a product is so good you might want to line up in advance. That unspoken rule is what Nvidia is trampling here.

China

China's 7-Year Tech Independence Push Yields Major Gains in AI, Robotics and Semiconductors (msn.com) 84

China has achieved substantial technological advances across robotics, AI, and semiconductor manufacturing as part of a seven-year self-reliance campaign that has tripled the country's research and development spending to $500 billion annually.

Chinese robot manufacturers captured nearly half of their domestic market by 2023, up from a quarter of installations just years earlier, while AI startups now rival OpenAI and Google in capabilities. The progress extends to semiconductors, where Huawei released a high-end smartphone powered by what industry analysts believe was a locally-produced advanced processor, despite U.S. export controls targeting China's chip access.

Morgan Stanley projects China's self-sufficiency in graphics processing units will jump from 11% in 2021 to 82% by 2027. Chinese companies have been purchasing as many industrial robots as the rest of the world combined, enabling highly automated factories that can operate in darkness. In space technology, Chinese firms won five of 11 gold medals when U.S. think tanks ranked the world's best commercial satellite systems last year, compared to four for American companies.

Hardware

Nvidia Reportedly Raises GPU Prices by 10-15% (tomshardware.com) 63

An anonymous reader shares a report: A new report claims that Nvidia has recently raised the official prices of nearly all of its products to combat the impact of tariffs and surging manufacturing costs on its business, with gaming graphics cards receiving a 5 to 10% hike while AI GPUs see up to a 15% increase.

As reported by Digitimes Taiwan, Nvidia is facing "multiple crises," including a $5.5 billion hit to its quarterly earnings over export restrictions on AI chips, including a ban on sales of its H20 chips to China.

Digitimes reports that CEO Jensen Huang has been "shuttling back and forth" between the US and China to minimize the impact of tariffs, and that "in order to maintain stable profitability," Nvidia has reportedly recently raised official prices for almost all its products, allowing its partners to increase prices accordingly.
