AI

Figma's Big AI Update Takes On Adobe, WordPress, and Canva

At its Config 2025 event on Wednesday, Figma unveiled four new AI-powered tools -- Sites, Make, Buzz, and Draw -- positioning itself as a full-stack design platform to rival Adobe, WordPress, and Canva. These tools enable users to build websites, generate code, create marketing content, and design vector graphics without leaving the Figma ecosystem. The Verge reports: Figma's first solution is Figma Sites, a website builder that integrates with Figma Design and allows creators to turn their projects into live, functional sites. Figma Sites provides presets for layouts, blocks, templates, and interactions that aim to make building websites less complex and time-consuming. Additional components like custom animations can also be added either using existing code or by prompting Sites' AI tool to generate new interaction code via text descriptions, such as "animate the text to fall into place like a feather." Figma Sites is rolling out in beta for users with full seat access to Figma products. Figma says that AI code generation will be available "in the coming weeks," and that a CMS that allows designers to manage site content will be launched "later this year."

Figma Make is Figma's take on AI coding tools like Google's Gemini Code Assist and Microsoft's GitHub Copilot. The prompt-to-code Figma Make tool is powered by Anthropic's Claude 3.7 model and can build working prototypes and apps based on descriptions or existing designs, such as creating a functional music player that displays a disc that spins when new tracks are played. Specific elements of a working design, like text formatting and font style, can be manually edited or adjusted using additional AI prompts. Make is rolling out in beta for full seat Figma users. Figma says it's "exploring integrations with third parties and design systems" for Figma Make and may apply the tool to other apps within its design platform.

Figma Buzz is a marketing-focused design app that's rolling out in beta to all users, and makes it easier for teams to publish brand content, similar to Canva's product design platform. The tool allows Figma designers to create brand-approved templates, styles, and assets that can be used by marketers to quickly assemble emails, social media posts, advertising, and more. Figma Buzz includes generative AI tools for making and editing images using text prompts, and can source information from spreadsheets to bulk create thousands of image assets at once.

Lastly, the Figma Draw vector design app is like a simplified version of Adobe Illustrator that creatives can use to make custom visuals without leaving the Figma platform. It includes a variety of brushes, texture effects, and vector editing tools to create or adjust scalable images and logos for product design projects. Figma Draw is generally available now for full seat users as a toggle in Figma Design, with some features accessible in Sites, Slides, and Buzz. It's not quite as expansive as Adobe's wider Creative Cloud ecosystem, but Figma Draw places the two companies in direct competition for the first time since Adobe killed its own XD product design platform. It also brings some new options to the creative software industry after Adobe failed to acquire Figma for $20 billion due to pressure from competition regulators.
Intel

Intel Says It's Rolling Out Laptop GPU Drivers With 10% To 25% Better Performance (arstechnica.com)

Ars Technica's Andrew Cunningham reports: Intel's oddball Core Ultra 200V laptop chips -- codenamed Lunar Lake -- will apparently be a one-off experiment, not to be replicated in future Intel laptop chips. They're Intel's only processors with memory integrated onto the CPU package; the only ones with a neural processing unit that meets Microsoft's Copilot+ performance requirements; and the only ones with Intel's best-performing integrated GPUs, the Intel Arc 130V and 140V.

Today, Intel announced some updates to its graphics driver that specifically benefit those integrated GPUs, welcome news for anyone who bought one and is trying to get by with it as an entry-level gaming system. Intel says that version 32.0.101.6734 of its graphics driver can speed up average frame rates in some games by around 10 percent, and can speed up "1 percent low FPS" (that is, the average frame rate during the slowest 1 percent of frames) by as much as 25 percent. This should, in theory, make games run better in general and ease some of the stuttering you notice when your game's performance dips down to that 1 percent level.
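As a back-of-the-envelope illustration (not Intel's benchmarking methodology), both metrics can be computed from a log of per-frame render times; the frame-time trace below is made up:

```python
def fps_metrics(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of per-frame times in ms."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    # "1% low": the average frame rate over the slowest 1 percent of frames.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    slowest = worst[:n]
    one_pct_low = 1000.0 / (sum(slowest) / len(slowest))
    return avg_fps, one_pct_low

# Made-up trace: ninety-nine ~60 FPS frames plus one 50 ms stutter.
trace = [16.7] * 99 + [50.0]
avg, low = fps_metrics(trace)
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS")  # avg: 58.7 FPS, 1% low: 20.0 FPS
```

A single stutter barely moves the average, which is why the 1% low figure is the better proxy for perceived smoothness.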

Operating Systems

OpenBSD 7.7 Released (openbsd.org)

Longtime Slashdot reader me34point5 writes: OpenBSD quietly released the new version (7.7) of its "secure by default" operating system. This is the 58th release. Changes include improved hardware and VMM support, along with many kernel improvements. This release brings several specific improvements, including performance boosts on ARM64, Arm SVE support, AMD SEV virtualization enhancements, better low-memory handling on i386, and improved suspend/hibernate and SMP performance. It also updates graphics drivers with support for AMD Ryzen AI 300, Radeon RX 9070, and Intel Arrow Lake, along with expanded hardware support for MediaTek SoCs.

A full list of changes can be found here.
Education

Top Colleges Are Too Costly Even for Parents Making $300,000 (bloomberg.com)

Families earning $300,000 annually -- placing them among America's highest earners -- are increasingly finding themselves unable to afford elite college tuition without taking on substantial debt. Bloomberg's analysis of financial aid data from 50 selective colleges reveals households earning between $100,000 and $300,000 occupy a precarious middle ground: too affluent for meaningful aid but insufficiently wealthy to absorb annual costs approaching $100,000.

The squeeze begins around $150,000 income, where families typically contribute 20% ($30,000) annually toward tuition. At $270,000 income, expected contributions reach $61,000 per year. Most institutions eliminate financial aid entirely at approximately $400,000 income. Harvard, MIT, and the University of Pennsylvania recently expanded free tuition thresholds to $200,000, acknowledging this middle-class pressure. The changes take effect for 2025-26.
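A rough linear interpolation between the two data points above makes the squeeze concrete (illustrative only -- real aid formulas also weigh assets, home equity, and number of children enrolled):

```python
def expected_contribution(income):
    """Linear interpolation between Bloomberg's two quoted points:
    ~$30k/year at $150k income and ~$61k/year at $270k income.
    Illustrative only; actual institutional aid formulas are far more complex."""
    lo_inc, lo_pay = 150_000, 30_000
    hi_inc, hi_pay = 270_000, 61_000
    slope = (hi_pay - lo_pay) / (hi_inc - lo_inc)  # ~26 cents per extra dollar earned
    return lo_pay + slope * (income - lo_inc)

print(f"${expected_contribution(200_000):,.0f} per year at $200,000 income")
```

The implied marginal rate -- roughly a quarter of every additional dollar earned going to tuition -- is what traps families in the $100,000-$300,000 band.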
Ubuntu

Ubuntu 25.04 'Plucky Puffin' Arrives With Linux 6.14, GNOME 48, and ARM64 Desktop ISO (canonical.com)

Canonical today released Ubuntu 25.04 "Plucky Puffin," bringing significant upgrades to the non-LTS distribution including Linux kernel 6.14, GNOME 48 with triple buffering, and expanded hardware support.

For the first time, Ubuntu ships an official generic ARM64 desktop ISO targeting virtual machines and Snapdragon-based devices, with initial enablement for the Snapdragon X Elite platform. The release also adds full support for Intel Core Ultra Xe2 integrated graphics and "Battlemage" discrete GPUs, delivering improved ray tracing performance and hardware-accelerated video encoding.

Networking improvements include wpa-psk-sha256 Wi-Fi support and enhanced DNS resolution detection. The installer now better handles BitLocker-protected Windows partitions for dual-boot scenarios. Other notable changes include JPEG XL support by default, NVIDIA Dynamic Boost enabled on supported laptops, Papers replacing Evince as the default document viewer, and APT 3.0 becoming the standard package manager. Ubuntu 25.04 will receive nine months of support until January 2026.
Linux

Linus Torvalds Gently Criticizes Build-Slowing Testing Code Left in Linux 6.15-rc1 (phoronix.com)

"The big set of open-source graphics driver updates for Linux 6.15 have been merged," writes Phoronix, "but Linux creator Linus Torvalds isn't particularly happy with the pull request." The new "hdrtest" code is for the Intel Xe kernel driver and is meant to ensure the Direct Rendering Manager header files are self-contained and pass kernel-doc tests -- basic maintenance checks to confirm the included DRM header files are all in good shape.
But Torvalds accused the code of not only slowing down the full-kernel builds, but also leaving behind "random" files for dependencies "that then make the source tree nasty," reports Tom's Hardware: While Torvalds was disturbed by the code that was impacting the latest Linux kernel, beginning his post with a "Grr," he remained precise in his objections to it. "I did the pull, resolved the (trivial) conflicts, but I notice that this ended up containing the disgusting 'hdrtest' crap that (a) slows down the build because it's done for a regular allmodconfig build rather than be some simple thing that you guys can run as needed (b) also leaves random 'hdrtest' turds around in the include directories," he wrote.

Torvalds went on to state that he had previously complained about this issue, and inquired why the hdr testing is being done as a regular part of the build. Moreover, he highlighted that the resulting 'turds' were breaking filename completion. Torvalds underlined this point — and his disgust — by stating, "this thing needs to *die*." In a shot of advice to fellow Linux developers, Torvalds said, "If you want to do that hdrtest thing, do it as part of your *own* checks. Don't make everybody else see that disgusting thing...."

He then noted that he had decided to mark hdrtest as broken for now, to prevent its inclusion in regular builds.

As of Saturday, all of the DRM-Next code had made it into Linux 6.15 Git, notes Phoronix. "But Linus Torvalds is expecting all this 'hdrtest' mess to be cleaned up."
Science

A New Image File Format Efficiently Stores Invisible Light Data (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: Imagine working with special cameras that capture light your eyes can't even see -- ultraviolet rays that cause sunburn, infrared heat signatures that reveal hidden writing, or specific wavelengths that plants use for photosynthesis. Or perhaps using a special camera designed to distinguish the subtle visible differences that make paint colors appear just right under specific lighting. Scientists and engineers do this every day, and they're drowning in the resulting data. A new compression format called Spectral JPEG XL might finally solve this growing problem in scientific visualization and computer graphics. Researchers Alban Fichet and Christoph Peters of Intel Corporation detailed the format in a recent paper published in the Journal of Computer Graphics Techniques (JCGT). It tackles a serious bottleneck for industries working with these specialized images. These spectral files can contain 30, 100, or more data points per pixel, causing file sizes to balloon into multi-gigabyte territory -- making them unwieldy to store and analyze.

[...] The current standard format for storing this kind of data, OpenEXR, wasn't designed with these massive spectral requirements in mind. Even with built-in lossless compression methods like ZIP, the files remain unwieldy for practical work as these methods struggle with the large number of spectral channels. Spectral JPEG XL utilizes a technique used with human-visible images, a math trick called a discrete cosine transform (DCT), to make these massive files smaller. Instead of storing the exact light intensity at every single wavelength (which creates huge files), it transforms this information into a different form. [...]
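As a toy sketch of that energy-compaction idea (not the actual Spectral JPEG XL codec), a 1-D DCT concentrates a smooth spectrum into a handful of low-frequency coefficients, so most coefficients can be dropped with little reconstruction error. The synthetic Gaussian "spectrum" here is purely illustrative:

```python
import math

def dct(x):
    """Unnormalized DCT-II of a real sequence."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi / N * (n + 0.5) * k) for n in range(N))
            for k in range(N)]

def idct(X):
    """Inverse of dct() above (a scaled DCT-III)."""
    N = len(X)
    return [X[0] / N + (2.0 / N) * sum(X[k] * math.cos(math.pi / N * (n + 0.5) * k)
                                       for k in range(1, N))
            for n in range(N)]

# Synthetic smooth emission "spectrum" sampled at 64 wavelengths (illustrative).
N = 64
spectrum = [math.exp(-((i - 30) / 12.0) ** 2) for i in range(N)]

coeffs = dct(spectrum)
# Energy compaction: keep only the 8 lowest-frequency coefficients (8x fewer numbers).
truncated = coeffs[:8] + [0.0] * (N - 8)
approx = idct(truncated)

max_err = max(abs(a - b) for a, b in zip(spectrum, approx))
print(f"max reconstruction error with 8 of 64 coefficients: {max_err:.4f}")
```

Storing 8 numbers instead of 64 per pixel is the same lever the researchers pull across 30-100+ spectral channels, which is where the 10-60x size reductions come from.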

According to the researchers, the massive file sizes of spectral images have reportedly been a real barrier to adoption in industries that would benefit from their accuracy. Smaller files mean faster transfer times, reduced storage costs, and the ability to work with these images more interactively without specialized hardware. The results reported by the researchers seem impressive -- with their technique, spectral image files shrink by 10 to 60 times compared to standard OpenEXR lossless compression, bringing them down to sizes comparable to regular high-quality photos. They also preserve key OpenEXR features like metadata and high dynamic range support.
The report notes that broader adoption "hinges on the continued development and refinement of the software tools that handle JPEG XL encoding and decoding."

Some scientific applications may also see JPEG XL's lossy approach as a drawback. "Some researchers working with spectral data might readily accept the trade-off for the practical benefits of smaller files and faster processing," reports Ars. "Others handling particularly sensitive measurements might need to seek alternative methods of storage."
Graphics

Nvidia's GeForce RTX 5090 Laptop Graphics Benchmarks Revealed

MojoKid writes: Similar to Nvidia's recent desktop graphics launches, there are four initial GeForce RTX 50 series laptop GPUs coming to market, starting this month. At the top of the stack is the GeForce RTX 5090 laptop GPU, which is equipped with 10,496 CUDA cores and is paired with 24GB of memory. Boost clocks top out around 2,160MHz and GPU power can range from 95-150 watts, depending on the particular laptop model. GeForce RTX 50 series GPUs for both laptops and desktops feature updated shader cores with support for neural shading, in addition to 4th gen ray tracing cores and 5th gen Tensor cores with support for DLSS 4. The GeForce RTX 50 series features a native PCIe gen 5 interface, in addition to support for DisplayPort 2.1b (up to UHBR20). These GPUs are also fed by the latest high speed GDDR7 memory, which offers efficiency benefits that are pertinent to laptop designs as well. Performance-wise, NVIDIA's mobile GeForce RTX 5090 is the new king of the hill in gaming laptops, and it easily bests all other discrete mobile graphics options on the market currently.
Operating Systems

Linux Kernel 6.14 Is a Big Leap Forward In Performance, Windows Compatibility (zdnet.com)

An anonymous reader quotes a report from ZDNet, written by Steven Vaughan-Nichols: Despite the minor delay, Linux 6.14 arrives packed with cutting-edge features and improvements to power upcoming Linux distributions, such as the forthcoming Ubuntu 25.04 and Fedora 42. The big news for desktop users, especially those who like to play Windows games or run Windows programs on Linux, is the improved NTSYNC driver. This driver is designed to emulate Windows NT synchronization primitives. What that feature means for you and me is that it will significantly improve the performance of Windows programs running on Wine and Steam Play. [...] Gamers always want the best possible graphics performance, so they'll also be happy to see that Linux now supports recently launched AMD RDNA 4 graphics cards. This includes support for the AMD Radeon RX 9070 XT and RX 9070 graphics cards. Combine this support with the recently improved open-source RADV driver and AMD gamers should see the best speed yet on their gaming rigs.

Of course, the release is not just for gamers. Linux 6.14 also includes several AMD and Intel processor enhancements. These boosts focus on power management, thermal control, and compute performance optimizations. These updates are expected to improve overall system efficiency and performance. This release also comes with the AMDXDNA driver, which provides official support for AMD's neural processing units based on the XDNA architecture. This integration enables efficient execution of AI workloads, such as convolutional neural networks and large language models, directly on supported AMD hardware. While Rust has faced some difficulties in recent months in Linux, more Rust programming language abstractions have been integrated into the kernel, laying the groundwork for future drivers written in Rust. [...] Besides drivers, Miguel Ojeda, Rust for Linux's lead developer, said recently that the introduction of the macro for smart pointers with Rust 1.84: derive(CoercePointee) is an "important milestone on the way to building a kernel that only uses stable Rust functions." This approach will also make integrating C and Rust code easier. We're getting much closer to Rust being grafted into Linux's tree.

In addition, Linux 6.14 supports Qualcomm's latest Snapdragon 8 Elite mobile processor, enhancing performance and stability for devices powered by this chipset. That support means you can expect to see much faster Android-based smartphones later this year. This release includes a patch for the so-called GhostWrite vulnerability, which can be used to root some RISC-V processors. This fix will block such attacks. Additionally, Linux 6.14 includes improvements for the copy-on-write Btrfs file system/logical volume manager. These are primarily read-balancing methods that offer flexibility for different RAID hardware configurations and workloads. And support for uncached buffered I/O optimizes memory usage on systems with fast storage devices.
Linux 6.14 is available for download here.
AMD

Lisa Su Says Radeon RX 9000 Series Is AMD's Most Successful GPU Launch Ever (techspot.com)

"In a conversation with Tony Yu from Asus China, AMD CEO Lisa Su shared that the Radeon RX 9000 series graphics cards have quickly become a huge hit, breaking records as AMD's top-selling GPUs within just a week of release," writes Slashdot reader jjslash. TechSpot reports: AMD CEO Lisa Su has confirmed that the company's new Radeon RX 9000 graphics cards have been a massive success, selling 10 times more units than their predecessors in just one week on the market. Su also stated that more RDNA 4 cards are on the way, but did not confirm whether the lineup will include the rumored Radeon RX 9060. When asked about the limited availability of the new cards, Su said that AMD is ramping up production to ensure greater supply at retailers worldwide. She also expressed hope that increased availability would help stabilize pricing by discouraging scalping and price gouging.
Open Source

Developer Loads Steam On a $100 ARM Single Board Computer (interfacinglinux.com)

"There's no shortage of videos showing Steam running on expensive ARM single-board computers with discrete GPUs," writes Slashdot reader VennStone. "So I thought it would be worthwhile to make a guide for doing it on (relatively) inexpensive RK3588-powered single-board computers, using Box86/64 and Armbian." The guides I came across were out of date, had a bunch of extra steps thrown in, or were outright incorrect... Up first, we need to add the Box86 and Box64 ARM repositories [along with dependencies, ARMHF architecture, and the Mesa graphics driver]...
The guide closes with a multi-line script and advice to "Just close your eyes and run this. It's not pretty, but it will download the Steam Debian package, extract the needed bits, and set up a launch script." (And then the final step is sudo reboot now.)

"At this point, all you have to do is open a terminal, type 'steam', and tap Enter. You'll have about five minutes to wait... Check out the video to see how some of the tested games perform." At 720p, performance is all over the place, but the games I tested typically managed to stay above 30 FPS. This is better than I was expecting from a four-year-old SOC emulating x86 titles under ARM.

Is this a practical way to play your Steam games? Nope, not even a little bit. For now, this is merely an exercise in ludicrous neatness. Things might get a wee bit better, considering Collabora is working on upstream support for RK3588 and Valve is up to something ARM-related, but ya know, "Valve Time"...

"You might be tempted to enable Steam Play for your Windows games, but don't waste your time. I mean, you can try, but it ain't gonna work."
IT

Nvidia Sells RTX GPUs From a 'Food Truck' (pcworld.com)

Nvidia is selling its scarce RTX 5080 and 5090 graphics cards from a pop-up "food truck" at its GPU Technology Conference, where attendees paying over $1,000 for tickets can purchase the coveted hardware alongside merchandise. The company has only 2,000 cards available (1,000 each of RTX 5080 and 5090), released in small batches at random times during the three-day conference which concludes tomorrow.
GNOME

GNOME 48 Released (9to5linux.com)

prisoninmate writes: GNOME 48 desktop environment has been released after six months of development with major new features that have been expected for more than four years, such as dynamic triple buffering, HDR support, and much more. 9to5Linux reports:

"Highlights of GNOME 48 include dynamic triple buffering to boost the performance on low-end GPUs, such as Intel integrated graphics or Raspberry Pi computers, Wayland color management protocol support, new Adwaita fonts, HDR (High Dynamic Range) support, and a new Wellbeing feature with screen time tracking.

"GNOME 48 also introduces a new GNOME Display Control (gdctl) utility to view the active monitor configuration and set new monitor configuration using command line arguments, implements a11y keyboard monitoring support, adds output luminance settings, and it now centers new windows by default."

Transportation

GM Taps Nvidia To Boost Its Self-Driving Projects

General Motors is partnering with Nvidia to enhance its self-driving and manufacturing capabilities by leveraging Nvidia's AI chips, software, and simulation tools. "GM says it will apply several of Nvidia's products to its business, such as the Omniverse 3D graphics platform which will run simulations on virtual assembly lines with an eye on reducing downtime and improving efficiency," reports The Verge. "The automaker also plans to equip its next-generation vehicles with Nvidia's 'AI brain' for advanced driver assistance and autonomous driving. And it will employ the chipmaker's AI training software to make its vehicle assembly line robots better at certain tasks, like precision welding and material handling." From the report: GM already uses Nvidia's GPUs to train its AI software for simulation and validation. Today's announcement was about expanding those use cases into improving its manufacturing operations and autonomous vehicles, GM CEO Mary Barra said in a statement. (Dave Richardson, GM's senior VP of Software and Services Engineering, will be joining NVIDIA's Norm Marks for a fireside chat at the conference.) "AI not only optimizes manufacturing processes and accelerates virtual testing but also helps us build smarter vehicles while empowering our workforce to focus on craftsmanship," Barra said. "By merging technology with human ingenuity, we unlock new levels of innovation in vehicle manufacturing and beyond."

GM will adopt Nvidia's in-car software products to build next-gen vehicles with autonomous driving capabilities. That includes the company's Drive AGX system-on-a-chip (SoC), similar to Tesla's Full Self-Driving chip or Intel's Mobileye EyeQ. The SoC runs the "safety-certified" DriveOS operating system, built on the Blackwell GPU architecture, which is capable of delivering 1,000 trillion operations per second (TOPS) of high-performance compute, the company says. [...] In a briefing with reporters, Ali Kani, Nvidia's vice president and general manager of automotive, described the chipmaking company's automotive business as still in its "infancy," with the expectation that it will only bring in $5 billion this year. (Nvidia reported over $130 billion in revenue in 2024 for all its divisions.)

Nvidia's chips are in less than 1 percent of the billions of cars on the road today, he added. But the future looks promising. The company is also announcing deals with Tier 1 auto supplier Magna, which helped build Sony's Afeela concept, to use Drive AGX in the company's next-generation advanced driver assist software. "We believe automotive is a trillion dollar opportunity for Nvidia," Kani said.
Graphics

GIMP 3.0 Released (9to5linux.com)

GIMP 3.0 has been released after over a decade of development. Highlights include a refined GTK3 interface with scroll wheel tab navigation, a new splash screen, improved HiDPI icon support, enhanced color management, a stable public API, and support for more file formats. 9to5Linux reports: GIMP 3.0 also brings improvements to non-destructive editing by introducing an optional "Merge Filters" checkbox at the bottom of NDE filters that merges down the filter immediately after it's committed, along with non-destructive filters on layer groups and support for storing versions of filters in GIMP's XCF project files. Among other noteworthy changes, the GEGL and babl components have been updated with new features and many improvements, such as Inner Glow, Bevel, and GEGL Styles filters, some plugins saw small enhancements, and it's now possible to export images with different settings while leaving the original image unchanged.

There's also a new PDB call that allows Script-Fu writers to use labels to specify filter properties, a brand new named-argument syntax, support for loading 16-bits-per-channel LAB PSD files, support for loading DDS images with BC7 support, early-binding CMYK support, and support for PSB and JPEG-XL image formats. On top of that, GIMP 3.0 introduces new auto-expanding layer boundary and snapping options, an updated search pop-up to show the menu path for all entries while making individual filters searchable, a revamped alignment tool, and support for "layer sets," replacing the older concept of linked layers.
You can download GIMP 3.0 from the official website.
Open Source

Startup Claims Its Upcoming (RISC-V ISA) Zeus GPU is 10X Faster Than Nvidia's RTX 5090 (tomshardware.com)

"The number of discrete GPU developers from the U.S. and Western Europe shrank to three companies in 2025," notes Tom's Hardware, "from around 10 in 2000." (Nvidia, AMD, and Intel...) No company in recent years -- at least outside of China -- was bold enough to engage in competition against these three contenders, so the very emergence of Bolt Graphics seems like a breakthrough. However, the major focuses of Bolt's Zeus are high-quality rendering for the movie and scientific industries as well as high-performance supercomputer simulations. If Zeus delivers on its promises, it could establish itself as a serious alternative for scientific computing, path tracing, and offline rendering. But without strong software support, it risks struggling against dominant market leaders.
This week the Sunnyvale, California-based startup introduced its Zeus GPU platform designed for gaming, rendering, and supercomputer simulations, according to the article. "The company says that its Zeus GPU not only supports features like upgradeable memory and built-in Ethernet interfaces, but it can also beat Nvidia's GeForce RTX 5090 by around 10 times in path tracing workloads, according to a slide published by technology news site ServeTheHome." There is one catch: Zeus can only beat the RTX 5090 GPU in path tracing and FP64 compute workloads. It's not clear how well it will handle traditional rendering techniques, as that was less of a focus. In speaking with Bolt Graphics, the card does support rasterization, but there was less emphasis on that aspect of the GPU, and it may struggle to compete with the best graphics cards when it comes to gaming. And when it comes to data center options like Nvidia's Blackwell B200, it's an entirely different matter.

Unlike GPUs from AMD, Intel, and Nvidia that rely on proprietary instruction set architectures, Bolt's Zeus relies on the open-source RISC-V ISA, according to the published slides. The Zeus core relies on an open-source out-of-order general-purpose RVA23 scalar core mated with FP64 ALUs and the RVV 1.0 (RISC-V Vector Extension Version 1.0) that can handle 8-bit, 16-bit, 32-bit, and 64-bit data types as well as Bolt's additional proprietary extensions designed for acceleration of scientific workloads... Like many processors these days, Zeus relies on a multi-chiplet design... Unlike high-end GPUs that prioritize bandwidth, Bolt is evidently focusing on greater memory size to handle larger datasets for rendering and simulations. Also, built-in 400GbE and 800GbE ports to enable faster data transfer across networked GPUs indicates the data center focus of Zeus.

High-quality rendering, real-time path tracing, and compute are key focus areas for Zeus. As a result, even the entry-level Zeus 1c26-32 offers significantly higher FP64 compute performance than Nvidia's GeForce RTX 5090 — up to 5 TFLOPS vs. 1.6 TFLOPS — and considerably higher path tracing performance: 77 Gigarays vs. 32 Gigarays. Zeus also features a larger on-chip cache than Nvidia's flagship — up to 128MB vs. 96MB — and lower power consumption of 120W vs. 575W, making it more efficient for simulations, path tracing, and offline rendering. However, the RTX 5090 dominates in AI workloads with its 105 FP16 TFLOPS and 1,637 INT8 TFLOPS compared to the 10 FP16 TFLOPS and 614 INT8 TFLOPS offered by a single-chiplet Zeus...
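Dividing the quoted figures gives a rough sense of the claimed FP64 efficiency gap (vendor-supplied numbers for a single-chiplet Zeus versus the RTX 5090, so treat with appropriate skepticism):

```python
# Vendor-quoted figures from the comparison above.
zeus_fp64_tflops, zeus_watts = 5.0, 120   # entry-level Zeus 1c26-32
rtx_fp64_tflops, rtx_watts = 1.6, 575     # GeForce RTX 5090

zeus_eff = zeus_fp64_tflops / zeus_watts  # TFLOPS per watt
rtx_eff = rtx_fp64_tflops / rtx_watts

print(f"Zeus: {zeus_eff * 1000:.1f} GFLOPS/W, RTX 5090: {rtx_eff * 1000:.1f} GFLOPS/W")
print(f"claimed FP64 efficiency advantage: ~{zeus_eff / rtx_eff:.0f}x")
```

By the same arithmetic applied to the AI figures, the advantage flips hard the other way: the RTX 5090's quoted FP16 throughput is more than ten times that of a single-chiplet Zeus.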

The article emphasizes that Zeus "is only running in simulation right now... Bolt Graphics says that the first developer kits will be available in late 2025, with full production set for late 2026."

Thanks to long-time Slashdot reader arvn for sharing the news.
Microsoft

Microsoft Admits GitHub Hosted Malware That Infected Almost a Million Devices (theregister.com)

Microsoft has spotted a malvertising campaign that downloaded nastyware hosted on GitHub and exposed nearly a million devices to information thieves. From a report: Discovered by Microsoft Threat Intelligence late last year, the campaign saw pirate vid-streaming websites embed malvertising redirectors to generate pay-per-view or pay-per-click revenue from malvertising platforms. "These redirectors subsequently routed traffic through one or two additional malicious redirectors, ultimately leading to another website, such as a malware or tech support scam website, which then redirected to GitHub" according to Microsoft's threat research team.

GitHub hosted a first-stage payload that installed code that dropped two other payloads. One gathered system configuration info such as data on memory size, graphics capabilities, screen resolution, the operating system present, and user paths. Third-stage payloads varied but most "conducted additional malicious activities such as command and control (C2) to download additional files and to exfiltrate data, as well as defense evasion techniques."

Apple

Apple Unveils iPad Air With M3 Chip (apple.com)

Apple today announced a significant update to its iPad Air lineup, integrating the M3 chip previously reserved for higher-end devices. The new tablets, available in both 11-inch ($599) and 13-inch ($799) configurations, deliver substantial performance gains: nearly 2x faster than M1-equipped models and 3.5x faster than A14 Bionic versions.

The M3 brings Apple's advanced graphics architecture to the Air for the first time, featuring dynamic caching, hardware-accelerated mesh shading, and ray tracing. The chip includes an 8-core CPU delivering 35% faster multithreaded performance over M1, paired with a 9-core GPU offering 40% faster graphics. The Neural Engine processes AI workloads 60% faster than M1, the company said. Apple also introduced a redesigned Magic Keyboard ($269/$319) with function row and larger trackpad.
DRM

'Why Can't We Screenshot Frames From DRM-Protected Video on Apple Devices?' (daringfireball.net)

Apple users noticed a change in 2023, "when streaming platforms like Netflix, HBO Max, Amazon Prime, and the Criterion Channel imposed a quiet embargo on the screenshot," noted the film blog Screen Slate: At first, there were workarounds: users could continue to screenshot by using the browser Brave or by downloading extensions or third-party tools like Fireshot. But gradually, the digital-rights-management tech adapted and became more sophisticated. Today, it is nearly impossible to take a screenshot from the most popular streaming services, at least not on a Macintosh computer. The shift occurred without remark or notice to subscribers, and there's no clear explanation as to why or what spurred the change...

For PC users, this story takes a different, and happier, turn. With the use of the Snipping Tool, a utility exclusive to Microsoft Windows, users are free to screen grab content from all streaming platforms. This seems like a pointed oversight, a choice on the part of streamers to exclude Mac users (though they make up a tiny fraction of the market) because of their assumed cultural class.

"I'm not entirely sure what the technical answer to this is," tech blogger John Gruber wrote this weekend, "but on MacOS, it seemingly involves the GPU and video decoding hardware..." These DRM blackouts on Apple devices (you can't capture screenshots from DRM video on iPhones or iPads either) are enabled through the deep integration between the OS and the hardware, thus enabling the blackouts to be imposed at the hardware level. And I don't think the streaming services opt into this screenshot prohibition other than by "protecting" their video with DRM in the first place. If a video is DRM-protected, you can't screenshot it; if it's not, you can.

On the Mac, it used to be the case that DRM video was blacked-out from screen capture in Safari, but not in Chrome (or the dozens of various Chromium-derived browsers). But at some point a few years back, you stopped being able to capture screenshots from DRM videos in Chrome, too -- by default. But in Chrome's Settings page, under System, if you disable "Use graphics acceleration when available" and relaunch Chrome, boom, you can screenshot everything in a Chrome window, including DRM video...

What I don't understand is why Apple bothered supporting this in the first place for hardware-accelerated video (which is all video on iOS platforms -- there is no workaround like using Chrome with hardware acceleration disabled on iPhone or iPad). No one is going to create bootleg copies of DRM-protected video one screenshotted still frame at a time -- and even if they tried, they'd be capturing only the images, not the sound. And it's not like this "feature" in MacOS and iOS has put an end to bootlegging DRM-protected video content.

Gruber's conclusion? "This 'feature' accomplishes nothing of value for anyone, including the streaming services, but imposes a massive (and for most people, confusing and frustrating) hindrance on honest people simply trying to easily capture high-quality (as opposed to, say, using their damn phone to take a photograph of their reflective laptop display) screenshots of the shows and movies they're watching."
Movies

Blender-Rendered Movie 'Flow' Wins Oscar for Best Animated Feature, Beating Pixar (blender.org) 72

It's a feature-length film "rendered on a free and open-source software platform called Blender," reports Reuters. And it just won the Oscar for best animated feature film, beating movies from major studios like Disney/Pixar and Dreamworks.

In January Blender.org called Flow "the manifestation of Blender's mission, where a small, independent team with a limited budget is able to create a story that moves audiences worldwide, and achieve recognition with over 60 awards, including a Golden Globe for Best Animation and two Oscar nominations." The entire project cost just $3.7 million, reports NPR — though writer/director Gints Zilbalodis tells Blender.org that it took about five and a half years.

"I think a certain level of naivety is necessary when starting a project," Zilbalodis tells Blender. "If I had known how difficult it would be, I might never have started. But because I didn't fully grasp the challenges ahead, I just dove in and figured things out along the way..." Zilbalodis: [A]fter making a few shorts, I realized that I'm not good at drawing, and I switched to 3D because I could model things, and move the camera... After finishing my first feature Away, I decided to switch to Blender [from Maya] in 2019, mainly because of EEVEE... It took a while to learn some of the stuff, but it was actually pretty straightforward. Many of the animators in Flow took less than a week to switch to Blender...

I've never worked in a big studio, so I don't really know exactly how they operate. But I think that if you're working on a smaller indie-scale project, you shouldn't try to copy what big studios do. Instead, you should develop a workflow that best suits you and your smaller team.

You can get a glimpse of their animation style in Flow's official trailer.

NPR says that ultimately Flow's images "possess a kinetic elegance. They have the alluring immersiveness of a video game..."
