Programming

How Python Now Manages Its Evolution (techradar.com) 62

For roughly a year and a half software engineer Pablo Galindo has been one of five members on the Python Steering Council, which took the reins when language creator Guido van Rossum stepped down. "The Python Steering Council attempts to reflect the decisions of the community, weighing up all the advantages and disadvantages [of each proposal]," Galindo explains in TechRadar's look at how the language now manages its evolution. (Alternate URL here.)

"Our responsibility is to make sure everyone is represented in a decision. It's not about what we think personally, it's about the community mind." So while static typing would've benefited one specific sub-community, the article argues, the necessary changes were ultimately "deemed by the council to have an overall detrimental effect," the article points out, "and were therefore rejected." Given the popularity of Python and size of the application base, the Steering Council has to exercise considerable caution when deciding upon changes to the language. Broadly, the goal is to improve the level of performance and range of functionality in line with the demands of the community, but doing so is rarely straightforward. "There is an important distinction between making a new language fast, versus increasing the performance of a 30-year-old language without breaking the code," noted Galindo. "That is extremely difficult; I cannot tell you how difficult it is."

"There are a number of industry techniques that everyone uses [to improve performance], but Python is incompatible with these methods. Instead, we have to develop entirely new techniques to achieve only similarly good results."

Separately, the team has to worry about the knock-on effects of a poorly-implemented change, of which there could be many. As an example, Galindo gestured towards the impact of a drop-off in language performance on energy usage (and therefore carbon emissions). "When you make changes in the language, it can be daunting," he said. "How many CPU cycles will I cost the planet with a mistake...?"

Despite the various headwinds, the Python Steering Council has lofty ambitions for the language, with the next major release (version 3.11) set to go live in October. Apparently, speed is the first item on the agenda. Galindo told us the aim is to improve performance by up to 60% (depending on the workload) with Python 3.11 and again with version 3.12. In the longer term, meanwhile, the goal is to make the language between two and five times faster within the next decade.
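Since the projected gains are workload-dependent, the only number that really matters is one measured on your own code. As a minimal sketch (not from the article), the script below uses the standard-library timeit module to time a small interpreter-bound workload; running the same file under, say, python3.10 and python3.11 and comparing the best times shows what the upgrade buys for that particular code.

```python
# Minimal, illustrative benchmark (not from the article): run this file under
# two interpreter versions and compare the reported best times.
import sys
import timeit

def workload():
    # Stand-in, interpreter-bound workload: pure-Python arithmetic in a loop,
    # the kind of code the CPython 3.11 speedups target most directly.
    total = 0
    for i in range(100_000):
        total += i * i
    return total

if __name__ == "__main__":
    best = min(timeit.repeat(workload, number=100, repeat=5))
    print(f"{sys.version.split()[0]}: best of 5 runs = {best:.3f}s")
```

Workloads dominated by C extensions or I/O will naturally see smaller gains, which is presumably why the figure is hedged as workload-dependent.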

The council will also continue to focus on improving the quality of error messages generated by the Python Interpreter in an effort to make debugging much simpler, a pet project of Galindo's and a major focus during his time on the council.
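To picture the kind of improvement meant here: since Python 3.11 (PEP 657), tracebacks underline the exact expression that failed rather than just naming the offending line. A hypothetical example, not taken from the article:

```python
# Hypothetical snippet illustrating Python 3.11's fine-grained error
# locations (PEP 657); the exact traceback formatting may vary.
config = {"server": None}
print(config["server"]["port"])

# Under Python 3.11 the traceback points at the failing subscript itself,
# roughly like this:
#
#   Traceback (most recent call last):
#     File "example.py", line 4, in <module>
#       print(config["server"]["port"])
#             ~~~~~~~~~~~~~~~~^^^^^^^^
#   TypeError: 'NoneType' object is not subscriptable
```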

Google

Google Will Let European Developers Use Their Own Billing Systems (theverge.com) 19

Google will start allowing the developers of non-gaming apps in the European Economic Area (EEA) to offer alternate payment systems. In a blog post, Google outlines its plans to comply with the Digital Markets Act (or DMA), a piece of legislation aimed at regulating big tech. From a report: The DMA passed through the European Parliament earlier this month, but it isn't expected to come into force until spring 2023. Google, however, is rolling out the changes ahead of time to make sure that its plans "serve the needs" of users.

The legislation requires "gatekeepers," or companies with a market capitalization of $75.8 billion or over, to follow a set of rules meant to promote competition among digital platforms. Failing to comply could lead to fines of up to 10 percent of a firm's global revenue or 20 percent in case of repeat offenses. Android developers who choose to use an alternate payment processor will still have to pay Google a service fee for each transaction on the first $1 million they make within one year. However, Google says it will reduce this fee by 3 percent, meaning the company will take a 12 percent or lower cut from every transaction. If developers make more than $1 million in one year, Google will charge developers a 27 percent fee on transactions (3 percent less than the standard 30 percent).

GNU is Not Unix

GCC Rust Approved by Steering Committee, Beta Likely Next April (phoronix.com) 51

Phoronix reports: The GCC Steering Committee has approved the GCC Rust front-end, which provides Rust programming language support in the GNU Compiler Collection. This Rust front-end will likely be merged ahead of the GCC 13 release next year.

The GCC Steering Committee this morning announced that the Rust front-end "GCC Rust" is appropriate for inclusion into the GCC mainline code-base. This is the effort that has been in the works for a while as an alternative to Rust's official LLVM-based compiler. GCC Rust is still under active development but is getting into shape for mainlining.

The hope is to have at least "beta" level support for the Rust programming language in GCC 13, which will be released as stable around April of next year.

Programming

Ask Slashdot: Does WebAssembly Increase Your Web Browser's Attack Surface? (github.com) 104

Steve Springett is a conscientious senior security architect. And in 2018, he published an essay on GitHub arguing that from a security engineer's perspective, WebAssembly "increases the attack surface of any browser that supports it."

Springett wrote that WebAssembly modules are sent in (unsigned) binary format — without a transport-layer security mechanism — and rely on browser sandboxing for safety. But the binary format makes it harder to analyze the code, while sandboxing "is prone to breakouts and effectiveness varies largely by implementation. Adobe Flash is an example of a technology that was sandboxed after a series of exploits, yet exploits and breakouts still occurred." Springett even went so far as to offer the commands for switching off WebAssembly in your browser.

Now Tablizer (Slashdot reader #95,088) wants to know what other Slashdot readers think of Springett's security concerns around WebAssembly.

Tablizer also offers this suggestion to browser makers: Browsers should have a way to easily disable WebAssembly — including whitelisting. For example, if you need it for a specific gaming site, you can whitelist just that site and not have WASM exposed for other sites.
Programming

Top Languages for WebAssembly Development: Rust, C++, Blazor, Go - and JavaScript? (visualstudiomagazine.com) 49

This year's "State of WebAssembly" report has been published by Colin Eberhardt (CTO at the U.K.-based software consultancy Scott Logic). Hundreds of people were surveyed for the report, notes this article by Visual Studio Magazine.

Published by B2B media company 1105 Media, the magazine notes that Eberhardt's survey included some good news for Rust — and for Microsoft's free open source framework Blazor (for building web apps using C# and HTML): This year, like last year, Rust was found to be the most frequently used and most desired programming language for WebAssembly development.... "Rust once again comes out on top, with 45 percent saying they use it frequently or sometimes," Eberhardt said. "WebAssembly and Rust do have quite a close relationship, most WebAssembly runtimes are written in Rust, as are the various platforms based on wasm. It also enjoys some of the best tooling, so this result doesn't come as a big surprise."

While Rust usage and desirability have continued to climb, the Blazor web-dev framework is coming on strong in the report, which treats Blazor as a programming language, though it's not. On that desirability scale, Blazor climbed from sixth spot in 2021 to fourth this year among seven "programming languages" [based on] percentage of respondents who use a given language 'frequently,' or 'sometimes' [for WebAssembly development] compared to last year. Eberhardt said, "Rust has had a modest rise in desirability, but the biggest climber is Blazor, with Go following just behind."

Commenting on another graphic that shows which language people most want to use for WebAssembly development, Eberhardt said, "This shows that Rust usage has climbed steadily, but the biggest climbers are Blazor and Python."

While you can now compile to WebAssembly from a variety of languages (including C, C#, and C++), the report also found that JavaScript has somehow become a viable WebAssembly language — sort of, even though JavaScript itself can't be compiled to WebAssembly... There's a cunning workaround for this challenge; rather than compiling JS to Wasm, you can instead compile a JavaScript engine to WebAssembly then use that to execute your code.

This is actually much more practical than you might think.

The Military

DARPA Is Worried About How Well Open-Source Code Can Be Trusted (technologyreview.com) 85

An anonymous reader quotes a report from MIT Technology Review: "People are realizing now: wait a minute, literally everything we do is underpinned by Linux," says Dave Aitel, a cybersecurity researcher and former NSA computer security scientist. "This is a core technology to our society. Not understanding kernel security means we can't secure critical infrastructure." Now DARPA, the US military's research arm, wants to understand the collision of code and community that makes these open-source projects work, in order to better understand the risks they face. The goal is to be able to effectively recognize malicious actors and prevent them from disrupting or corrupting crucially important open-source code before it's too late. DARPA's "SocialCyber" program is an 18-month-long, multimillion-dollar project that will combine sociology with recent technological advances in artificial intelligence to map, understand, and protect these massive open-source communities and the code they create. It's different from most previous research because it combines automated analysis of both the code and the social dimensions of open-source software.

Here's how the SocialCyber program works. DARPA has contracted with multiple teams of what it calls "performers," including small, boutique cybersecurity research shops with deep technical chops. One such performer is New York-based Margin Research, which has put together a team of well-respected researchers for the task. Margin Research is focused on the Linux kernel in part because it's so big and critical that succeeding here, at this scale, means you can make it anywhere else. The plan is to analyze both the code and the community in order to visualize and finally understand the whole ecosystem.

Margin's work maps out who is working on what specific parts of open-source projects. For example, Huawei is currently the biggest contributor to the Linux kernel. Another contributor works for Positive Technologies, a Russian cybersecurity firm that -- like Huawei -- has been sanctioned by the US government, says Aitel. Margin has also mapped code written by NSA employees, many of whom participate in different open-source projects. "This subject kills me," says d'Antoine of the quest to better understand the open-source movement, "because, honestly, even the most simple things seem so novel to so many important people. The government is only just realizing that our critical infrastructure is running code that could be literally being written by sanctioned entities. Right now." This kind of research also aims to find underinvestment -- that is, critical software run entirely by one or two volunteers. It's more common than you might think -- so common that one way software projects currently measure risk is the "bus factor": Does this whole project fall apart if just one person gets hit by a bus?
SocialCyber will also tackle other open-source projects, such as Python, which is "used in a huge number of artificial-intelligence and machine-learning projects," notes the report. "The hope is that greater understanding will make it easier to prevent a future disaster, whether it's caused by malicious activity or not."
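As a back-of-the-envelope illustration of the "bus factor" idea mentioned above (this is not part of SocialCyber or Margin's tooling), a few lines of Python over a local git checkout can estimate how few authors account for most of a project's commits:

```python
# Crude, illustrative bus-factor estimate: how many of the most active
# authors does it take to cover half of all commits in a git repository?
import subprocess
from collections import Counter

def bus_factor(repo_path: str = ".", threshold: float = 0.5) -> int:
    # One author email per commit on the current branch.
    emails = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=%ae"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    counts = Counter(emails)
    total = sum(counts.values())
    covered = 0
    for rank, (_, n) in enumerate(counts.most_common(), start=1):
        covered += n
        if covered / total >= threshold:
            return rank
    return len(counts)

if __name__ == "__main__":
    print("approximate bus factor:", bus_factor())
```

A result of one or two is exactly the kind of underinvestment the report describes.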
Programming

Hundreds of Tech, Business and Nonprofit Leaders Urge States To Boost CS Education 49

theodp writes: In partnership with tech-bankrolled nonprofit Code.org, over 500 of the nation's business, education and nonprofit leaders issued a letter calling for state governments and education leaders to bring more Computer Science to K-12 students across the U.S. The signatories include a who's who of tech leaders, including Bill Gates, Jeff Bezos, Satya Nadella, Steve Ballmer, Tim Cook, Sundar Pichai, and Mark Zuckerberg. A new website -- CEOs for CS -- was launched in conjunction with the campaign. "The United States leads the world in technology, yet only 5% of our high school students study computer science. How is this acceptable?" the CEOs demand to know in their letter addressed "To the Governors and Education Leaders of the United States of America." They add, "Nearly two-thirds of high-skilled immigration is for computer scientists, and every state is an importer of this strategic talent. The USA has over 700,000 open computing jobs but only 80,000 computer science graduates a year. We must educate American students as a matter of national competitiveness."

A press release explains that the announcement "coincides with the culmination of the National Governors Association Chairman's Initiative for K-12 computer science, led by Arkansas Gov. Asa Hutchinson." Hutchinson is a founding Governor of the Code.org-led advocacy group Govs for CS, which launched in anticipation of President Obama's tech-supported but never materialized $4 billion CS for All initiative. Hutchinson was a signatory of an earlier 2016 Code.org organized letter from Governors, business, education, and nonprofit leaders that implored Congress to make CS education for K-12 students a priority.
Security

PyPI Is Rolling Out 2FA For Critical Projects, Giving Away 4,000 Security Keys (zdnet.com) 19

PyPI, or the Python Package Index, is giving away 4,000 Google Titan security keys as part of its move to mandatory two-factor authentication (2FA) for critical projects built in the Python programming language. ZDNet reports: PyPI, which is managed by the Python Software Foundation, is the main repository where Python developers can get third-party developed open-source packages for their projects. [...] One way developers can protect themselves from stolen credentials is by using two-factor authentication, and the PSF is now making it mandatory for developers behind "critical projects" to use 2FA in coming months. PyPI hasn't declared a specific date for the requirement. "We've begun rolling out a 2FA requirement: soon, maintainers of critical projects must have 2FA enabled to publish, update, or modify them," the PSF said on its PyPI Twitter account.

As part of the security drive, it is giving away 4,000 Google Titan hardware security keys, gifted by Google's open source security team, to project maintainers. "In order to improve the general security of the Python ecosystem, PyPI has begun implementing a two-factor authentication (2FA) requirement for critical projects. This requirement will go into effect in the coming months," PSF said in a statement. "To ensure that maintainers of critical projects have the ability to implement strong 2FA with security keys, the Google Open Source Security Team, a sponsor of the Python Software Foundation, has provided a limited number of security keys to distribute to critical project maintainers."

PSF says it deems any project in the top 1% of downloads over the prior six months as critical. Presently, there are more than 350,000 projects on PyPI, meaning that more than 3,500 projects are rated as critical. PyPI calculates this on a daily basis, so the Titan giveaway should go a long way toward covering key maintainers, though not all of them. In the name of transparency, PyPI is also publishing 2FA account metrics here. There are currently 28,336 users with 2FA enabled, with nearly 27,000 of them using a 2FA app like Microsoft Authenticator. There are over 3,800 projects rated as "critical" and 8,241 PyPI users in this group. The critical group is also likely to grow, since projects that have been designated as critical remain so indefinitely while new projects are added to mandatory 2FA over time. The 2FA rule applies to both project maintainers and owners.
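The "top 1% of downloads over the prior six months" rule is simple enough to sketch. The snippet below is not PyPI's actual implementation, and the project names are made up; it just shows the arithmetic that turns roughly 350,000 projects into roughly 3,500 critical ones.

```python
# Illustrative sketch of the critical-project rule described above: rank
# projects by six-month downloads and keep the top 1% (at least one).
def critical_projects(downloads: dict[str, int], fraction: float = 0.01) -> set[str]:
    ranked = sorted(downloads, key=downloads.get, reverse=True)
    cutoff = max(1, int(len(ranked) * fraction))
    return set(ranked[:cutoff])

# Toy data with made-up names and numbers.
print(critical_projects({"requests-ish": 9_000_000, "left-padder": 120, "tinytool": 45}))
# -> {'requests-ish'}
```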

Programming

Meet Bun, a Speedy New JavaScript Runtime (bun.sh) 121

Bun is "a modern JavaScript runtime like Node or Deno," according to its newly-launched web site, "built from scratch to focus on three main things."

- Start fast (it has the edge in mind).
- New levels of performance (extending JavaScriptCore, the engine).
- Being a great and complete tool (bundler, transpiler, package manager).

Bun is designed as a drop-in replacement for your current JavaScript & TypeScript apps or scripts — on your local computer, server or on the edge. Bun natively implements hundreds of Node.js and Web APIs, including ~90% of Node-API functions (native modules), fs, path, Buffer and more. [And Bun also implements Node.js' module resolution algorithm, so you can use npm packages in bun.js]

The goal of Bun is to run most of the world's JavaScript outside of browsers, bringing performance and complexity enhancements to your future infrastructure, as well as developer productivity through better, simpler tooling.... Why is Bun fast? An enormous amount of time spent profiling, benchmarking and optimizing things. The answer is different for every part of Bun, but one general theme: [it's written in Zig.] Zig's low-level control over memory and lack of hidden control flow makes it much simpler to write fast software.

An infographic on the site claims its server-side rendering of React is more than three times faster than Node or Deno. And Bun.js can even automatically load environment variables from .env files, according to the site. No more require("dotenv").load()
Hackaday describes it as "a performant all-in-one approach," including "bundling, transpiling, module resolution, and a fantastic foreign-function interface." Many Javascript projects have a bundling and transpiling step that takes the source and packages it together in a more standard format. Typescript needs to be packaged into javascript, and modules need to be resolved. Bun bakes all this in. Typescript and JSX "just work." This dramatically simplifies many projects as much of the build infrastructure is part of Bun itself, lowering cognitive load when trying to understand a project... Some web-specific APIs, such as fetch and Websockets, are also built-in.
"What's even wilder is that Bun is written by one person, Jared Sumner," the article points out — adding that the all the code is available on GitHub under the MIT License ("excluding dependencies which have various licenses.")
Crime

What Happened to the Teen Who Stole $23.8M in Cryptocurrency? (rollingstone.com) 67

15-year-old Ellis Pinsky stole $23.8 million worth of cryptocurrency — and his life was never the same. For example, Rolling Stone reports, in his last year of high school, "Four men wearing ski masks and gloves, armed with knives, rope, brass knuckles, and a fake 9 mm," crept around the back of his home in the suburbs: Two weeks before the break-in, a lawsuit had been filed against him, and news stories had circulated connecting him to the hack. He knew that the thieves wanted this money, the millions and millions of dollars he had stolen. He also knew that he couldn't give it to them. He didn't have it. Not anymore.
The magazine paints the portrait of "an anxious young man in Invisalign braces" who describes the revelation he'd had at the age of 13. "The internet held such secrets. All he had to do was uncover them." As he soon found, there were plenty of people working to uncover them all the time, and willing to share their methods — for a price.... Realizing that a lot of the information social engineers used came from hacked databases, he began teaching himself to program, particularly to do the Structured Query Language injections and cross-site scripting that allowed him to attack companies' database architecture. The terabyte upon terabyte of databases he extracted, traded, and hoarded made him valuable to OGUsers as well as to others, like the Russian hackers he was able to converse with thanks to his fluency with his mother's native language... By the time he was 14, he tells me, "I think it's fair to say I had the capabilities to hack anyone."
The article describes him as "attending high school by day and extracting the source code of major corporations by night.... He was 14 years old and taken with the thrill of possessing a hidden superpower, of spending his nights secretly tapping into an underground world where he was esteemed and even feared. And then, in the morning, being called downstairs to breakfast." He wrote a Python script to comb through social media networks and seek out any mentions of working for a [cellphone] carrier. Then he'd reach out with an offer of compensation for helping him with a task. Every fifth or sixth person — underpaid and often working a short-term contract — would say they were game, as Pinsky tells it. For a couple hundred dollars' worth of bitcoin, they'd be willing to do a SIM swap, no questions asked. Eventually, Pinsky says, he had employees at every major carrier also working for him. Then the stakes got even higher. It was only a matter of time before OG hackers, known to each other as "the Community," realized that if they could use the SIM-swapping method to steal usernames, they could just as easily use it to steal cryptocurrency...
In one massive heist Pinsky stole 10% of all the Trigger altcoins on the market from crypto impresario Michael Terpin. ("As Pinsky's money launderers were converting it, the market was crashing in real time.") Pinsky recruited a crew to launder the money — at least one of whom simply kept it — but even with all the conversion fees, he still made off with millions. And then... For a while, he half-expected the FBI to knock on his door at any moment, just like in the movies; but as time passed, he grew less anxious.... He says he moved on to learning different types of programming. He ran a sneaker business that used bots and scripts to snap up limited pairs then flip them... He went to soccer practice. He and his friends had started hanging out with girls on the weekend, driving down to the docks where you could see the glowing lights from the Tappan Zee Bridge.
Until Terpin figured out it was Pinsky who'd robbed him: Pinsky and his legal team preempted his arrest by contacting the U.S. attorney directly and offering his cooperation. In February 2020, he voluntarily returned every last thing he says he got from the Terpin heist: 562 bitcoins, the Patek watch, and the cash he'd stored in the safe under his bed.... When I ask if he has also worked with the FBI to help bring down other hackers, he blinks quickly and then changes the subject.
Pinsky has not been criminally charged — partly because he was a minor, but also because of his cooperation with law enforcement. But Terpin has filed a civil suit seeking triple the amount stolen, arguing that the teenager who robbed him was running an organized crime racket and should be heavily punished to set an example.

Rolling Stone's article raises the question: what should happen next?
Programming

Vim 9.0 Released (vim.org) 81

After many years of gradual improvement, Vim now takes a big step with a major release. Besides many small additions, the spotlight is on a new incarnation of the Vim script language: Vim9 script. Why Vim9 script: A new script language, what is that needed for? Vim script has been growing over time, while preserving backwards compatibility. That means bad choices from the past often can't be changed and compatibility with Vi restricts possible solutions. Execution is quite slow; each line is parsed every time it is executed.

The main goal of Vim9 script is to drastically improve performance. This is accomplished by compiling commands into instructions that can be efficiently executed. An increase in execution speed of 10 to 100 times can be expected. A secondary goal is to avoid Vim-specific constructs and get closer to commonly used programming languages, such as JavaScript, TypeScript and Java.

The performance improvements can only be achieved by not being 100% backwards compatible. For example, making function arguments available by creating an "a:" dictionary involves quite a lot of overhead. In a Vim9 function this dictionary is not available. Other differences are more subtle, such as how errors are handled. For those with a large collection of legacy scripts: Not to worry! They will keep working as before. There are no plans to drop support for legacy script. No drama like with the deprecation of Python 2.

Programming

The Really Important Job Interview Questions Engineers Should Ask (But Don't) (posthog.com) 185

James Hawkins: Since we started PostHog, our team has interviewed 725 people. What's one thing I've taken from this? It's normal for candidates not to ask harder questions about our company, so they usually miss out on a chance to (i) de-risk our company's performance and (ii) increase the chances they'll like working here.

Does the company have product-market fit? This is the single most important thing a company can do to survive and grow.
"Do you ever question if you have product-market fit?"
"When did you reach product-market fit? How did you know?"
"What do you need to do to get to product-market fit?"
"What's your revenue? What was it a year ago?"
"How many daily active users do you have?"

It's ok if these answers show you the founder doesn't have product-market fit. In this case, figure out if they will get to a yes. Unless you want to join a sinking ship, of course! Early stage founders are (or should be) super-mega-extra-desperately keen to have product-market fit -- it's all that really matters. The ones that will succeed are those that are honest about this (or those that have it already) and are prioritizing it. Many will think or say (intentionally or through self-delusion) that they have it when they don't. Low user or revenue numbers and vague answers to the example questions above are a sign that it isn't there. Product-market fit is very obvious.

Google

Google Launches Advanced API Security To Protect APIs From Growing Threats (techcrunch.com) 6

Google today announced a preview of Advanced API Security, a new product headed to Google Cloud that's designed to detect security threats as they relate to APIs. TechCrunch reports: The product is built on Apigee, Google's platform for API management, and the company says that customers can request access starting today. Short for "application programming interface," APIs are documented connections between computers or between computer programs. API usage is on the rise, with one survey finding that more than 61.6% of developers relied on APIs more in 2021 than in 2020. But they're also increasingly becoming the target of attacks. According to a 2018 report commissioned by cybersecurity vendor Imperva, two-thirds of organizations are exposing unsecured APIs to the public and partners.

Advanced API Security specializes in two tasks: identifying API misconfigurations and detecting bots. The service regularly assesses managed APIs and provides recommended actions when it detects configuration issues, and it uses preconfigured rules to provide a way to identify malicious bots within API traffic. Each rule represents a different type of unusual traffic from a single IP address; if an API traffic pattern meets any of the rules, Advanced API Security reports it as a bot. [...] With the launch of Advanced API Security, Google is evidently seeking to bolster its security offerings under Apigee, which it acquired in 2016 for over half a billion dollars. But the company is also responding to increased competition in the API security segment.
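Google hasn't published the rules themselves, but the mechanism described above (per-IP traffic patterns checked against a set of preconfigured rules, with any hit flagging the traffic as bot-like) can be pictured with a short sketch. The rule names and thresholds below are invented for illustration only.

```python
# Illustration only: rule-based bot flagging over per-IP API traffic stats.
# The rule names and thresholds here are made up, not Google's.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class IpStats:
    requests_per_min: int = 0
    auth_failures: int = 0
    distinct_paths: set = field(default_factory=set)

RULES = {
    "request_flood": lambda s: s.requests_per_min > 1_000,
    "credential_stuffing": lambda s: s.auth_failures > 50,
    "endpoint_scraping": lambda s: len(s.distinct_paths) > 200,
}

def flag_bots(traffic: dict[str, IpStats]) -> dict[str, list[str]]:
    """Return, for each IP, the names of any rules its traffic pattern meets."""
    flagged = defaultdict(list)
    for ip, stats in traffic.items():
        for name, rule in RULES.items():
            if rule(stats):
                flagged[ip].append(name)
    return dict(flagged)

if __name__ == "__main__":
    sample = {
        "203.0.113.7": IpStats(requests_per_min=4_000, auth_failures=3),
        "198.51.100.2": IpStats(requests_per_min=20),
    }
    print(flag_bots(sample))  # {'203.0.113.7': ['request_flood']}
```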
"Misconfigured APIs are one of the leading reasons for API security incidents. While identifying and resolving API misconfigurations is a top priority for many organizations, the configuration management process is time consuming and requires considerable resources," Vikas Ananda, head of product at Google Cloud, said in a blog post shared with TechCrunch ahead of the announcement. "Advanced API Security makes it easier for API teams to identify API proxies that do not conform to security standards... Additionally, Advanced API Security speeds up the process of identifying data breaches by identifying bots that successfully resulted in the HTTP 200 OK success status response code."
Businesses

FBI Says People Are Using Deepfakes To Apply To Remote Jobs (gizmodo.com) 47

An anonymous reader quotes a report from Gizmodo: The FBI wrote to its Internet Crime Complaint Center Tuesday that it has received multiple complaints of people using stolen information and deepfaked video and voice to apply to remote tech jobs. According to the FBI's announcement, more companies have been reporting people applying to jobs using video, images, or recordings that are manipulated to look and sound like somebody else. These fakers are also using personally identifiable information from other people -- stolen identities -- to apply to jobs at IT, programming, database, and software firms. The report noted that many of these open positions had access to sensitive customer or employee data, as well as financial and proprietary company info, implying the imposters could have a desire to steal sensitive information as well as a bent to cash a fraudulent paycheck.

What isn't clear is how many of these fake attempts at getting a job were successful versus how many were caught and reported. Or, in a more nefarious hypothetical, whether someone secured an offer, took a paycheck, and then got caught. These applicants were apparently using voice-spoofing techniques during online interviews, where lip movement did not match what was being said on video calls, according to the announcement. Apparently, the jig was up in some of these cases when the interviewee coughed or sneezed, which wasn't picked up by the video spoofing software.
Companies who suspect a fake applicant can report it to the complaint center site.
Open Source

Linus Torvalds Is Cautiously Optimistic About Bringing Rust Into Linux Kernel's Next Release (zdnet.com) 123

slack_justyb shares a report from ZDNet: For over three decades, Linux has been written in the C programming language. Indeed, Linux is C's most outstanding accomplishment. But the last few years have seen a growing momentum to make the Rust programming language Linux's second language. At the recent Open Source Summit in Austin, Texas, Linux creator Linus Torvalds said he could see Rust making it into the Linux kernel as soon as the next major release. "I'd like to see the Rust infrastructure merging to be started in the next release, but we'll see," Torvalds said after the summit. "I won't force it, and it's not like it's going to be doing anything really meaningful at that point -- it would basically be the starting point. So, no promises."

Now, you may ask: "Why are they adding Rust at all?" Rust lends itself more easily to writing secure software. Samartha Chandrashekar, an AWS product manager, said it "helps ensure thread safety and prevent memory-related errors, such as buffer overflows that can lead to security vulnerabilities." Many other developers agree with Chandrashekar. Torvalds also agrees and likes that Rust is more memory-safe. "There are real technical reasons like memory safety and why Rust is good to get in the kernel." Mind you, no one is going to be rewriting the entire 30 or so million lines of the Linux kernel into Rust. As Linux developer Nelson Elhage said in his summary of the 2020 Linux Plumber's meeting on Rust in Linux: "They're not proposing a rewrite of the Linux kernel into Rust; they are focused only on moving toward a world where new code may be written in Rust." The three areas of potential concern for Rust support are making use of the existing APIs in the kernel, architecture support, and dealing with application binary interface (ABI) compatibility between Rust and C.

Programming

Svelte Origins: a JavaScript Documentary (youtube.com) 48

Svelte Origins: The Documentary tells the story of how Svelte came to be, what makes Svelte different, and how it changes the game as a JavaScript framework. From the description of the documentary, which was recommended by several Slashdot readers: Filmed in locations throughout Europe and the US, it features Svelte's creator Rich Harris and members from the core community who contributed to making Svelte what it is today. Svelte Origins was filmed in late 2021, produced by OfferZen and directed by Dewald Brand, with shoots in the USA, the UK, Ireland, Sweden and Germany.
Programming

Are Today's Programmers Leaving Too Much Code Bloat? (positech.co.uk) 296

Long-time Slashdot reader Artem S. Tashkinov shares a blog post from an indie game programmer who complains "The special upload tool I had to use today was a total of 230MB of client files, and involved 2,700 different files to manage this process." Oh and BTW it gives error messages and right now, it doesn't work. sigh.

I've seen coders do this. I know how this happens. It happens because not only are the coders not doing low-level, efficient code to achieve their goal, they have never even SEEN low level, efficient, well written code. How can we expect them to do anything better when they do not even understand that it is possible...? It's what they learned. They have no idea what high performance or constraint-based development is....

Computers are so fast these days that you should be able to consider them absolute magic. Everything that you could possibly imagine should happen between the 60ths of a second of the refresh rate. And yet, when I click the volume icon on my microsoft surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, and eventually works out what icons to draw and has them pop-in and they go live. It takes ACTUAL TIME. I suspect a half second, which in CPU time, is like a billion fucking years....

All I'm doing is typing this blog post. Windows has 102 background processes running. My nvidia graphics card currently has 6 of them, and some of those have sub tasks. To do what? I'm not running a game right now, I'm using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required. Microsoft edge web view has 6 processes too, as does Microsoft edge too. I don't even use Microsoft edge. I think I opened an SVG file in it yesterday, and here we are, another 12 useless pieces of code wasting memory, and probably polling the cpu as well.

This is utter, utter madness. Its why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, that also must be running code this bad. I honestly think its only going to get worse, because the big dumb, useless tech companies like facebook, twitter, reddit, etc are the worst possible examples of this trend....

There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. Its just sad.

Long-time Slashdot reader Z00L00K left a comment arguing that "All this is because everyone today programs on huge frameworks that have everything including two full size kitchen sinks, one for right handed people and one for left handed." But in another comment Slashdot reader youn blames code generators, cut-and-paste programming, and the need to support multiple platforms.

But youn adds that even with that said, "In the old days, there was a lot more blue screens of death... Sure it still happens but how often do you restart your computer these days." And they also submitted this list arguing "There's a lot more functionality than before."
  • Some software has been around a long time. Even though the /. crowd likes to bash Windows, you got to admit backward compatibility is outstanding
  • A lot of things like security were not taken in consideration
  • It's a different computing environment.... multi tasking, internet, GPUs
  • In the old days, there was one task running all the time. Today, a lot of error handling, soft failures if the app is put to sleep
  • A lot of code is due to to software interacting one with another, compatibility with standards
  • Shiny technology like microservices allow scaling, heterogenous integration

So who's right and who's wrong? Leave your own best answers in the comments.

And are today's programmers leaving too much code bloat?


Programming

Stack Overflow Survey Finds Developers Like Rust, Python, JavaScript and Remote Work (infoworld.com) 97

For Stack Overflow's annual survey, "Over 73,000 developers from 180 countries each spent roughly 15 minutes answering our questions," a blog post announces: The top five languages for professional developers haven't changed: JavaScript is still the most used, and Rust is the most loved for a seventh year. The big surprise came in the most loved web framework category. Showing how fast web technologies change, newcomer Phoenix took the most loved spot from Svelte, itself a new entry last year.... Check out the full results from this year's Developer Survey here.
In fact, 87% of Rust developers said that they want to continue using Rust, notes SD Times' summary of the results: Rust also tied with Python as the most wanted technology in this year's report, with TypeScript and Go following closely behind. The distinction between most loved and most wanted is that most wanted includes only developers who are not currently developing with the language, but have an interest in developing with it.
Slashdot reader logankilpatrick writes, "It should come as no surprise to those following the growth and expansion of the Julia Programming Language ecosystem that in this year's Stack Overflow developer survey, Julia ranked in the top 5 for the most loved languages (above Python — 6th, MatLab — Last, and R — 33rd)."

And the Register shares more highlights: Also notable in the 71,547 responses regarding programming languages was a switch again between Python and SQL. In 2021, Python pushed out SQL to be the third most commonly used language. This year SQL regained third place, just behind second-placed HTML/CSS.

And the most hated...

Unsurprisingly, developers still dread that tap on the shoulder from the finance department for a tweak to that bit of code upon which the entire company depends. Visual Basic for Applications and COBOL still lurk within the top three most dreaded technologies.

The operating system rankings were little changed: Windows won out for personal and professional use, although for professional use Linux passed macOS to take second place with 40 percent of responses compared to Apple's 33 percent. Most notable was the growth of Windows Subsystem for Linux, which now accounts for 14 percent of personal use compared with a barely registering 3 percent in 2021.

But SD Times noted what may be the most interesting statistic: Only 15% of developers work on-site full time. Forty-three percent are fully remote and 42% are hybrid. Smaller organizations with 2-19 employees are more likely to be in-person, while large organizations with over 10k employees are more likely to be hybrid, according to the survey.
InfoWorld delves into what this means: "The world has made the decision to go hybrid and remote, I have a lot of confidence given the data I have seen that that is a one-way train that has left the station," Prashanth Chandrasekar, CEO of Stack Overflow told InfoWorld.

Chandrasekar says that flexibility and the tech stack developers get to work with are the most important contributors to overall happiness at work. "Many developers drop out of the hiring process because of the tech stack they will be working with," he said... Organizational culture is also shifting, and cloud-native techniques have taken hold among Stack Overflow survey respondents. Most professional developers (70%) now use some form of CI/CD and 60% have a dedicated devops function....

Lastly, Web3 still has software developers torn, with 32% of respondents favorable, 31% unfavorable, and 26% indifferent. Web3 refers to the emerging idea of a decentralized web where data and content are registered on blockchains, tokenized, or managed and accessed on peer-to-peer distributed networks.

IT

Are 'Google Programmers' the New 'Next-Next-Finish Programmers'? (pvs-studio.com) 203

Long-time Slashdot reader theodp writes: Back in 1998, Ellen Ullman wrote in Salon about The dumbing-down of programming: "My programming tools were full of wizards. Little dialog boxes waiting for me to click "Next" and "Next" and "Finish." Click and drag and shazzam! — thousands of lines of working code. No need to get into the "hassle" of remembering the language. No need to even learn it. It is a powerful siren-song lure: You can make your program do all these wonderful and complicated things, and you don't really need to understand."

Twenty-four years later, PVS-Studio has published a translation of Ivan Belokamentsev's cautionary tale of how modernizing his interviewing process from coding on paper to a computer led him to inadvertently hire 'Google Programmers', who dazzled him in interviews and initially on the job, but soon reached a plateau in productivity that puzzled him until he had a gobsmacking realization.

From their article: It was like somebody hit me on the head with a sack of flour. It took me about two days to process it. How is it really possible? The beautiful, well-optimized code they showed me at the first interview was from the Internet. The explosive growth of productivity in the first months was due to the solutions that they found on the Internet. Those answers to user questions after the magic "We'll call you back" from these guys — were found on the Internet. They were coding without understanding the basic constructs. No, they didn't write code — they downloaded it. No, that's not it, either. To download the code is like running "npm i", it's ok. They copy-pasted the code. Without knowing how to write it.

That's what angered me — what the...? Well, I understand when you surf the net to figure out how a new technology works. Or when you need to use some exotic feature and not to bloat your head with unnecessary information. But basic things! How can you copy-paste basic things from the Internet?!

The article meditates on the mindset of "Google" programmers. Rather than learning about basic objects, types, and the constructs of a programming language, "Any information is available to them, always and everywhere. They've learned how to find this information quickly — whether it's the address of a store with cookies, pants on sale or generating a query."

But long-time Slashdot reader AmiMoJo now pushes back: This is dumb. Not everyone has a great memory, and these days there are so many different tools and frameworks that nobody can remember them all anyway. Back in the day when it was all C, you could reasonably write useful code on paper. These days most of that code will probably be interacting with libraries that you have not committed to memory.

If your developers are not progressing, help them. Give them training or mentoring. Challenge them.

And there's also this advice from Slashdot reader Iamthecheese: "Stop selecting for low ethics in your hiring process." There is a stupid, stupid idea out there among the pointy hair types that it's possible to hire top tier candidates for peanuts. This idea has been put into their heads by massively over-promising companies selling HR solutions of all shapes... They're actively selecting people with just enough ability to pass these specific tests and who are unwilling to show their true levels of ability by hashing it out on their own. So you have these untrained people who look for easy ways past problems, but you were expecting "rock stars".
Their suggested solution? "Stop looking for easy, cheap, already trained people and start looking for trainable people." And then, "show them a little loyalty. That way you'll have people to train new hires, who also know what they're doing on the job."
AI

AI-Powered GitHub Copilot Leaves Preview, Now Costs $100 a Year (techcrunch.com) 36

It was June 29th of 2021 that Microsoft-owned GitHub first announced its AI-powered autocompletion tool for programmers — trained on GitHub repositories and other publicly-available source code.

But after a year in "technical preview," GitHub Copilot has reached a new milestone, reports Info-Q: you'll now have to pay to use it after a 60-day trial: The transition to general availability mostly means that Copilot ceases to be available for free. Interested developers will have to pay 10 USD/month or $100 USD/year to use the service, with a 60-day free trial.... According to GitHub, while not frequent, there is definitely a possibility that Copilot outputs code snippets that match those in the training set.
Info-Q also cites GitHub stats showing over 1.2 million developers used Copilot in the last 12 months "with a shocking 40% figure of code written by Copilot in files where it is enabled." That's up from 35% earlier in the year, reports TechCrunch — which has more info on the rollout: It'll be free for students as well as "verified" open source contributors — starting with roughly 60,000 developers selected from the community and students in the GitHub Education program... One new feature coinciding with the general release of Copilot is Copilot Explain, which translates code into natural language descriptions. Described as a research project, the goal is to help novice developers or those working with an unfamiliar codebase.

Ryan J. Salva, VP of product at GitHub, told TechCrunch via email... "As an example of the impact we've observed, it's worth sharing early results from a study we are conducting. In the experiment, we are asking developers to write an HTTP server — half using Copilot and half without. Preliminary data suggests that developers are not only more likely to complete their task when using Copilot, but they also do it in roughly half the time."

Owing to the complicated nature of AI models, Copilot remains an imperfect system. GitHub said that it's implemented filters to block emails when shown in standard formats, and offensive words, and that it's in the process of building a filter to help detect and suppress code that's repeated from public repositories. But the company acknowledges that Copilot can produce insecure coding patterns, bugs and references to outdated APIs, or idioms reflecting the less-than-perfect code in its training data.

The Verge ponders where this is going — and how we got here: "Just like the rise of compilers and open source, we believe AI-assisted coding will fundamentally change the nature of software development, giving developers a new tool to write code easier and faster so they can be happier in their lives," says GitHub CEO Thomas Dohmke.

Microsoft's $1 billion investment into OpenAI, the research firm now led by former Y Combinator president Sam Altman, led to the creation of GitHub Copilot. It's built on OpenAI Codex, a descendant of OpenAI's flagship GPT-3 language-generating algorithm.

GitHub Copilot has been controversial, though. Just days after its preview launch, there were questions over the legality of Copilot being trained on publicly available code posted to GitHub. Copyright issues aside, one study also found that around 40 percent of Copilot's output contained security vulnerabilities.
