Businesses

IBM To Pause Hiring In Plan To Replace 7,800 Jobs With AI

IBM CEO Arvind Krishna told Bloomberg that the company expects to pause hiring for roles as roughly 7,800 jobs could be replaced by AI in the coming years. Reuters reports: Hiring specifically in back-office functions such as human resources will be suspended or slowed, Krishna said, adding that 30% of non-customer-facing roles could be replaced by AI and automation in five years. The reduction could include not replacing roles vacated by attrition, the company told the publication.
Open Source

Red Hat's 30th Anniversary: How a Microsoft Competitor Rose from an Apartment-Based Startup (msn.com)

For Red Hat's 30th anniversary, North Carolina's News & Observer newspaper ran a special four-part series of articles.

In the first article Red Hat co-founder Bob Young remembers Red Hat's first big breakthrough: winning InfoWorld's "OS of the Year" award in 1998 — at a time when Microsoft's Windows controlled 85% of the market. "How is that possible," Young said, "that one of the world's biggest technology companies, on this strategically critical product, loses the product of the year to a company with 50 employees in the tobacco fields of North Carolina?" The answer, he would tell the many reporters who suddenly wanted to learn about his upstart company, strikes at "the beauty" of open-source software.

"Our engineering team is an order of magnitude bigger than Microsoft's engineering team on Windows, and I don't really care how many people they have," Young would say. "Like they may have thousands of the smartest operating system engineers that they could scour the planet for, and we had 10,000 engineers by comparison...."

Young was a 40-year-old Canadian computer equipment salesperson with a software catalog when he noticed what Marc Ewing was doing. [Ewing was a recent college graduate bored with his two-month job at IBM, selling customized Linux as a side hustle.] It's pretty primitive, but it's going in the right direction, Young thought. He began reselling Ewing's Red Hat product. Eventually, he called Ewing, and the two met at a tech conference in New York City. "I needed a product, and Marc needed some marketing help," said Young, who was living in Connecticut at the time. "So we put our two little businesses together."

Red Hat incorporated in March 1993, with the earliest employees operating the nascent business out of Ewing's Durham apartment. Eventually, the landlord discovered what they were doing and kicked them out.

The four articles capture the highlights. ("A visual effects group used its Linux 4.1 to design parts of the 1997 film Titanic.") And the series doesn't leave out Red Hat's skirmishes with Microsoft. ("Microsoft was owned by the richest person in the world. Red Hat engineers were still linking servers together with extension cords.") "We were changing the industry and a lot of companies were mad at us," says Michael Ferris, Red Hat's VP of corporate development/strategy. Soon there were corporate partnerships with Netscape, Intel, Hewlett-Packard, Compaq, Dell, and IBM — and when Red Hat finally went public in 1999, its stock saw the eighth-largest first-day gain in Wall Street history, with the company's value rising within days to over $7 billion and "making overnight millionaires of its earliest employees."

But there are also inspiring details, like the quote painted on the wall of Red Hat's headquarters in Durham: "Every revolution was first a thought in one man's mind; and when the same thought occurs to another man, it is the key to that era..." It's fun to see the story told by a local newspaper, with subheadings like "It started with a student from Finland" and "Red Hat takes on the Microsoft Goliath."

Something I'd never thought of. 2001's 9/11 terrorist attack on the World Trade Center "destroyed the principal data centers of many Wall Street investment banks, which were housed in the twin towers. With their computers wiped out, financial institutions had to choose whether to rebuild with standard proprietary software or the emergent open source. Many picked the latter." And by the mid-2000s, "Red Hat was the world's largest provider of Linux," according to part two of the series. "Soon, Red Hat was servicing more than 90% of Fortune 500 companies." By then, even the most vehement former critics were amenable to Red Hat's kind of software. Microsoft had begun to integrate open source into its core operations. "Microsoft was on the wrong side of history when open source exploded at the beginning of the century, and I can say that about me personally," Microsoft President Brad Smith later said.

In the 2010s, "open source has won" became a popular tagline among programmers. After years of fighting for legitimacy, former Red Hat executives said victory felt good. "There was never gloating," Tiemann said.

"But there was always pride."

In 2017 Red Hat's CEO answered questions from Slashdot's readers.
Red Hat Software

Red Hat Begins Cutting 'Hundreds of Jobs' (phoronix.com)

According to Phoronix, citing multiple local North Carolina news outlets, Red Hat is cutting "hundreds of jobs" in an initial round of layoffs announced today. From the report: According to WRAL, Red Hat CEO Matt Hicks is said to have told employees in an email "we will not reduce roles directly selling to customers or building our products," which is hopefully good news for the many upstream Linux developers the company employs who ultimately build Red Hat Enterprise Linux and associated software products. Red Hat will begin notifying affected employees today in some countries, while the process will continue through the end of the quarter. IBM, which acquired Red Hat in 2019, has already slashed some five thousand positions so far in 2023.
Virtualization

QEMU 8.0 Released with More ARM and RISC-V Emulation (9to5linux.com)

There's a major new update of QEMU, the open-source machine emulator, reports 9to5Linux: Coming a year after QEMU 7.0, the QEMU 8.0 release is here to improve support for ARM and RISC-V architectures.

- For ARM, it adds emulation support for FEAT_EVT, FEAT_FGT, and AArch32 ARMv8-R, CPU emulation for Cortex-A55 and Cortex-R52, support for a new Olimex STM32 H405 machine type, as well as gdbstub support for M-profile system registers.

- For the RISC-V architecture, QEMU 8.0 brings updated machine support for OpenTitan, PolarFire, and OpenSBI, additional ISA and Extension support for smstateen, native debug icount trigger, cache-related PMU events in virtual mode, Zawrs/Svadu/T-Head/Zicond extensions, and ACPI support. Moreover, RISC-V received multiple fixes covering PMP propagation for TLB, mret exceptions, uncompressed instructions, and other emulation/virtualization improvements.

Improvements were also made for the s390x (IBM Z) platform, the HP Precision Architecture (HPPA) platform, and x86.
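For readers who want to try the new ARM CPU model, an invocation would look roughly like the sketch below. This is not taken from the release notes: the "virt" board, memory size, and kernel path are illustrative placeholders, and the exact flags depend on your guest image.

```shell
# Boot the generic "virt" board with the Cortex-A55 model added in QEMU 8.0.
# "Image" is a placeholder for an AArch64 kernel you supply.
qemu-system-aarch64 \
    -machine virt \
    -cpu cortex-a55 \
    -m 1G \
    -kernel Image \
    -append "console=ttyAMA0" \
    -nographic
```

The same pattern applies on RISC-V (`qemu-system-riscv64` with its own `-machine` and `-cpu` choices); run `qemu-system-aarch64 -cpu help` to list the models your build actually supports.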
United States

A Tech Industry Pioneer Sees a Way for the US To Lead in Advanced Chips (nytimes.com)

Ivan Sutherland played a key role in foundational computer technologies. Now he sees a path for America to claim the mantle in "superconducting" chips. From a report: It has been six decades since Ivan Sutherland created Sketchpad, a software system that foretold the future of interactive and graphical computing. In the 1970s, he played a role in rallying the computer industry to build a new type of microchip with hundreds of thousands of circuits that would become the foundation of today's semiconductor industry. Now Dr. Sutherland, who is 84, believes the United States is failing at a crucial time to consider alternative chip-making technologies that would allow the country to reclaim the lead in building the most advanced computers.

By relying on supercooled electronic circuits that switch without electrical resistance and as a consequence generate no excess heat at higher speeds, computer designers will be able to circumvent the greatest technological barrier to faster machines, he claims. "The nation that best seizes the superconducting digital circuit opportunity will enjoy computing superiority for decades to come," he and a colleague recently wrote in an essay that circulated among technologists and government officials. Dr. Sutherland's insights are significant partly because decades ago he was instrumental in helping to create today's dominant approach to making computer chips.

In the 1970s, Dr. Sutherland, who was chairman of the computer science department at the California Institute of Technology, and his brother Bert Sutherland, then a research manager at a division of Xerox called the Palo Alto Research Center, introduced the computer scientist Lynn Conway to the physicist Carver Mead. They pioneered a design based on a type of transistor, known as complementary metal-oxide semiconductor, or CMOS, which was invented in the United States. It made it possible to manufacture the microchips used by personal computers, video games and the vast array of business, consumer and military products. Now Dr. Sutherland is arguing that an alternative technology that predates CMOS, and has had many false starts, should be given another look. Superconducting electronics was pioneered at the Massachusetts Institute of Technology in the 1950s and then pursued by IBM in the 1970s before being largely abandoned. At one point, it even made an odd international detour before returning to the United States.

IBM

GlobalFoundries Sues IBM, Says Trade Secrets Were Unlawfully Given To Japan's Rapidus (reuters.com)

Chip manufacturer GlobalFoundries said today it had filed a lawsuit against IBM, accusing it of unlawfully sharing confidential intellectual property and trade secrets. From a report: New York-based GlobalFoundries said in its complaint that IBM had shared IP and trade secrets with Rapidus, a new state-backed Japanese consortium that IBM is working with to develop and produce cutting-edge two-nanometre chips. It also asserted that IBM had unlawfully disclosed and misused its IP with Intel, noting that IBM had announced in 2021 it would collaborate with Intel on next-generation chip technology. "IBM is unjustly receiving potentially hundreds of millions of dollars in licensing income and other benefits," GlobalFoundries said in a statement. IBM pushed back in an emailed statement to Reuters saying: "GlobalFoundries filed this meritless lawsuit after a court rejected the company's attempt to dismiss IBM's legitimate fraud and breach of contract claims. Their allegations are entirely baseless, and we are confident that the court will agree."
Education

Should Managers Permanently Stop Requiring Degrees for IT Positions? (cio.com)

CIO magazine reports on "a growing number of managers and executives dropping degree requirements from job descriptions." Figures from the 2022 study The Emerging Degree Reset from The Burning Glass Institute quantify the trend, reporting that 46% of middle-skill and 31% of high-skill occupations experienced material degree resets between 2017 and 2019. Moreover, researchers calculated that 63% of those changes appear to be "'structural resets' representing a measured and potentially permanent shift in hiring practices" that could make an additional 1.4 million jobs open to workers without college degrees over the next five years.

Despite such statistics and testimony from Taylor and other IT leaders, the debate around whether a college education is needed in IT isn't settled. Some say there's no need for degrees; others say degrees are still preferred or required.... IBM is among the companies whose leaders have moved away from degree requirements; Big Blue is also one of the earliest, largest, and most prominent proponents of the move, introducing the term "new collar jobs" for the growing number of positions that require specific skills but not a bachelor's degree....

Not all are convinced that dropping degree requirements is the way to go, however. Jane Zhu, CIO and senior vice president at Veritas Technologies, says she sees value in degrees, value that isn't always replicated through other channels. "Though we don't necessarily require degrees for all IT roles here at Veritas, I believe that they do help candidates demonstrate a level of formal education and commitment to the field and provide a foundation in fundamental concepts and theories of IT-related fields that may not be easily gained through self-study or on-the-job training," she says. "Through college education, candidates have usually acquired basic technical knowledge, problem-solving skills, the ability to collaborate with others, and ownership and accountability. They also often gain an understanding of the business and social impacts of their actions."

The article notes an evolving trend of "more openness to skills-based hiring for many technical roles but a desire for a bachelor's degree for certain positions, including leadership." (Kelli Jordan, vice president of IBMer Growth and Development tells CIO that more than half of the job openings posted by IBM no longer require degrees.)

Thanks to Slashdot reader snydeq for sharing the article.
Red Hat Software

Biggest Linux Company of Them All Still Pushing To Become Cloud Power (theregister.com)

An anonymous reader shares a report: For Red Hat, which turned 30 on March 27, it was a cause for celebration. From a business that got started in a co-founder's wife's sewing room, it became the first billion-dollar pure-play open-source company and then the engine driving IBM. It has been a long strange trip. Sure, today, the tech world is dominated by Linux and open source software, but in 1993, Linux was merely an obscure operating system known only to enthusiasts. Red Hat played a significant role in transforming the "just a hobby" operating system into today's major IT powerhouse. Red Hat co-founder Bob Young, who previously ran a rental typewriter business, was one of those who became intrigued by Linux. In 1993, he established ACC Corporation, a catalog company that distributed Slackware Linux CDs and open-source software.

[...] In 2003, Paul Cormier, then Red Hat's vice president of engineering and now the company's chairman, spearheaded the shift from the inexpensive prosumer Red Hat Linux distribution to the full business-oriented Red Hat Enterprise Linux (RHEL). At the time, many Linux users hated the idea. Even inside Red Hat, Cormier said, many engineers were initially opposed to the new business model, and some left the company over it. The change also upset many users who felt Red Hat was abandoning its original customers. However, enterprise clients had a different perspective. Jim Whitehurst, who became Red Hat CEO in 2008, said, "Once RHEL was in the market, we had to fully support it to make it truly consumable for the enterprise." They succeeded, and Red Hat continued to grow. This is the model that turned Red Hat into the first billion-dollar-a-quarter pure open-source company. Impressive for a business built around an operating system once considered suitable only for the "lunatic fringe." Then, in 2018, IBM acquired Red Hat for a cool $34 billion. There was nothing crazy about that move.

[...] Another change that was already present in Red Hat, a shift towards supporting the cloud, has accelerated. Today, while RHEL remains the heart of the business, the Linux-powered cloud has become increasingly important. In particular, Red Hat OpenShift, its Kubernetes-powered hybrid cloud application platform, is more important than ever. Where does Red Hat go from here? When I last talked to Cormier and Red Hat's latest CEO, Matt Hicks, they told me that they'd keep moving forward with the hybrid cloud. After all, as Cormier pointed out, "the cloud wouldn't be here" without Linux and open source. As for Red Hat's relationship with IBM, Cormier said, "The red lines were red, and the blue lines were blue, and that will stay the same."

IBM

IBM Explores Sale of Weather Business (wsj.com)

International Business Machines is exploring a sale of its weather operation, WSJ reported, citing people familiar with the matter, as the technology company seeks to streamline its operations. From a report: An auction of the business is at an early stage, the people said, and there may not be a deal. Should there be one, a private-equity firm is the most likely buyer, in a deal that could be valued at more than $1 billion, the people said. IBM agreed to buy the business in 2015, purchasing The Weather Company's business-to-business, mobile and cloud-based businesses including Weather.com, which provides weather forecasts around the globe. The deal price at the time was pegged at more than $2 billion.

The Weather Channel wasn't part of the deal, but agreed to license weather-forecast data and analytics from IBM. The deal was part of a push by IBM to use its cloud infrastructure to provide accurate weather forecasts and help companies control costs. The business issues more than 25 billion forecasts a day, according to the company's website. A sale of the weather unit would be a part of a broader push by IBM to streamline its operations as the once-dominant company's shares languish near levels they traded at more than 20 years ago.

IBM

New Models of IBM Model F Keyboard Mark II Incoming (theregister.com)

An anonymous reader quotes a report from The Register: What's even harder-core than the IBM Model M? The Model F, the keyboard that launched alongside the IBM PC in 1981. After a 2017 relaunch, new models with the original layout are here. The project, which back in 2017 relaunched a modern keyboard inspired by a compact space-saver version of IBM's classic Model F, is launching its second generation of brand-new premium input devices, and this time, various layouts will be available. [...]

Enter the New Model F Keyboards project. "Ellipse" launched it in 2017 and attracted over $300,000 worth of orders, even at $399 each. Aside from the not-inconsiderable price, what put the author off was the layout. Space-saving and reduced-footprint keyboards are very popular among serious keyboard collectors, and the project chose two space-saver layouts from IBM's 4704 terminal, dubbed the Kishsaver after the collector who described it. The F77 layout has a numeric keypad, but no function keys; the even smaller F62 layout omits the keypad, or as the cool kids call it, it's a TKL layout, which we are informed stands for tenkeyless, presumably because it has 15 fewer keys.

Which is why the FOSS desk's bank account would tremble in fear if it were not an inanimate table in a database somewhere, because the Model F project has announced a new range, including full-size and compact 104-key layouts and, most appealing to this large and heavy-handed vulture, a replica of the 122-key IBM Battleship, one of which we've been hunting for over a decade. The project occasionally has refurbished original IBM units. Now, though, a brand-new one is a $420 option. If that isn't exclusive enough, the project is also working on a model with beam springs, the mechanism from 1970s IBM business products. The first model of the brand-new beam spring units is a mere $579.

Security

Ransomware Crooks Are Exploiting IBM File-Exchange Bug With a 9.8 Severity (arstechnica.com)

Threat actors are exploiting a critical vulnerability in an IBM file-exchange application in hacks that install ransomware on servers, security researchers have warned. From a report: The IBM Aspera Faspex is a centralized file-exchange application that large organizations use to transfer large files or large volumes of files at very high speeds. Rather than relying on TCP-based technologies such as FTP to move files, Aspera uses IBM's proprietary FASP -- short for Fast, Adaptive, and Secure Protocol -- to better utilize available network bandwidth. The product also provides fine-grained management that makes it easy for users to send files to a list of recipients in distribution lists or shared inboxes or workgroups, giving transfers a workflow that's similar to email.

In late January, IBM warned of a critical vulnerability in Aspera versions 4.4.2 Patch Level 1 and earlier and urged users to install an update to patch the flaw. Tracked as CVE-2022-47986, the vulnerability makes it possible for unauthenticated threat actors to remotely execute malicious code by sending specially crafted calls to an outdated programming interface. The ease of exploiting the vulnerability and the damage that could result earned CVE-2022-47986 a severity rating of 9.8 out of a possible 10. On Tuesday, researchers from security firm Rapid7 said they recently responded to an incident in which a customer was breached using the vulnerability.

AI

Developer Builds a ChatGPT Client for MS-DOS (yeokhengmeng.com)

"With the recent attention on ChatGPT and OpenAI's release of their APIs, many developers have developed clients for modern platforms to talk to this super smart AI chatbot," writes maker/retro coding enthusiast yeokm1 . "However I'm pretty sure almost nobody has written one for a vintage platform like MS-DOS."

They share a blog post with all the details — including footage of their client ultimately running on a vintage IBM PC from 1984 (with a black and orange monitor and those big, boxy keys). "3.5 years ago, I wrote a Slack client to run on Windows 3.1," the blog post explains. "I thought to try something different this time and develop for an even older platform as a challenge."

One challenge was just finding a networking API for DOS. But everything came together, with the ChatGPT-for-DOS app written using the Visual Studio Code text editor (and tested on a virtual machine running DOS 6.22), parsing the JSON output from OpenAI's Chat Completion API. "And before you ask, I did not use ChatGPT for help to code this app in any way," the blog post concludes. But after the app was working, he used it to ask ChatGPT how one would build such an app — and ChatGPT breezily (and erroneously) suggested that he just try accessing OpenAI's Python API from the DOS command line.
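At its core, any such client has two small jobs: serialize a JSON body for the Chat Completion endpoint and pull the assistant's reply out of the response. The sketch below shows both in Python for readability (the actual DOS client was not written in Python); the endpoint URL and field names follow OpenAI's published API, and it parses a canned response so it runs offline.

```python
import json

API_URL = "https://api.openai.com/v1/chat/completions"  # OpenAI's documented endpoint

def build_request(prompt, model="gpt-3.5-turbo"):
    """Serialize the JSON body the Chat Completion API expects."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

def extract_reply(response_text):
    """Pull the assistant's message text out of a Chat Completion response."""
    data = json.loads(response_text)
    return data["choices"][0]["message"]["content"]

# Canned response in the documented shape, so the parsing runs without a network:
sample = json.dumps({
    "choices": [{"message": {"role": "assistant", "content": "Hello from 1984!"}}]
})
print(extract_reply(sample))  # → Hello from 1984!
```

The hard part on a 1984 PC is, of course, the networking itself (a packet-driver TCP stack and dealing with TLS), which this sketch deliberately leaves out.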

"What is the AI smoking...?"
IBM

IBM Installs World's First Quantum Computer for Accelerating Healthcare Research (insidehpc.com)

It's one of America's best hospitals — a nonprofit "academic medical center" called the Cleveland Clinic. And this week it installed an IBM-managed quantum computer to accelerate healthcare research (according to an announcement from IBM). IBM is calling it "the first quantum computer in the world to be uniquely dedicated to healthcare research."

The clinic's CEO said the technology "holds tremendous promise in revolutionizing healthcare and expediting progress toward new cares, cures and solutions for patients." IBM's CEO added that "By combining the power of quantum computing, artificial intelligence and other next-generation technologies with Cleveland Clinic's world-renowned leadership in healthcare and life sciences, we hope to ignite a new era of accelerated discovery."

Inside HPC points out that "IBM Quantum System One" is part of a larger biomedical research program applying high-performance computing, AI, and quantum computing, with IBM and the Cleveland Clinic "collaborating closely on a robust portfolio of projects with these advanced technologies to generate and analyze massive amounts of data to enhance research." The Cleveland Clinic-IBM Discovery Accelerator has generated multiple projects that leverage the latest in quantum computing, AI and hybrid cloud to help expedite discoveries in biomedical research. These include:

- Development of quantum computing pipelines to screen and optimize drugs targeted to specific proteins;

- Improvement of a quantum-enhanced prediction model for cardiovascular risk following non-cardiac surgery;

- Application of artificial intelligence to search genome sequencing findings and large drug-target databases to find effective, existing drugs that could help patients with Alzheimer's and other diseases.


The Discovery Accelerator also serves as the technology foundation for Cleveland Clinic's Global Center for Pathogen & Human Health Research, part of the Cleveland Innovation District. The center, supported by a $500 million investment from the State of Ohio, Jobs Ohio and Cleveland Clinic, brings together a team focused on studying, preparing and protecting against emerging pathogens and virus-related diseases. Through the Discovery Accelerator, researchers are leveraging advanced computational technology to expedite critical research into treatments and vaccines.

Earth

Chipmakers Fight Spread of US Crackdowns on 'Forever Chemicals'

Intel and other semiconductor companies have joined together with industrial materials businesses to fight US clampdowns on "forever chemicals," substances used in myriad products that are slow to break down in the environment. From a report: The lobbying push from chipmakers broadens the opposition to new rules and bans for the chemicals known as PFAS. The substances have been found in the blood of 97 per cent of Americans, according to the US government. More than 30 US states this year are considering legislation to address PFAS, according to Safer States, an environmental advocacy group. Bills in California and Maine passed in 2022 and 2021, respectively.

"I think clean drinking water and for farmers to be able to irrigate their fields is far more important than a microchip," said Stacy Brenner, a Maine state senator who backed the state's bipartisan legislation. In Minnesota, bills would ban by 2025 certain products that contain added PFAS -- which is short for perfluoroalkyl and polyfluoroalkyl substances -- in legislation considered to be some of the toughest in the country. The Semiconductor Industry Association -- whose members include Intel, IBM and Nvidia -- has cosigned letters opposing the Minnesota legislation, arguing its measures are overly broad and could prohibit thousands of products, including electronics. Chipmakers also opposed the California and Maine laws.
Programming

Programming Pioneer Grady Booch on Functional Programming, Web3, and Conscious Machines (infoworld.com)

InfoWorld interviews Grady Booch, chief scientist for software engineering at IBM Research (also a pioneer of design patterns and agile methods, and one of the creators of UML).

Here are some of the highlights: Q: Let me begin by asking something "of the moment." There has been an almost cultural war between object-oriented programming and functional programming. What is your take on this?

Booch: I had the opportunity to conduct an oral history with John Backus — one of the pioneers of functional programming — in 2006 on behalf of the Computer History Museum. I asked John why functional programming didn't enter the mainstream, and his answer was perfect: "Functional programming makes it easy to do hard things" he said, "but functional programming makes it very difficult to do easy things...."


Q: Would you talk a bit about cryptography and Web3?

Booch: Web3 is a flaming pile of feces orbiting a giant dripping hairball. Cryptocurrencies — ones not backed by the full faith and credit of stable nation states — have only a few meaningful use cases, particularly if you are a corrupt dictator of a nation with a broken economic system, or a fraud and scammer who wants to grow their wealth at the expense of greater fools. I was one of the original signatories of a letter to Congress in 2022 for a very good reason: these technologies are inherently dangerous, they are architecturally flawed, and they introduce an attack surface that threatens economies....


Q: What do you make of transhumanism?

Booch: It's a nice word that has little utility for me other than as something people use to sell books and to write clickbait articles....


Q: Do you think we'll ever see conscious machines? Or, perhaps, something that compels us to accept them as such?

Booch: My experience tells me that the mind is computable. Hence, yes, I have reason to believe that we will see synthetic minds. But not in my lifetime; or yours; or your children; or your children's children. Remember, also, that this will likely happen incrementally, not with a bang, and as such, we will co-evolve with these new species.

Software

Ask Slashdot: What Exactly Are 'Microservices'?

After debating the term in a recent Slashdot subthread, longtime reader Tablizer wants to pose the question to a larger audience: what exactly are 'microservices'? Over the past few years I've asked many colleagues what "microservices" are, and gotten a gazillion different answers. "Independent deploy-ability" has been an issue as old as the IBM hills. Don't make anything "too big" nor "too small"; be it functions, files, apps, name-spaces, tables, databases, etc.

Overly large X's didn't need special terms, such as "monofunction". We'd just call it "poorly partitioned/sized/factored". (Picking the right size requires skill and experience, both in technology and the domain.) Dynamic languages are usually "independently deployable" at the file level, so what is a PHP "monolith", for example?

Puzzles like this abound when trying to use the Socratic method to tease out specific-ness. Socrates would quit and become a goat herder, as such discussions often turn sour and personal. Here's a recent Slashdot subthread debating the term.
AI

AI's Victories In Go Inspire Better Human Game Playing (scientificamerican.com)

Emily Willingham writes via Scientific American: In 2016 a computer named AlphaGo made headlines for defeating then world champion Lee Sedol at the ancient, popular strategy game Go. The "superhuman" artificial intelligence, developed by Google DeepMind, lost only one of the five rounds to Sedol, generating comparisons to Garry Kasparov's 1997 chess loss to IBM's Deep Blue. Go, which involves players facing off by moving black and white pieces called stones with the goal of occupying territory on the game board, had been viewed as a more intractable challenge to a machine opponent than chess. Much agonizing about the threat of AI to human ingenuity and livelihood followed AlphaGo's victory, not unlike what's happening right now with ChatGPT and its kin. In a 2016 news conference after the loss, though, a subdued Sedol offered a comment with a kernel of positivity. "Its style was different, and it was such an unusual experience that it took time for me to adjust," he said. "AlphaGo made me realize that I must study Go more."

At the time European Go champion Fan Hui, who'd also lost a private round of five games to AlphaGo months earlier, told Wired that the matches made him see the game "completely differently." This improved his play so much that his world ranking "skyrocketed," according to Wired. Formally tracking the messy process of human decision-making can be tough. But a decades-long record of professional Go player moves gave researchers a way to assess the human strategic response to an AI provocation. A new study now confirms that Fan Hui's improvements after facing the AlphaGo challenge weren't just a singular fluke. In 2017, after that humbling AI win in 2016, human Go players gained access to data detailing the moves made by the AI system and, in a very humanlike way, developed new strategies that led to better-quality decisions in their game play. A confirmation of the changes in human game play appear in findings published on March 13 in the Proceedings of the National Academy of Sciences USA.

The team found that before AI beat human Go champions, the level of human decision quality stayed pretty uniform for 66 years. After that fateful 2016-2017 period, decision quality scores began to climb. Humans were making better game play choices -- maybe not enough to consistently beat superhuman AIs but still better. Novelty scores also shot up after 2016-2017 from humans introducing new moves into games earlier during the game play sequence. And in their assessment of the link between novel moves and better-quality decisions, [the researchers] found that before AlphaGo succeeded against human players, humans' novel moves contributed less to good-quality decisions, on average, than nonnovel moves. After these landmark AI wins, the novel moves humans introduced into games contributed more on average than already known moves to better decision quality scores.

Crime

Does IceFire Ransomware Portend a Broader Shift From Windows to Linux? (darkreading.com) 28

An anonymous reader shares this report from Dark Reading: In recent weeks, hackers have been deploying the "IceFire" ransomware against Linux enterprise networks, a noted shift for what was once a Windows-only malware.

A report from SentinelOne suggests that this may represent a budding trend. Ransomware actors have been targeting Linux systems more than ever in cyberattacks in recent weeks and months, notable not least because "in comparison to Windows, Linux is more difficult to deploy ransomware against, particularly at scale," Alex Delamotte, security researcher at SentinelOne, tells Dark Reading....

"[M]any Linux systems are servers," Delamotte points out, "so typical infection vectors like phishing or drive-by download are less effective." So instead, recent IceFire attacks have exploited CVE-2022-47986 — a critical remote code execution (RCE) vulnerability in the IBM Aspera data transfer service, with a CVSS rating of 9.8.

Delamotte posits a few reasons for why more ransomware actors are choosing Linux as of late. For one thing, she says, "Linux-based systems are frequently utilized in enterprise settings to perform crucial tasks such as hosting databases, Web servers, and other mission-critical applications. Consequently, these systems are often more valuable targets for ransomware actors due to the possibility of a larger payout resulting from a successful attack, compared to a typical Windows user."

A second factor, she guesses, "is that some ransomware actors may perceive Linux as an unexploited market that could yield a higher return on investment."

While previous reports had IceFire targeting tech companies, SentinelLabs says they've seen recent attacks against organizations "in the media and entertainment sector," impacting victims "in Turkey, Iran, Pakistan, and the United Arab Emirates, which are typically not a focus for organized ransomware actors."
IBM

The SCO Lawsuit: Looking Back 20 Years Later (lwn.net) 105

"On March 7, 2003, a struggling company called The SCO Group filed a lawsuit against IBM," writes LWN.net, "claiming that the success of Linux was the result of a theft of SCO's technology..."

Two decades later, "It is hard to overestimate how much the community we find ourselves in now was shaped by a ridiculous lawsuit 20 years ago...." It was the claim of access to Unix code that was the most threatening allegation for the Linux community. SCO made it clear that, in its opinion, Linux was stolen property: "It is not possible for Linux to rapidly reach UNIX performance standards for complete enterprise functionality without the misappropriation of UNIX code, methods or concepts". To rectify this "misappropriation", SCO was asking for a judgment of at least $1 billion, later increased to $5 billion. As the suit dragged on, SCO also started suing Linux users as it tried to collect a tax for use of the system.

Though this has never been proven, it was widely assumed at the time that SCO's real objective was to prod IBM into acquiring the company. That would have solved SCO's ongoing business problems and IBM, for rather less than the amount demanded in court, could have made an annoying problem go away and also laid claim to the ownership of Unix — and, thus, Linux. To SCO's management, it may well have seemed like a good idea at the time. IBM, though, refused to play that game; the company had invested heavily into Linux in its early days and was uninterested in allowing any sort of intellectual-property taint to attach to that effort. So the company, instead, directed its not inconsiderable legal resources to squashing this attack. But notably, so did the development community as a whole, as did much of the rest of the technology industry.

Over the course of the following years — far too many years — SCO's case fell to pieces. The "misappropriated" technology wasn't there. Due to what must be one of the worst-written contracts in technology-industry history, it turned out that SCO didn't even own the Unix copyrights it was suing over. The level of buffoonery was high from the beginning and got worse; the company lost at every turn and eventually collapsed into bankruptcy.... Microsoft, which had not yet learned to love Linux, funded SCO and loudly bought licenses from the company. Magazines like Forbes were warning the "Linux-loving crunchies in the open-source movement" that they "should wake up". SCO was suggesting a license fee of $1,399 — per-CPU — to run Linux.... Such an effort, in less incompetent hands, could easily have damaged Linux badly.

As it went, SCO, despite its best efforts, instead succeeded in improving the position of Linux — in development, legal, and economic terms — considerably.

The article argues SCO's lawsuit ultimately proved that Linux didn't contain copyrighted code "in a far more convincing way than anybody else could have." (And the provenance of all Linux code contributions is now carefully documented.) The case also proved the need for lawyers to vigorously defend the rights of open source programmers. And most of all, it revealed the Linux community was widespread and committed.

And "Twenty years later, it is fair to say that Linux is doing a little better than The SCO Group. Its swaggering leader, who thought to make his fortune by taxing Linux, filed for personal bankruptcy in 2020."
Open Source

Who Writes Linux and Open Source Software? (theregister.com) 60

From an opinion piece in the Register: Aiven, an open source cloud data platform company, recently analyzed who's doing what with GitHub open source code projects. They found that the top open source contributors were all companies — Amazon Web Services, Intel, Red Hat, Google, and Microsoft....

Aiven looked at three metrics within the GitHub archives. These were the number of contributors, repositories (projects) contributed to, and the number of commits made by the contributors. These were calculated using Google Big Query analysis of PushEvents on public GitHub data. The company found that Microsoft and Google were neck-and-neck for the top spot. Red Hat is in third place, followed by Intel, then AWS, just ahead of IBM.... Red Hat is following closely behind and is currently contributing more commits than Google, with 125,012 in Q4 2022 compared to Google's 94,961. Microsoft is ahead of both, with 128,247 commits. However, regarding contributed staff working on projects, Google is leading the way with 5,757 compared to Microsoft's 5,513 and Red Hat's 3,656....
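The aggregation behind those numbers can be sketched roughly. Aiven's actual analysis ran as Google BigQuery queries over the public GitHub archive's PushEvent records; the records and employer labels below are invented stand-ins, but the counting logic (commits and distinct contributors per company) is the same shape.

```python
from collections import defaultdict

# Hypothetical PushEvent records, simplified from the fields the GitHub
# archive exposes: each push names an actor and carries some commits.
events = [
    {"actor": "alice", "company": "Microsoft", "commits": 3},
    {"actor": "bob",   "company": "Google",    "commits": 5},
    {"actor": "alice", "company": "Microsoft", "commits": 2},
    {"actor": "carol", "company": "Red Hat",   "commits": 4},
]

commits = defaultdict(int)        # total commits per company
contributors = defaultdict(set)   # distinct contributors per company

for e in events:
    commits[e["company"]] += e["commits"]
    contributors[e["company"]].add(e["actor"])

for company in sorted(commits, key=commits.get, reverse=True):
    print(company, commits[company], len(contributors[company]))
```

The interesting design point, visible even in this toy version, is that the two leaderboards can disagree: a company can rank higher on commits (as Red Hat does versus Google in Aiven's Q4 2022 data) while ranking lower on contributor headcount.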

Heikki Nousiainen, Aiven CTO and co-founder, commented: "An unexpected result of our research was seeing Amazon overtake IBM to become the fifth biggest contributor." They "came late to the open source party, but they're now doubling down on its open source commitments and realizing the benefits that come with contributing to the open source projects its customers use." So, yes, open source certainly started with individual contributors, but today, and for many years before, it's company employees that are really making the code....

Aiven is far from the only one to have noticed that companies are now open source's economic engine. Jonathan Corbet, editor-in-chief of Linux Weekly News (LWN), found in his most recent analysis of Long Term Support Linux Kernel releases from 5.16 to 6.1 that a mere 7.5 percent of the kernel development, as measured by lines changed, came from individual developers. No, the real leaders were, in order: AMD; Intel; Google; Linaro, the main Arm Linux development organization; Meta; and Red Hat.
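A lines-changed attribution like Corbet's can be sketched as follows. This is a simplified illustration, not LWN's actual tooling (which uses curated employer mappings): it tallies added plus removed lines from invented `git log --numstat`-style output, mapping author email domains to employers and counting unmapped domains as individual developers.

```python
import re

# Invented sample of `git log --format='A:%ae' --numstat` output.
LOG = """\
A:dev1@amd.com
10\t2\tdrivers/gpu/foo.c
A:dev2@intel.com
3\t1\tarch/x86/bar.c
A:hobbyist@gmail.com
1\t0\tREADME
"""

# Hypothetical domain-to-employer map; anything unmapped is "(individual)".
EMPLOYERS = {"amd.com": "AMD", "intel.com": "Intel"}

totals = {}
current = None
for line in LOG.splitlines():
    if line.startswith("A:"):
        domain = line.split("@", 1)[1]
        current = EMPLOYERS.get(domain, "(individual)")
    else:
        m = re.match(r"(\d+)\t(\d+)\t", line)
        if m and current:
            totals[current] = totals.get(current, 0) + int(m.group(1)) + int(m.group(2))

print(totals)  # lines changed, attributed per employer
```

The 7.5 percent figure for individual developers falls out of exactly this kind of tally: the "(individual)" bucket's share of total lines changed.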

The article also includes this thought-provoking quote from Aiven's CTO: "Innovation is at the heart of the open source community, but without a strong commitment from companies, the whole system will struggle.

"We can see that companies are recognizing their role and supporting all who use open source."

Slashdot Top Deals