IBM

IBM Top Brass Accused Again of Using Mainframes To Prop Up Watson, Cloud Sales (theregister.com) 23

IBM, along with 13 of its current and former executives, has been sued by investors who claim the IT giant used mainframe sales to fraudulently prop up newer, more trendy parts of its business. The Register reports: In effect, IBM deceived the market about its progress in developing Watson, cloud technologies, and other new sources of revenue, by deliberately misclassifying the money it was making from mainframe deals, assigning that money instead to other products, it is alleged. The accusations emerged in a lawsuit [PDF] filed late last week against IBM in New York on behalf of the June E Adams Irrevocable Trust. It alleged Big Blue shifted sales by its "near-monopoly" mainframe business to its newer and less popular cloud, analytics, mobile, social, and security products (CAMSS), which bosses promoted as growth opportunities and designated "Strategic Imperatives."

IBM is said to have created the appearance of demand for these Strategic Imperative products by bundling them into three- to five-year mainframe Enterprise License Agreements (ELA) with large banking, healthcare, and insurance company customers. In other words, it is claimed, mainframe sales agreements had Strategic Imperative products tacked on to help boost the sales performance of those newer offerings and give investors the impression customers were clamoring for those technologies from IBM. "Defendants used steep discounting on the mainframe part of the ELA in return for the customer purchasing catalog software (i.e. Strategic Imperative Revenue), unneeded and unused by the customer," the lawsuit stated.

IBM is also alleged to have shifted revenue from its non-strategic Global Business Services (GBS) segment to Watson, a Strategic Imperative in the CAMSS product set, to convince investors that the company was successfully expanding beyond its legacy business. Last April the plaintiff Trust filed a similar case, which was joined by at least five other law firms representing other IBM shareholders. A month prior, the IBM board had been presented with a demand letter from shareholders to investigate the above allegations. Asked whether any action has been taken as a result of that letter, IBM has yet to respond.

IBM

IBM Shifts Remaining US-Based AIX Dev Jobs To India 77

According to The Register, IBM has shifted the roles of US IBM Systems employees developing AIX over to the Indian office. From the report: Prior to this transition, said to have taken place in the third quarter of 2022, AIX development was split more or less evenly between the US and India, an IBM source told The Register. With the arrival of 2023, the entire group had been moved to India. Roughly 80 US-based AIX developers were affected, our source estimates. We're told they were "redeployed," and given an indeterminate amount of time to find a new position internally, in keeping with practices we reported last week based on claims by other IBM employees.

Evidently, the majority of those redeployed found jobs elsewhere at IBM. A smaller number of staff are stuck in "redeployment limbo," with no IBM job identified and no evident prospects at the company. "It also appears that these people in 'redeployment' limbo within IBM are all older, retirement eligible employees," our source said. "The general sense among my peers is that redeployment is being used to nudge older employees out of the company and to do so in a manner that avoids the type of scrutiny that comes with layoffs."

Layoffs generally come with a severance payment and may have reporting requirements. Redeployments -- directing workers to find another internal position, which may require relocating -- can avoid that cost and bureaucracy. They also have the potential to encourage workers to depart on their own. We're told that IBM does not disclose redeployment numbers to its employees, and does not report how internal jobs were obtained (through internal search or with the assistance of management) or why they were not (employees left in limbo, or choosing to leave rather than wait).

IBM

IBM Staff Grumble Redeployment Orders Are Stealth Layoffs (theregister.com) 55

IBM CEO Arvind Krishna told employees last year that he had no plans for further layoffs. But according to current IBM employees, managers continue to face pressure to reduce headcount and are trying to do so without Resource Actions -- what Big Blue calls formal layoffs. The Register: Instead, they're trying to encourage employees to leave on their own through redeployment, eliminating jobs without formally doing so. An IBM employee who asked not to be identified and has been with the company for more than two decades told The Register that multiple people in part of the Systems group (the individual and four colleagues) had been "redeployed to look for another job within IBM."

These individuals are expected to continue in their jobs for an indeterminate period while using some work time to find and apply for another internal position -- which may or may not be available, or may require relocation. No end date was specified for the job search but our source suggested that affected individuals have until the end of Q1 2023. If a redeployed employee fails to find another internal position, the Redeployment Initiative may become a Resource Action -- a layoff.

Encryption

Chinese Researchers Claim To Find Way To Break Encryption Using Quantum Computers (ft.com) 50

Computer security experts were struggling this week to assess a startling claim by Chinese researchers that they have found a way to break the most common form of online encryption [the link may be paywalled] using the current generation of quantum computers, years before the technology was expected to pose a threat. Financial Times: The method, outlined in a scientific paper [PDF] published in late December, could be used to break the RSA algorithm that underpins most online encryption using a quantum machine with only 372 qubits -- or quantum bits, a basic unit of quantum computing -- according to the claims from 24 researchers from a number of academic bodies and state laboratories. IBM has already said that its 433 qubit Osprey system, the most powerful quantum computer to have been publicly unveiled, will be made available to its customers early this year.

If correct, the research would mark a significant moment in the history of computer security, said Roger Grimes, a computer security expert and author. "It's a huge claim," he said. "It would mean that governments could crack other governments' secrets. If it's true -- a big if -- it would be a secret like out of the movies, and one of the biggest things ever in computer science." Other experts said that while the theory outlined in the research paper appeared sound, trying to apply it in practice could well be beyond the reach of today's quantum technology. "As far as I can tell, the paper isn't wrong," said Peter Shor, the Massachusetts Institute of Technology scientist whose 1994 algorithm proving that a quantum machine could defeat online encryption helped to trigger a research boom in quantum computing. Shor's method requires machines with many hundreds of thousands, or even millions, of qubits, something that many experts believe is a decade or more away.
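
For readers who want the intuition behind Shor's result mentioned above: factoring an RSA modulus N reduces to finding the period r of a^x mod N, and that period-finding step is what a sufficiently large quantum computer could do exponentially faster than any known classical method. Below is a minimal classical sketch in Python of the bookkeeping around that step -- not the Chinese team's method, and with nothing quantum in it, since the period is found by brute force and so only toy-sized numbers are feasible.

    from math import gcd

    def find_period(a, N):
        # Smallest r > 0 with a^r = 1 (mod N). Brute force stands in here
        # for the quantum subroutine, so this only works for tiny N.
        x, r = a % N, 1
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def factor(N, a):
        g = gcd(a, N)
        if g != 1:
            return g, N // g                   # lucky: a already shares a factor
        r = find_period(a, N)
        if r % 2 == 1:
            return None                        # odd period: retry with another a
        y = pow(a, r // 2, N)
        p = gcd(y - 1, N)
        if p in (1, N):
            return None                        # unlucky a: retry with another a
        return p, N // p

    print(factor(3233, 3))   # 3233 = 53 * 61, a toy stand-in for an RSA modulus

A real RSA modulus is 2,048 bits or longer, and running Shor's algorithm at that size is expected to need the hundreds of thousands or millions of qubits mentioned above -- which is why a claim that 372 qubits might suffice by a different route drew both attention and skepticism.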

Microsoft

The Worst-Selling Microsoft Software Product of All Time: OS/2 for the Mach 20 (microsoft.com) 127

Raymond Chen, writing for Microsoft DevBlogs: In the mid-1980's, Microsoft produced an expansion card for the IBM PC and PC XT, known as the Mach 10. In addition to occupying an expansion slot, it also replaced your CPU: You unplugged your old and busted 4.77 MHz 8088 CPU and plugged into the now-empty socket a special adapter that led via a ribbon cable back to the Mach 10 card. On the Mach 10 card was the new hotness: A 9.54 MHz 8086 CPU. This gave you a 2x performance upgrade for a lot less money than an IBM PC AT. The Mach 10 also came with a mouse port, so you could add a mouse without having to burn an additional expansion slot. Sidebar: The product name was stylized as MACH [PDF] in some product literature. The Mach 10 was a flop.

Undaunted, Microsoft partnered with a company called Portable Computer Support Group to produce the Mach 20, released in 1987. You probably remember the Portable Computer Support Group for their disk cache software called Lightning. The Mach 20 took the same basic idea as the Mach 10, but to the next level: As before, you unplugged your old 4.77 MHz 8088 CPU and replaced it with an adapter that led via ribbon cable to the Mach 20 card, which you plugged into an expansion slot. This time, the Mach 20 had an 8 MHz 80286 CPU, so you were really cooking with gas now. And, like the Mach 10, it had a mouse port built in. According to a review in InfoWorld, it retailed for $495. The Mach 20 itself had room for expansion: it had an empty socket for an 80287 floating point coprocessor. One daughterboard was the Mach 20 Memory Plus Expanded Memory Option, which gave you an astonishing 3.5 megabytes of RAM, and it was high-speed RAM since it wasn't bottlenecked by the ISA bus on the main motherboard. The other daughterboard was the Mach 20 Disk Plus, which let you connect 5 1/4-inch or 3 1/2-inch floppy drives.

A key detail is that all these expansions connected directly to the main Mach 20 board, so that they didn't consume a precious expansion slot. The IBM PC came with five expansion slots, and they were in high demand. You needed one for the hard drive controller, one for the floppy drive controller, one for the video card, one for the printer parallel port, one for the mouse. Oh no, you ran out of slots, and you haven't even gotten to installing a network card or expansion RAM yet! You could try to do some consolidation by buying so-called multifunction cards, but still, the expansion card crunch was real. But why go to all this trouble to upgrade your IBM PC to something roughly equivalent to an IBM PC AT? Why not just buy an IBM PC AT in the first place? Who would be interested in this niche upgrade product?

Bug

Patched Windows Bug Was Actually a Dangerous Wormable Code-Execution Vulnerability (arstechnica.com) 20

Ars Technica reports on a dangerously "wormable" Windows vulnerability that allowed attackers to execute malicious code with no authentication required — a vulnerability that was present "in a much broader range of network protocols, giving attackers more flexibility than they had when exploiting the older vulnerability." Microsoft fixed CVE-2022-37958 in September during its monthly Patch Tuesday rollout of security fixes. At the time, however, Microsoft researchers believed the vulnerability allowed only the disclosure of potentially sensitive information. As such, Microsoft gave the vulnerability a designation of "important." In the routine course of analyzing vulnerabilities after they're patched, IBM security researcher Valentina Palmiotti discovered it allowed for remote code execution in much the way EternalBlue did [the flaw used to detonate WannaCry]. Last week, Microsoft revised the designation to critical and gave it a severity rating of 8.1, the same given to EternalBlue....

One potentially mitigating factor is that a patch for CVE-2022-37958 has been available for three months. EternalBlue, by contrast, was initially exploited by the NSA as a zero-day. The NSA's highly weaponized exploit was then released into the wild by a mysterious group calling itself Shadow Brokers. The leak, one of the worst in the history of the NSA, gave hackers around the world access to a potent nation-state-grade exploit. Palmiotti said there's reason for optimism but also for risk: "While EternalBlue was an 0-Day, luckily this is an N-Day with a 3 month patching lead time," said Palmiotti.

There's still some risk, Palmiotti tells Ars Technica. "As we've seen with other major vulnerabilities over the years, such as MS17-010 which was exploited with EternalBlue, some organizations have been slow deploying patches for several months or lack an accurate inventory of systems exposed to the internet and miss patching systems altogether."

Thanks to Slashdot reader joshuark for sharing the article.

Technology

Who Really Invented the Thumb Drive? (ieee.org) 134

IEEE Spectrum: In 2000, at a trade fair in Germany, an obscure Singapore company called Trek 2000 unveiled a solid-state memory chip encased in plastic and attached to a Universal Serial Bus (USB) connector. The gadget, roughly the size of a pack of chewing gum, held 8 megabytes of data and required no external power source, drawing power directly from a computer when connected. It was called the ThumbDrive. That device, now known by a variety of names -- including memory stick, USB stick, flash drive, as well as thumb drive -- changed the way computer files are stored and transferred. Today it is familiar worldwide. The thumb drive was an instant hit, garnering hundreds of orders for samples within hours. Later that year, Trek went public on the Singapore stock exchange, and in four months -- from April through July 2000 -- it manufactured and sold more than 100,000 ThumbDrives under its own label.

Before the invention of the thumb drive, computer users stored and transported their files using floppy disks. Developed by IBM in the 1960s, first 8-inch and later 5 1/4-inch and 3 1/2-inch floppy disks replaced cassette tapes as the most practical portable storage media. Floppy disks were limited by their relatively small storage capacity -- even double-sided, high-density 3 1/2-inch disks could store only 1.44 MB of data. During the 1990s, as the size of files and software increased, computer companies searched for alternatives. Personal computers in the late 1980s began incorporating CD-ROM drives, but initially these could read only from prerecorded disks and could not store user-generated data. The Iomega Zip Drive, called a "superfloppy" drive and introduced in 1994, eventually stored up to 750 MB of data and was writable, but it never gained widespread popularity, partly due to competition from cheaper and higher-capacity hard drives.

Computer users badly needed a cheap, high-capacity, reliable, portable storage device. The thumb drive was all that -- and more. It was small enough to slip in a front pocket or hang from a keychain, and durable enough to be rattled around in a drawer or tote without damage. With all these advantages, it effectively ended the era of the floppy disk. But Trek 2000 hardly became a household name. And the inventor of the thumb drive and Trek's CEO, Henn Tan, did not become as famous as other hardware pioneers like Robert Noyce, Douglas Engelbart, or Steve Jobs. Even in his home of Singapore, few people know of Tan or Trek. Why aren't they more famous? After all, mainstream companies including IBM, TEAC, Toshiba, and, ultimately, Verbatim licensed Trek's technology for their own memory stick devices. And a host of other companies just copied Tan without permission or acknowledgment.

IBM

IBM To Create 24-Core Power Chip So Customers Can Exploit Oracle Database License (theregister.com) 70

IBM has quietly announced it's planning a 24-core Power 10 processor, seemingly to make one of its servers capable of running Oracle's database in a cost-effective fashion. From a report: A hardware announcement dated December 13 revealed the chip in the following "statement of general direction" about Big Blue's Power S1014 technology-based server: "IBM intends to announce a high-density 24-core processor for the IBM Power S1014 system (MTM 9105-41B) to address application environments utilizing an Oracle Database with the Standard Edition 2 (SE2) licensing model. It intends to combine a robust compute throughput with the superior reliability and availability features of the IBM Power platform while complying with Oracle Database SE2 licensing guidelines."

Unix

OSnews Decries 'The Mass Extinction of Unix Workstations' (osnews.com) 284

Anyone remember the high-end commercial UNIX workstations from a few decades ago — from companies like IBM, DEC, SGI, and Sun Microsystems?

Today OSnews looked back — but also explored what happens when you try to buy one today: As x86 became ever more powerful and versatile, and with the rise of Linux as a capable UNIX replacement and the adoption of the NT-based versions of Windows, the days of the UNIX workstations were numbered. A few years into the new millennium, virtually all traditional UNIX vendors had ended production of their workstations and in some cases even their associated architectures, with a lacklustre collective effort to move over to Intel's Itanium — which didn't exactly go anywhere and is now nothing more than a sour footnote in computing history.

Approaching roughly 2010, all the UNIX workstations had disappeared.... and by now, they're all pretty much dead (save for Solaris). Users and industries moved on to x86 on the hardware side, and Linux, Windows, and in some cases, Mac OS X on the software side.... Over the past few years, I have come to learn that if you want to get into buying, using, and learning from UNIX workstations today, you'll run into various problems which can roughly be filed into three main categories: hardware availability, operating system availability, and third party software availability.

Their article details their own attempts to buy one over the years, ultimately concluding the experience "left me bitter and frustrated that so much knowledge — in the form of documentation, software, tutorials, drivers, and so on — is disappearing before our very eyes." Shortsightedness and disinterest in their own heritage by corporations, big and small, is destroying entire swaths of software, and as more years pass by, it will get ever harder to get any of these things back up and running.... As for all the third-party software — well, I'm afraid it's too late for that already. Chasing down the rightsholders is already an incredibly difficult task, and even if you do find them, they are probably not interested in helping you, and even if by some miracle they are, they most likely no longer even have the ability to generate the required licenses or release versions with the licensing ripped out. Stuff like Pro/ENGINEER and SoftWindows for UNIX are most likely gone forever....

Software is dying off at an alarming rate, and I fear there's no turning the tide of this mass extinction.

The article also wonders why companies like HPE don't just "dump some ISO files" onto an FTP server, along with patch depots and documentation. "This stuff has no commercial value, they're not losing any sales, and it will barely affect their bottom line."

Science

Physicists Use Google's Quantum Computer to Create Holographic Wormhole Between Black Holes (quantamagazine.org) 55

"In an experiment that ticks most of the mystery boxes in modern physics, a group of researchers announced Wednesday that they had simulated a pair of black holes in a quantum computer," reports the New York Times [alternate URL here. But in addition, the researchers also sent a message between their two black holes, the Times reports, "through a shortcut in space-time called a wormhole.

"Physicists described the achievement as another small step in the effort to understand the relation between gravity, which shapes the universe, and quantum mechanics, which governs the subatomic realm of particles....

Quanta magazine reports: The wormhole emerged like a hologram out of quantum bits of information, or "qubits," stored in tiny superconducting circuits. By manipulating the qubits, the physicists then sent information through the wormhole, they reported Wednesday in the journal Nature. The team, led by Maria Spiropulu of the California Institute of Technology, implemented the novel "wormhole teleportation protocol" using Google's quantum computer, a device called Sycamore housed at Google Quantum AI in Santa Barbara, California. With this first-of-its-kind "quantum gravity experiment on a chip," as Spiropulu described it, she and her team beat a competing group of physicists who aim to do wormhole teleportation with IBM and Quantinuum's quantum computers.

When Spiropulu saw the key signature indicating that qubits were passing through the wormhole, she said, "I was shaken."

The experiment can be seen as evidence for the holographic principle, a sweeping hypothesis about how the two pillars of fundamental physics, quantum mechanics and general relativity, fit together.... The holographic principle, ascendant since the 1990s, posits a mathematical equivalence or "duality" between the two frameworks. It says the bendy space-time continuum described by general relativity is really a quantum system of particles in disguise. Space-time and gravity emerge from quantum effects much as a 3D hologram projects out of a 2D pattern. Indeed, the new experiment confirms that quantum effects, of the type that we can control in a quantum computer, can give rise to a phenomenon that we expect to see in relativity — a wormhole....

To be clear, unlike an ordinary hologram, the wormhole isn't something we can see. While it can be considered "a filament of real space-time," according to co-author Daniel Jafferis of Harvard University, lead developer of the wormhole teleportation protocol, it's not part of the same reality that we and the Sycamore computer inhabit. The holographic principle says that the two realities — the one with the wormhole and the one with the qubits — are alternate versions of the same physics, but how to conceptualize this kind of duality remains mysterious. Opinions will differ about the fundamental implications of the result. Crucially, the holographic wormhole in the experiment consists of a different kind of space-time than the space-time of our own universe. It's debatable whether the experiment furthers the hypothesis that the space-time we inhabit is also holographic, patterned by quantum bits.

"I think it is true that gravity in our universe is emergent from some quantum [bits] in the same way that this little baby one-dimensional wormhole is emergent" from the Sycamore chip, Jafferis said. "Of course we don't know that for sure. We're trying to understand it."

Here's how principal investigator Spiropulu summarizes their experiment. "We found a quantum system that exhibits key properties of a gravitational wormhole yet is sufficiently small to implement on today's quantum hardware."

Hardware

PCI Standards Group Deflects, Assigns Blame for Melting GPU Power Connectors (arstechnica.com) 130

An anonymous reader shares a report: Nvidia's new RTX 4090 and 4080 GPUs both use a new connector called 12VHPWR to deliver power as a way to satisfy ever-more power-hungry graphics cards without needing to set aside the physical space required for three or four 8-pin power connectors. But that power connector and its specifications weren't created by Nvidia alone -- to ensure interoperability, the spec was developed jointly by the PCI Express Special Interest Group (PCI-SIG), a body that includes Nvidia, AMD, Intel, Arm, IBM, Qualcomm, and others.

But the overheating and melting issues experienced by some RTX 4090 owners recently have apparently prompted the PCI-SIG to clarify exactly which parts of the spec it is and is not responsible for. In a statement reported by Tom's Hardware, the group sent its members a reminder that they, not the PCI-SIG, were responsible for safety testing products using connector specs like 12VHPWR. "Members are reminded that PCI-SIG specifications provide necessary technical information for interoperability and do not attempt to address proper design, manufacturing methods, materials, safety testing, safety tolerances, or workmanship," the statement reads. "When implementing a PCI-SIG specification, Members are responsible for the design, manufacturing, and testing, including safety testing, of their products."

IBM

IBM and Maersk Abandon Ship on TradeLens Logistics Blockchain (coindesk.com) 28

Maersk and IBM will wind down their shipping blockchain TradeLens by early 2023, ending the pair's five-year project to improve global trade by connecting supply chains on a permissioned blockchain. From a report: TradeLens emerged during the "enterprise blockchain" era of 2018 as a high-flying effort to make inter-corporate trade more efficient. Open to shipping and freight operators, its members could validate the transaction of goods as recorded on a transparent digital ledger.

The idea was to save its member-shipping companies money by connecting their world. But the network was only as strong as its participants; despite some early wins, TradeLens ultimately failed to catch on with a critical mass of its target industry. "TradeLens has not reached the level of commercial viability necessary to continue work and meet the financial expectations as an independent business," Maersk Head of Business Platforms Rotem Hershko said in a statement.

Software

Frederick P. Brooks Jr., Computer Design Innovator, Dies at 91 16

Frederick P. Brooks Jr., whose innovative work in computer design and software engineering helped shape the field of computer science, died on Thursday at his home in Chapel Hill, N.C. He was 91. His death was confirmed by his son, Roger, who said Dr. Brooks had been in declining health since having a stroke two years ago. The New York Times reports: Dr. Brooks had a wide-ranging career that included creating the computer science department at the University of North Carolina and leading influential research in computer graphics and virtual reality. But he is best known for being one of the technical leaders of IBM's 360 computer project in the 1960s. At a time when smaller rivals like Burroughs, Univac and NCR were making inroads, it was a hugely ambitious undertaking. Fortune magazine, in an article with the headline "IBM's $5,000,000,000 Gamble," described it as a "bet the company" venture.

Until the 360, each model of computer had its own bespoke hardware design. That required engineers to overhaul their software programs to run on every new machine that was introduced. But IBM promised to eliminate that costly, repetitive labor with an approach championed by Dr. Brooks, a young engineering star at the company, and a few colleagues. In April 1964, IBM announced the 360 as a family of six compatible computers. Programs written for one 360 model could run on the others, without the need to rewrite software, as customers moved from smaller to larger computers. The shared design across several machines was described in a paper, written by Dr. Brooks and his colleagues Gene Amdahl and Gerrit Blaauw, titled "Architecture of the IBM System/360." "That was a breakthrough in computer architecture that Fred Brooks led," Richard Sites, a computer designer who studied under Dr. Brooks, said in an interview.

But there was a problem. The software needed to deliver on the IBM promise of compatibility across machines and the capability to run multiple programs at once was not ready, as it proved to be a far more daunting challenge than anticipated. Operating system software is often described as the command and control system of a computer. The OS/360 was a forerunner of Microsoft's Windows, Apple's iOS and Google's Android. At the time IBM made the 360 announcement, Dr. Brooks was just 33 and headed for academia. He had agreed to return to North Carolina, where he grew up, and start a computer science department at Chapel Hill. But Thomas Watson Jr., the president of IBM, asked him to stay on for another year to tackle the company's software troubles. Dr. Brooks agreed, and eventually the OS/360 problems were sorted out. The 360 project turned out to be an enormous success, cementing the company's dominance of the computer market into the 1980s.
"Fred Brooks was a brilliant scientist who changed computing," Arvind Krishna, IBM's chief executive and himself a computer scientist, said in a statement. "We are indebted to him for his pioneering contributions to the industry."

Dr. Brooks published a book in 1975 titled, "The Mythical Man-Month: Essays on Software Engineering." It was "a quirky classic, selling briskly year after year and routinely cited as gospel by computer scientists," reports the Times.

IBM

IBM Sues Micro Focus, Claims It Copied Big Blue Mainframe Software (theregister.com) 43

IBM has filed a lawsuit against Micro Focus, alleging the enterprise software company copied and reverse-engineered its CICS mainframe service to develop a rival product, the Micro Focus Enterprise Server. From a report: Big Blue has brought the case in the US District Court in New York, citing violation of copyright law and claiming that Micro Focus was in "blatant breach" of its contractual obligations with IBM. In a strongly worded complaint, the company accused UK-based Micro Focus of "brazen theft" of IBM software and said the suit was filed to "protect [its] valuable intellectual property." IBM is seeking compensation as well as an injunction against Micro Focus that would prohibit the company from distributing the products Big Blue labels as "derivative works" it claims are based upon IBM's own computer software.

Businesses

Is Quantum Computing Moving from Theoretical to Startups? (msn.com) 38

The Boston Globe reports that "More money is starting to flow into the nascent field of quantum computing in Boston, turning academic research at MIT and Harvard labs into startups."

In September, Northeastern University announced it will build a $10 million lab at its Burlington campus to explore applications for quantum technology, and to train students to work with it. And companies based in other countries are setting up outposts here to hire quantum-savvy techies....

"It's still pretty early" for quantum computing, says Russ Wilcox, a partner at the venture capital firm Pillar. "But a number of companies are starting to experiment to learn how to make use of it. The key factor is that the field is progressing at an exponential rate." In 2018, his firm made an early investment in Zapata Computing, a Boston startup building software for quantum computers and selling services — including ways to analyze the new cybersecurity risks that a powerful new class of computers could introduce....

In the current fiscal year, the federal government budgeted about $900 million to advance the field of quantum information science, which includes quantum computing....

[S]everal local venture capital firms are getting comfortable with placing bets on the quantum computing sector. Glasswing's Rudina Seseri says that her firm is "seeing momentum pick up," although the sector is "still in the warm-up phase, not yet in the first inning." But some of the technology being developed by startups, she says, "is so meaningful that if they get the technology to work at scale, they will be incredibly valuable."

That said, much of the revenue available to these companies today comes from researchers in academic and corporate labs trying to understand the potential of quantum computers. Sam Liss, an executive director in Harvard's Office of Technology Development, thinks that "the large commercial opportunities for quantum are still a long way off." The OTD helps attract corporate funding to Harvard research labs, and also helps to license technologies created in those labs to the private sector. "Technologies have a way of getting oversold and overhyped," Liss says. "We all recognize that this is going to take some time."

Large companies like Amazon, Google, and IBM are trying to move the field forward, and startups are beginning to demonstrate their new approaches. In the startup realm, Liss says, we're seeing enough new companies being formed and attracting funding "to support a thesis that this will be a big thing."

Supercomputing

IBM Unveils Its 433 Qubit Osprey Quantum Computer (techcrunch.com) 29

An anonymous reader quotes a report from TechCrunch: IBM wants to scale up its quantum computers to over 4,000 qubits by 2025 -- but we're not quite there yet. For now, we have to make do with significantly smaller systems and today, IBM announced the launch of its Osprey quantum processor, which features 433 qubits, up from the 127 qubits of its 2021 Eagle processor. And with that, the slow but steady march toward a quantum processor with real-world applications continues.

IBM's quantum roadmap includes two additional stages -- the 1,121-qubit Condor and 1,386-qubit Flamingo processors in 2023 and 2024 -- before it plans to hit the 4,000-qubit stage with its Kookaburra processor in 2025. So far, the company has generally been able to make this roadmap work, but the number of qubits in a quantum processor is obviously only one part of a very large and complex puzzle, with longer coherence times and reduced noise being just as important.

The company also today detailed (Link: YouTube) its Quantum System Two -- basically IBM's quantum mainframe -- which will be able to house multiple quantum processors and integrate them into a single system with high-speed communication links. The idea here is to launch this system by the end of 2023.
"The new 433 qubit 'Osprey' processor brings us a step closer to the point where quantum computers will be used to tackle previously unsolvable problems," said Dario Gil, senior vice president, IBM and director of Research. "We are continuously scaling up and advancing our quantum technology across hardware, software and classical integration to meet the biggest challenges of our time, in conjunction with our partners and clients worldwide. This work will prove foundational for the coming era of quantum-centric supercomputing."

Further reading: IBM Held Talks With Biden Administration on Quantum Controls

IBM

IBM Held Talks With Biden Administration on Quantum Controls (bloomberg.com) 17

IBM has engaged in talks with the Biden administration on potential export controls for quantum computers as the company continues investing in the emerging technology. From a report: IBM recommended that any regulations, if developed, cover potentially problematic uses of quantum computing rather than limiting the technology based simply on processing power, said Dario Gil, head of IBM Research. Quantum technology will likely be subject to constraints like export controls, Gil said. "We will continue to be an active participant in that dialogue," he said.

Quantum computing is an experimental field with the potential to accelerate processing power and upend current cybersecurity standards. The Biden administration is exploring the possibility of new export controls that would limit China's access to quantum along with other powerful emerging technologies, Bloomberg News reported last month. IBM has installed quantum infrastructure in countries like Germany and Japan, but not China, Gil said. Big Blue has invested millions in the field, and is unveiling a new quantum processor this week that is more than three times more powerful, measured by qubits, than its version announced last year.

Open Source

Google Announces GUAC Open-Source Project On Software Supply Chains (therecord.media) 2

Google unveiled a new open source security project on Thursday centered around software supply chain management. The Record reports: Given the acronym GUAC -- which stands for Graph for Understanding Artifact Composition -- the project is focused on creating sets of data about a software's build, security and dependency. Google worked with Purdue University, Citibank and supply chain security company Kusari on GUAC, a free tool built to bring together many different sources of software security metadata. Google has also assembled a group of technical advisory members to help with the project -- including IBM, Intel, Anchore and more.

Google's Brandon Lum, Mihai Maruseac, and Isaac Hepworth pitched the effort as one way to help address the explosion in software supply chain attacks -- most notably the widespread Log4j vulnerability that is still leaving organizations across the world exposed to attacks. "GUAC addresses a need created by the burgeoning efforts across the ecosystem to generate software build, security, and dependency metadata," they wrote in a blog post. "GUAC is meant to democratize the availability of this security information by making it freely accessible and useful for every organization, not just those with enterprise-scale security and IT funding."

Google shared a proof of concept of the project, which allows users to search data sets of software metadata. The three explained that GUAC effectively aggregates software security metadata into a database and makes it searchable. They used the example of a CISO or compliance officer who needs to understand the "blast radius" of a vulnerability. GUAC would allow them to "trace the relationship between a component and everything else in the portfolio." Google says the tool will allow anyone to figure out the most used critical components in their software supply chain ecosystem, the security weak points and any risky dependencies. As the project evolves, Maruseac, Lum and Hepworth said the next part of the work will center around scaling the project and adding new kinds of documents that can be submitted and ingested by the system.
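
To make the "blast radius" idea concrete, here is a small hypothetical sketch -- not GUAC's actual API or data model, and the package names are invented -- showing that once dependency metadata from many sources has been aggregated into a single graph, answering "what else is affected if this component turns out to be vulnerable?" is just a walk over the reversed edges.

    from collections import defaultdict

    # Toy aggregated metadata: each component and what it depends on.
    depends_on = {
        "payments-service": ["spring-core", "log4j-core"],
        "web-frontend":     ["react", "analytics-lib"],
        "analytics-lib":    ["log4j-core"],
        "batch-jobs":       ["analytics-lib"],
    }

    # Invert the graph so we can ask: who pulls this dependency in?
    dependents = defaultdict(set)
    for component, deps in depends_on.items():
        for dep in deps:
            dependents[dep].add(component)

    def blast_radius(vulnerable):
        # Everything that directly or transitively depends on `vulnerable`.
        affected, stack = set(), [vulnerable]
        while stack:
            for parent in dependents.get(stack.pop(), ()):
                if parent not in affected:
                    affected.add(parent)
                    stack.append(parent)
        return affected

    print(sorted(blast_radius("log4j-core")))
    # ['analytics-lib', 'batch-jobs', 'payments-service', 'web-frontend']

The point of the real project, per the announcement, is to do this at ecosystem scale over many sources of build, security, and dependency metadata rather than over a hand-written dictionary.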

Amiga

Ask Slashdot: What Was Your First Computer? 523

Long-time Slashdot reader destinyland writes: Today GitHub's official Twitter account asked the ultimate geek-friendly question. "You never forget your first computer. What was yours?"

And within 10 hours they'd gotten 2,700 responses.

Commodore 64, TRS-80, Atari 800, Compaq Presario... People posted names you haven't heard in years, like they were sharing memories of old friends. Gateway 2000, Sony VAIO, Vic-20, Packard Bell... One person just remembered they'd had "some sort of PC that had an orange and black screen with text and QBasic. It couldn't do much more than store recipes and play text based games."

And other memories started to flow. ("Jammed on Commander Keen & Island of Dr. Brain..." "Dammit that Doom game was amazing, can't forget Oregon Trail...")

Sharp PC-4500, Toshiba T3200, Timex Sinclair 1000, NEC PC-8801. Another's first computer was "A really really old HP laptop that has a broken battery!"

My first computer was an IBM PS/2. It had a 2400 baud internal modem. Though in those long-ago days before local internet services, it was really only good for dialing up BBS's. I played chess against a program on a floppy disk that I got from a guy from work.

Can you still remember yours? Share your best memories in the comments.

What was your first computer?

China

Star American Professor Masterminded a Surveillance Machine For Chinese Big Tech (thedailybeast.com) 26

An anonymous reader quotes a report from The Daily Beast: A star University of Maryland (UMD) professor built a machine-learning software "useful for surveillance" as part of a six-figure research grant from Chinese tech giant Alibaba, raising concerns that an American public university directly contributed to China's surveillance state. Alibaba provided $125,000 in funding to a research team led by Dinesh Manocha, a professor of computer science at UMD College Park, to develop an urban surveillance software that can "classify the personality of each pedestrian and identify other biometric features," according to research grant documents obtained via public records request. "These capabilities will be used to predict the behavior of each pedestrian and are useful for surveillance," the document read.

Manocha is a decorated scholar in the AI and robotics field who has earned awards and accolades from Google, IBM, and many others. His star status brings rewards: Maryland taxpayers paid $355,000 in salaries to the professor in 2021, according to government watchdog Open the Books. The U.S. military also provides lavish funding for the professor's research, signing a $68 million agreement with Manocha's lab to research military applications of AI technologies. But Maryland taxpayers and the U.S. military are not the only ones funding Manocha's research. In January 2018, the University of Maryland and Alibaba signed an 18-month research contract funding Manocha's research team. In the grant document obtained by The Daily Beast, Manocha's team pledged to "work closely with Alibaba researchers" to develop an urban surveillance software that can identify pedestrians based on their unique gait signatures. The algorithm would then use the gait signatures to classify pedestrians as "aggressive," "shy," "impulsive," and other personalities. The grant required UMD researchers to test the algorithm on videos provided by Alibaba and present their findings in person at Alibaba labs in China. The scholars also had to provide the C++ codebase for the software and the raw dataset as deliverables to Alibaba. The software's "clear implication is to proactively predict demonstrations and protests so that they might be quelled," Fedasiuk told The Daily Beast. "Given what we know now about China's architecture of repression in Xinjiang and other regions, it is clear Dr. Manocha should not have pitched this project, and administrators at UMD should not have signed off on it."

It's not just Alibaba that was interested in the professor's expertise. In January 2019 -- back when the Alibaba grant was still active -- Manocha secured a taxpayer-funded, $321,000 Defense Department grant for his research team. The two grants funded very similar research projects. The Alibaba award was titled "large-scale behavioral learning for dense crowds." Meanwhile, the DoD grant funded research into "efficient computational models for simulating large-scale heterogeneous crowds." Unsurprisingly, the research outputs produced by the two grants had significant overlap. Between 2019 and 2021, Manocha published multiple articles in the AI and machine-learning field that cited both the Alibaba and DoD grant. There is no evidence that Manocha broke the law by double-dipping from U.S. and Chinese funding sources to fund similar research projects. Nevertheless, the case still raises "serious questions about ethics in machine learning research," Fedasiuk said.
