China

Chinese Billionaire Jack Ma Says the US Wasted Trillions on Warfare Instead of Investing in Infrastructure (cnbc.com) 594

Alibaba founder Jack Ma fired a shot at the United States in an interview at the World Economic Forum in Davos, Switzerland. An anonymous reader shares a report: Ma was asked by CNBC's Andrew Ross Sorkin about the U.S. economy in relation to China, since President-elect Donald Trump has been talking about imposing new tariffs on Chinese imports. Ma said blaming China for any economic issues in the U.S. is misguided. If America is looking to blame anyone, Ma said, it should blame itself. "It's not that other countries steal jobs from you guys," Ma said. "It's your strategy. Distribute the money and things in a proper way." He said the U.S. has wasted over $14 trillion fighting wars over the past 30 years rather than investing in infrastructure at home.

To be sure, Ma is not the only critic of the costly U.S. policies of waging war against terrorism and other enemies outside the homeland. Still, Ma said this was the reason America's economic growth had weakened, not China's supposed theft of jobs. In fact, Ma called outsourcing a "wonderful" and "perfect" strategy. "The American multinational companies made millions and millions of dollars from globalization," Ma said. "The past 30 years, IBM, Cisco, Microsoft, they've made tens of millions -- the profits they've made are much more than the four Chinese banks put together. ... But where did the money go?"

Amiga

Was Commodore's Amiga 'A Computer Ahead of Its Time'? (gizmodo.com.au) 418

Long-time Slashdot reader Mike Bouma quotes Gizmodo: Despite being ahead of its time when it was unveiled in 1985, the Commodore Amiga didn't survive past 1996. The machine, which went up against the likes of the IBM PC and the Macintosh, offered hardware far superior to its competitors'. But it just wasn't enough, as this video from Ahoy's Stuart Brown explains. While the Amiga had other 16-bit computers beat on technology, it didn't really have anything compelling to do with that hardware. "With 4096 colours, 4 channels of digital audio, and preemptive multitasking, [the Amiga] was capable of incredible things for the time...."

[U]nfortunately, internal struggles within Commodore would signal the beginning of the end.

I'll always remember Joel Hodgson's Amiga joke on a 1991 episode of Mystery Science Theater 3000. But in 2015 Geek.com reported on an Amiga that had been running a school's heating system for the previous 30 years. A local high school student had originally set it up, and "he's the only one who knows how to fix software glitches. Luckily, he still lives in the area."

Leave your own thoughts in the comments. Does anyone else have their own stories about Commodore's Amiga? And was the Amiga a computer ahead of its time?
Java

OpenJDK Bug Report Complains Source Code 'Has Too Many Swear Words' (java.net) 281

Thursday a bug report complained that the source code for OpenJDK, the free and open-source implementation of Java, "has too many swear words." An anonymous reader writes: "There are many instances of swear words inside OpenJDK jdk/jdk source, scattered all over the place," reads the bug report. "As OpenJDK is used in a professional context, it seems inappropriate to leave these 12 instances in there, so here's a changeset to remove them."
IBM software developer (and OpenJDK team member and contributor) Adam Farley responded that "after discussion with the community, three determinations were reached":
  • "Damn" and "Crap" are not swear words.
  • Three of the four f-bombs are located in jszip.js, which should be corrected upstream (will follow up).
  • The f-bomb in BitArray.java, as well as the rude typo in SoftChannel.java, *are* swear words and should be removed to resolve this work item.

He promised a new webrev would be uploaded to reflect these determinations, and the bug has been marked as "resolved."


Businesses

After 23 Years, IBM Sells Off Lotus Notes (techcrunch.com) 105

"IBM has agreed to sell select software products to HCL Technologies," writes Slashdot reader virtig01. "Included among these is everyone's favorite email and calendaring tool, Lotus Notes and Domino." TechCrunch reports: IBM paid $3.5 billion for Lotus back in the day. The big pieces here are Lotus Notes, Domino and Portal. These were a big part of IBM's enterprise business for a long time, but last year Big Blue began to pull away, selling the development part to HCL, while maintaining control of sales and marketing. This announcement marks the end of the line for IBM involvement. With the development of the platform out of its control, and in need of cash after spending $34 billion for Red Hat, perhaps IBM simply decided it no longer made sense to keep any part of this in-house. As for HCL, it sees an opportunity to continue to build the Notes/Domino business. "The large-scale deployments of these products provide us with a great opportunity to reach and serve thousands of global enterprises across a wide range of industries and markets," C Vijayakumar, president and CEO at HCL Technologies, said in a statement announcing the deal.
Software

DeepMind Produces a General-Purpose Game-Playing System, Capable of Mastering Games Like Chess and Go Without Human Help (ieee.org) 124

DeepMind has created a system that can quickly master any game in the class that includes chess, Go, and Shogi, and do so without human guidance. "The system, called AlphaZero, began its life last year by beating a DeepMind system that had been specialized just for Go," reports IEEE Spectrum. "That earlier system had itself made history by beating one of the world's best Go players, but it needed human help to get through a months-long course of improvement. AlphaZero trained itself -- in just 3 days." From the report: The research, published today in the journal Science, was performed by a team led by DeepMind's David Silver. The paper was accompanied by a commentary by Murray Campbell, an AI researcher at the IBM Thomas J. Watson Research Center in Yorktown Heights, N.Y. AlphaZero can crack any game that provides all the information that's relevant to decision-making; the new generation of games to which Campbell alludes do not. Poker furnishes a good example of such games of "imperfect" information: Players can hold their cards close to their chests. Other examples include many multiplayer games, such as StarCraft II, Dota, and Minecraft. But they may not pose a worthy challenge for long.

DeepMind developed the self-training method, called deep reinforcement learning, specifically to attack Go. Today's announcement that they've generalized it to other games means they were able to find tricks to preserve its playing strength after giving up certain advantages peculiar to playing Go. The biggest such advantage was the symmetry of the Go board, which allowed the specialized machine to calculate more possibilities by treating many of them as mirror images. The researchers have so far unleashed their creation only on Go, chess and Shogi, a Japanese form of chess. Go and Shogi are astronomically complex, and that's why both games long resisted the "brute-force" algorithms that the IBM team used against Kasparov two decades ago.
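DeepMind's actual system pairs a deep neural network with Monte Carlo tree search, but the self-play idea at its core is easy to sketch. Below is a toy illustration in Python, a hedged sketch rather than AlphaZero's architecture: a tabular learner that masters the simple game of Nim with no human guidance, purely by playing against itself and crediting wins and losses backward through each finished game.

```python
# Toy illustration of the self-play idea behind AlphaZero, stripped to the
# bone: a tabular value function for the game of Nim (take 1 or 2 stones;
# whoever takes the last stone wins). This is NOT DeepMind's system -- no
# neural network, no Monte Carlo tree search -- just the core loop: play
# against yourself, then update your own evaluations from the outcome.
import random

values = {}           # stones-remaining -> estimated value for player to move
EPSILON, ALPHA = 0.1, 0.5

def legal_moves(stones):
    return [m for m in (1, 2) if m <= stones]

def play_one_game(stones=21):
    history = []                          # states seen, mover's perspective
    while stones > 0:
        history.append(stones)
        if random.random() < EPSILON:     # explore
            m = random.choice(legal_moves(stones))
        else:                             # exploit: leave opponent the worst state
            m = min(legal_moves(stones),
                    key=lambda m: values.get(stones - m, 0.0))
        stones -= m
    # The player who made the final move took the last stone and won;
    # walk back through the game, alternating win/loss credit.
    outcome = 1.0
    for s in reversed(history):
        values[s] = values.get(s, 0.0) + ALPHA * (outcome - values.get(s, 0.0))
        outcome = -outcome

for _ in range(20000):
    play_one_game()

# Nim theory says multiples of 3 are losses for the player to move, and
# the self-taught values agree: ~-1 at 3, 6, 9, ... and ~+1 elsewhere.
print({s: round(v, 2) for s, v in sorted(values.items())})
```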

Security

Researchers Discover SplitSpectre, a New Spectre-like CPU Attack (zdnet.com) 48

An anonymous reader shares a report from ZDNet: Three academics from Northeastern University and three researchers from IBM Research have discovered a new variation of the Spectre CPU vulnerability that can be exploited via browser-based code. The vulnerability, which researchers codenamed SplitSpectre, is a variation of the original Spectre v1 vulnerability discovered last year, which became public in January 2018. The difference in SplitSpectre is not in what parts of a CPU's microarchitecture the flaw targets, but how the attack is carried out. Researchers say a SplitSpectre attack is both faster and easier to execute, improving an attacker's ability to recover data from targeted CPUs. The research team says they were successfully able to carry out a SplitSpectre attack against Intel Haswell and Skylake CPUs, and AMD Ryzen processors, via SpiderMonkey 52.7.4, Firefox's JavaScript engine. The good news is that existing Spectre mitigations would thwart the SplitSpectre attacks.
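For readers who don't follow the Spectre family closely, the "gadget" that Spectre v1 abuses is a bounds check followed by a dependent array access. The sketch below transliterates that pattern into Python purely for readability; real exploitation requires native code (speculative execution cannot be triggered from Python) plus a cache-timing side channel to recover the leaked byte.

```python
# The code shape that Spectre v1 (and thus SplitSpectre) abuses,
# transliterated to Python for readability only. Speculative execution
# cannot be triggered from Python; in a real attack this pattern lives in
# native code, and a cache-timing side channel (e.g., Flush+Reload) is
# needed afterward to recover the leaked byte.
array1 = bytearray(16)           # attacker aims to read memory beyond this
array2 = bytearray(256 * 512)    # probe array: one cache line per byte value

def victim(x: int) -> None:
    # The bounds check below is what the CPU speculates past: while the
    # comparison is still resolving, an out-of-range x lets transient
    # instructions read out-of-bounds memory and leave a cache footprint
    # via the dependent array2 access.
    if x < len(array1):
        _ = array2[array1[x] * 512]

victim(3)   # an in-bounds call; architecturally nothing leaks here
```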
AI

IBM Aims To Meld AI With Human Resources With Watson Suite (zdnet.com) 74

PolygamousRanchKid shares a report from ZDNet (with some commentary): IBM has launched a unit designed for human resources to better find talent and recruit using artificial intelligence. The company's latest HR effort, dubbed IBM Talent & Transformation, includes select Watson services. According to IBM, its suite of AI tools can help HR become a growth engine to enable digital transformation. AI can be used to revamp workflow, employee engagement, recruitment and retention while providing a more diverse workforce. (I can still program Fortran; I learned it from Forman S. Acton -- does that make me diverse enough?) Big Blue's Talent & Transformation suite includes a Watson Talent Suite that rolls up behavioral science, AI and psychology and applies it to HR. (Sounds like the recipe for The Apocalypse to me.) IBM Garage, which serves as a test bed to meld HR, AI and culture, will also be available. (Garage? It sounds like the creepy CRISPR basement of a mad scientist to me.)
IBM

IBM CEO Joins Apple In Blasting Data Use By Silicon Valley Firms (bloomberg.com) 63

IBM CEO Ginni Rometty joined a growing chorus of tech executives lambasting web platforms, like Google and Facebook, over their collection of user data and urged governments to target regulation at those companies. Bloomberg reports: Without naming company names, Rometty pointed to the "irresponsible handling of personal data by a few dominant consumer-facing platform companies" as the cause of a "trust crisis" between users and tech companies, according to an advance copy of her remarks. Rometty's comments, given at a Brussels event with top EU officials Monday, echoed recent statements by Apple CEO Tim Cook, who in October slammed Silicon Valley rivals over their use of data, equating their services to "surveillance."

IBM meanwhile has seen revenue decline since Rometty took the CEO role in 2012, largely due to falling sales in existing hardware, software and services offerings. She has since been trying to steer IBM toward more modern businesses, such as the cloud, artificial intelligence, and security software. Seeking to separate IBM -- which operates primarily at a business-to-business level -- from the troubled tech companies, Rometty said governments should target regulation at consumer-facing web platforms, like social media firms and search engines. In particular, Rometty pushed for more measures around the transparency of artificial intelligence as well as controversial rules around platform liability.

IBM

IBM: Chip Making Is Hitting Its Limits, But Our Techniques Could Solve That (zdnet.com) 50

IBM has devised materials and processes that could help improve the efficiency of chip production at the 7nm node and beyond. From a report: The company's researchers are working on challenges in the emerging field of 'area-selective deposition', a technology that could help overcome limitations on lithographic techniques to create patterns on silicon in 7nm processes. Semi Engineering has a neat account of lithographic patterning and why at 7nm there's growing interest in area-selective deposition. Techniques such as 'multiple patterning' helped ensure integrated circuits kept scaling, but as chips have shrunk from 28nm to 7nm processes, chipmakers have needed to process more layers with ever-smaller features that need more precise placement on patterns. Those features need to align between layers. When they don't, it leads to 'edge placement error' (EPE), a challenge that Intel lithography expert Yan Borodovsky believed lithography couldn't solve and which would ultimately impede Moore's Law.
Science

Is Quantum Computing Impossible? (ieee.org) 222

"Quantum computing is complex and it's not all it's cracked up to be," writes Slashdot reader nickwinlund77, pointing to this new article from IEEE Spectrum arguing it's "not in our foreseeable future": Having spent decades conducting research in quantum and condensed-matter physics, I've developed my very pessimistic view. It's based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.... Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2**1,000, which is to say about 10**300. That's a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe. To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe. At this point in a description of a possible future technology, a hardheaded engineer loses interest....

[I]t's absolutely unimaginable how to keep errors under control for the 10**300 continuous parameters that must be processed by a useful quantum computer. Yet quantum-computing theorists have succeeded in convincing the general public that this is feasible.... Even without considering these impossibly large numbers, it's sobering that no one has yet figured out how to combine many physical qubits into a smaller number of logical qubits that can compute something useful. And it's not like this hasn't long been a key goal.... On the hardware front, advanced research is under way, with a 49-qubit chip (Intel), a 50-qubit chip (IBM), and a 72-qubit chip (Google) having recently been fabricated and studied. The eventual outcome of this activity is not entirely clear, especially because these companies have not revealed the details of their work...

I believe that, appearances to the contrary, the quantum computing fervor is nearing its end. That's because a few decades is the maximum lifetime of any big bubble in technology or science. After a certain period, too many unfulfilled promises have been made, and anyone who has been following the topic starts to get annoyed by further announcements of impending breakthroughs. What's more, by that time all the tenured faculty positions in the field are already occupied. The proponents have grown older and less zealous, while the younger generation seeks something completely new and more likely to succeed.

He advises quantum computing researchers to follow the advice of IBM physicist Rolf Landauer. Decades ago Landauer warned quantum computing's proponents that they needed a disclaimer in all of their publications.

"This scheme, like all other schemes for quantum computation, relies on speculative technology, does not in its current form take into account all possible sources of noise, unreliability and manufacturing error, and probably will not work."
Hardware

Bill Godbout, Early S-100 Bus Pioneer, Perished In the Camp Wildfire (vcfed.org) 124

evanak writes: Bill Godbout was one of the earliest and most influential supporters of the S-100 bus in the mid-1970s. He passed away last week due to the Camp wildfire in Concow, California, according to a Vintage Computer Federation blog post. More than 50 other people also died in the fires, but chances are Mr. Godbout was the only one with a license to fly blimps. "Godbout was born October 2, 1939," the blog post reads. "He talked about his introduction to computing in an interview with InfoWorld magazine for their February 18, 1980 issue. 'My first job out of college was with IBM. I served a big-system apprenticeship there, but I think the thing that really triggered [my interest] was the introduction of the 8008 by Intel,' he said. 'I was fascinated that you could have that kind of capability in a little 18-pin package.'"

Godbout's family has set up a GoFundMe campaign to support their needs in this difficult time.
China

US Overtakes China in Top Supercomputer List (bbc.com) 74

China has been pushed into third place on a list of the world's most powerful supercomputers. From a report: The latest list by Top 500, published twice a year, puts two US machines -- Summit and Sierra -- in the top two places. The US has five entries in the top 10, with other entries from Switzerland, Germany and Japan. However, overall China has 227 machines in the top 500, while the US has 109. Summit can process 200,000 trillion calculations per second. Both Summit and Sierra were built by the tech giant IBM. China's Sunway TaihuLight supercomputer, which this time last year was the world's most powerful machine, is now ranked at number three, while the country also has the fourth spot in the list.
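For context, that headline figure converts directly into the unit the Top 500 list itself reports:

```python
# "200,000 trillion calculations per second," converted to the unit the
# Top 500 list reports:
calcs_per_sec = 200_000 * 1e12    # 2e17 flop/s
print(calcs_per_sec / 1e15)       # 200.0 petaflops
```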
IBM

What Does It Take To Keep a Classic IBM 1401 Mainframe Alive? (ieee.org) 60

"Think your vintage computer hardware is old?" writes long-time Slashdot reader corrosive_nf. "Ken Shirriff, Robert Garne, and their associates probably have you beat.

"The IBM 1401 was introduced in 1959, and these guys are keeping one alive in a computer museum... [T]he volunteers have to go digging through historical archives and do some detective work to figure out solutions to pretty much anything!" Many things that we take for granted are done very differently in old computers. For instance, the IBM 1401 uses 6-bit characters, not bytes. It used decimal memory addressing, not binary. It's also interesting how much people could accomplish with limited resources, running a Fortran compiler on the 1401 with just 8K of memory. Finally, working on the 1401 has given them a deeper understanding of how computers really work. It's not a black box; you can see the individual transistors that are performing operations and each ferrite core that stores a bit.
"It's a way of keeping history alive," says one of the volunteers at Silicon Valley's Computer History museum. "For museum visitors, seeing the IBM 1401 in operation gives them a feeling for what computers were like in the 1960s, the full experience of punching data onto cards and then seeing and hearing the system processing cards....

"So far, things are breaking slowly enough that we can keep up, so it's more of a challenge than an annoyance."
Open Source

'Open Source Creators: Red Hat Got $34 Billion and You Got $0. Here's Why.' (tidelift.com) 236

Donald Fischer, who served as a product manager for Red Hat Enterprise Linux during its creation and early years of growth, writes: Red Hat saw, earlier than most, that the ascendance of open source made the need to pay for code go away, but the need for support and maintenance grew larger than ever. Thus Red Hat was never in the business of selling software; rather, it was in the business of addressing the practical challenges that have always come along for the ride with software. [...] As an open source developer, you created that software. You can keep your package secure, legally documented, and maintained; who could possibly do it better? So why does Red Hat make the fat profits, and not you? Unfortunately, doing business with large companies requires a lot of bureaucratic toil. That's doubly true for organizations that require security, legal, and operational standards for every product they bring in the door. Working with these organizations requires a sales and marketing team, a customer support organization, a finance back-office, and lots of other "business stuff" in addition to technology. Red Hat has had that stuff, but you haven't.

And just like you don't have time to sell to large companies, they don't have time to buy from you alongside a thousand other open source creators, one at a time. Sure, big companies know how to install and use your software. (And good news! They already do.) But they can't afford to put each of 1100 npm packages through a procurement process that costs $20k per iteration. Red Hat solved this problem for one corner of open source by collecting 2,000+ open source projects together, adding assurances on top, and selling it as one subscription product. That worked for them, to the tune of billions. But did you get paid for your contributions?
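The arithmetic implied by that claim makes the point starkly:

```python
# The total cost of putting 1,100 npm packages through a procurement
# process at $20k per iteration:
packages, cost_per_package = 1100, 20_000
print(f"${packages * cost_per_package:,}")   # $22,000,000
```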

Red Hat Software

Red Hat Enterprise Linux 7.6 Released (lwn.net) 53

Etcetera writes: Fresh on the heels of the IBM purchase announcement, Red Hat released RHEL 7.6 today. Business press release is here and full release notes are here. It's been a busy week for Red Hat, as Fedora 29 also released earlier this morning. No doubt CentOS and various other rebuilds will begin their build cycles shortly. The release offers improved security, such as support for the Trusted Platform Module (TPM) 2.0 specification for security authentication. It also provides enhanced support for the open-source nftables firewall technology.

"TPM 2.0 support has been added incrementally over recent releases of Red Hat Enterprise Linux 7, as the technology has matured," Steve Almy, principal product manager, Red Hat Enterprise Linux at Red Hat, told eWEEK. "The TPM 2.0 integration in 7.6 provides an additional level of security by tying the hands-off decryption to server hardware in addition to the network bound disk encryption (NBDE) capability, which operates across the hybrid cloud footprint from on-premise servers to public cloud deployments."
IBM

IBM To Buy Red Hat, the Top Linux Distributor, For $34 Billion (bloomberg.com) 398

International Business Machines (IBM) is acquiring software maker Red Hat in a deal valued at $34 billion, the companies said Sunday. From a report: The purchase, announced on Sunday afternoon, is the latest competitive step among large business software companies to gain an edge in the fast-growing market for Internet-style cloud computing. In June, Microsoft acquired GitHub, a major code-sharing platform for software developers, for $7.5 billion. IBM said its acquisition of Red Hat was a move to open up software development on computer clouds, in which software developers write applications that run on remote data centers. From a press release: This acquisition brings together the best-in-class hybrid cloud providers and will enable companies to securely move all business applications to the cloud. Companies today are already using multiple clouds. However, research shows that 80 percent of business workloads have yet to move to the cloud, held back by the proprietary nature of today's cloud market. This prevents portability of data and applications across multiple clouds, data security in a multi-cloud environment and consistent cloud management.

IBM and Red Hat will be strongly positioned to address this issue and accelerate hybrid multi-cloud adoption. Together, they will help clients create cloud-native business applications faster, drive greater portability and security of data and applications across multiple public and private clouds, all with consistent cloud management. In doing so, they will draw on their shared leadership in key technologies, such as Linux, containers, Kubernetes, multi-cloud management, and cloud management and automation. IBM's and Red Hat's partnership has spanned 20 years, with IBM serving as an early supporter of Linux, collaborating with Red Hat to help develop and grow enterprise-grade Linux and more recently to bring enterprise Kubernetes and hybrid cloud solutions to customers. These innovations have become core technologies within IBM's $19 billion hybrid cloud business. Between them, IBM and Red Hat have contributed more to the open source community than any other organization.

AI

IBM Researchers Teach Pac-Man To Do No Harm (fastcompany.com) 135

harrymcc writes: The better AI gets at teaching itself to perform tasks in ways beyond the skills of mere humans, the more likely it is to unwittingly behave in ways a human would consider unethical. To explore ways to prevent this from happening, IBM researchers taught AI to play Pac-Man without ever gobbling up the ghosts. And they did so without ever explicitly telling the software that this was the goal. Over at Fast Company, I wrote about this project and what IBM learned from conducting it.

The researchers built a piece of software that could balance the AI's ratio of self-devised, aggressive game play to human-influenced ghost avoidance, and tried different settings to see how they affected its overall approach to the game. By doing so, they found a tipping point -- the setting at which Pac-Man went from seriously chowing down on ghosts to largely avoiding them.
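A minimal sketch of that balancing idea (an illustration under stated assumptions, not IBM's actual system): blend a self-trained agent's action preferences with a human-derived ghost-avoidance policy under a single mixing parameter, then sweep the parameter to find the tipping point. All scores below are hypothetical.

```python
# A minimal sketch of the balancing idea, not IBM's actual system: blend a
# self-trained agent's action scores with a human-derived ghost-avoidance
# policy under a single mixing parameter lam, then sweep lam to find the
# tipping point. All scores here are hypothetical.
ACTIONS = ["up", "down", "left", "right"]

def blended_action(agent_scores, human_scores, lam):
    """lam=0.0 follows the self-trained agent; lam=1.0 follows the
    human-derived avoidance policy."""
    return max(ACTIONS,
               key=lambda a: (1 - lam) * agent_scores[a] + lam * human_scores[a])

# Hypothetical state: the agent wants to chase a ghost to the left,
# the human-derived policy prefers fleeing to the right.
agent = {"up": 0.1, "down": 0.0, "left": 0.8, "right": 0.1}
human = {"up": 0.2, "down": 0.1, "left": 0.0, "right": 0.7}

for lam in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(lam, blended_action(agent, human, lam))   # flips left -> right
```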

IBM

IBM Open Sources Mac@IBM Code (9to5mac.com) 91

PolygamousRanchKid shares a report from 9to5Mac: At the Jamf Nation User Conference, IBM has announced that it is open sourcing its Mac@IBM provisioning code. The code being open-sourced offers IT departments the ability to gather additional information about their employees during macOS setup and allows employees to customize their enrollment by selecting apps or bundles of apps to install.

Back in 2015, IBM discussed how it went from zero to 30,000 Macs in six months. In 2016, IBM said Apple products were cheaper to manage when you looked at the entire life cycle: "IBM is saving a minimum of $265 (up to $535 depending on model) per Mac compared to a PC, over a 4-year lifespan. While the upfront workstation investment is lower for PCs, the residual value for Mac is higher. The program's success has improved IBM's ability to attract and retain top talent -- a key advantage in today's competitive market."

Government

White House Wants To Borrow Tech Workers From Google and Amazon, Says Report (cnet.com) 208

"According to CNET, TechCrunch and others, the Trump administration reportedly wants tech giants to make it easy for workers to take leaves of absence to help the government modernize," writes Slashdot reader kimanaw. From a report: White House officials on Monday planned to meet with tech giants including Google, Microsoft, Amazon and IBM, to discuss ways to make it easier for employees to take leaves of absence to help with government projects, according to The Washington Post. The administration reportedly hopes tech industry workers will be able to help modernize state and federal agencies and tackle challenges such as upgrading the veterans' health care system. Attracting tech talent may prove difficult for the Trump administration, which hasn't always seen eye to eye with Silicon Valley on issues such as the president's ban on travel from predominantly Muslim countries. However, White House officials believe tech workers are willing to "put politics aside." "This event on Monday is not just about our efforts, it's about our successor, and their successor after that," said one unnamed official, according to the Post. The White House didn't respond to a request for comment.
IBM

IBM Finally Proves That Quantum Systems Are Faster Than Classical Systems (engadget.com) 79

In a paper published Thursday in the journal Science, Dr. Sergey Bravyi and his team reveal that they've developed a mathematical proof which, in specific cases, illustrates a quantum algorithm's inherent computational advantages over its classical counterparts. Engadget reports: "It's good to know, because results like this become parts of algorithms," Bob Sutor, vice president of IBM Q Strategy and Ecosystem, told Engadget. "They become part of decisions about how people will start to attack problems. Where will they try classical techniques? Where will they try quantum techniques? How will those interplay? How will they work back and forth together?" What's more, the proof shows that, in these cases, the quantum algorithm can solve the problem in a fixed number of steps, regardless of how many inputs are added. With a classical computer, the more inputs you add, the more steps it needs to take in order to solve. Such are the advantages of parallel processing.

"The main point of this paper is not that somehow we discover some incredibly important quantum algorithm, or some practical, interesting problem," Bravyi told Engadget. "We ask if we can separate a constant depth [between] quantum and classical algorithms. As we increase the problem size, the runtime of the quantum algorithm remains constant, but the total number of operations grows." As Bravyi points out, this new proof doesn't, in and of itself, solve any existing computational issues. Instead, "it gives us insight into what makes a quantum computers more powerful," he continued. "And hopefully in the future it will lead to more practical, useful algorithms."
