Privacy

Silicon Valley Is Terrified of California's Privacy Law (techcrunch.com) 155

An anonymous reader quotes a report from TechCrunch: Silicon Valley is terrified. In a little over three months, California will see the most sweeping changes to its privacy law in years. California's Consumer Privacy Act (CCPA) kicks in on January 1 and rolls out sweeping new privacy benefits to the state's 40 million residents -- and to every tech company in Silicon Valley. California's law is similar to Europe's GDPR. It grants state consumers a right to know what information companies have on them, a right to have that information deleted and a right to opt out of the sale of that information.

Since the law passed, tech giants have pulled out their last card: pushing for an overarching federal bill. In doing so, the companies would be able to control their messaging through their extensive lobbying efforts, allowing them to push for a weaker statute that would nullify some of the provisions in California's new privacy law. A federal law would also spare companies from spending heavily to ensure compliance with a patchwork of statutes across multiple states. Just this month, a group of 51 chief executives -- including Amazon's Jeff Bezos, IBM's Ginni Rometty and SAP's Bill McDermott -- signed an open letter to senior lawmakers asking for a federal privacy bill, arguing that consumers aren't clever enough to "understand rules that may change depending upon the state in which they reside." Then, the Internet Association, which counts Dropbox, Facebook, Reddit, Snap, Uber (and just today ZipRecruiter) as members, also pushed for a federal privacy law. "The time to act is now," said the industry group. If the group gets its wish before the end of the year, the California privacy law could be sunk before it kicks in.
TechNet, a "national, bipartisan network of technology CEOs and senior executives," also demanded a federal privacy law, claiming -- without providing evidence -- that any privacy law should ensure "businesses can comply with the law while continuing to innovate." Its members include major venture capital firms, including Kleiner Perkins and JC2 Ventures, as well as tech giants like Apple, Google, Microsoft, Oracle and Verizon.

"It's no accident that the tech industry launched this campaign right after the California legislature rejected their attempts to undermine the California Consumer Privacy Act," Jacob Snow, a technology and civil liberties attorney at the ACLU of Northern California, told TechCrunch. "Instead of pushing for federal legislation that wipes away state privacy law, technology companies should ensure that Californians can fully exercise their privacy rights under the CCPA on January 1, 2020, as the law requires."
IBM

IBM's New 53-qubit Quantum Computer is Its Biggest Yet (cnet.com) 44

IBM's 14th quantum computer is its most powerful so far, a model with 53 of the qubits that form the fundamental data-processing element at the heart of the system. From a report: The system, available online to quantum computing customers in October, is a big step up from the last IBM Q machine with 20 qubits and should help advance the marriage of classical computers with the crazy realm of quantum physics. Quantum computing remains a highly experimental field, limited by the difficult physics of the ultra-small and by the need to keep the machines refrigerated to within a hair's breadth of absolute zero to keep outside disturbances from ruining any calculations.

But if engineers and scientists can continue the progress, quantum computers could help solve computing problems that are, in practice, impossible on today's classical computers. That includes things like simulating the complexities of real-world molecules used in medical drugs and materials science, optimizing financial investment performance, and delivering packages with a minimum of time and fuel.
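The difficulty of checking such machines classically can be made concrete with a back-of-the-envelope calculation: simulating an n-qubit computer on classical hardware means tracking all 2^n complex amplitudes of its state vector, which grows exponentially with the qubit count. A minimal sketch (the 16-bytes-per-amplitude figure assumes double-precision complex numbers; this is an illustration, not a statement about any particular simulator):

```python
# Memory needed to hold the full state vector of an n-qubit quantum
# computer on a classical machine: 2**n complex amplitudes, each
# stored as two 8-byte floats (16 bytes total).

def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

# Compare IBM's previous 20-qubit machine with the new 53-qubit one.
for n in (20, 53):
    print(f"{n} qubits -> {state_vector_bytes(n):,} bytes")
```

A 20-qubit state fits in a few megabytes; a 53-qubit state needs on the order of a hundred petabytes, which is why machines of this size are interesting at all.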

AI

AI Surveillance is Expanding Worldwide (apnews.com) 28

A growing number of countries are following China's lead in deploying artificial intelligence to track citizens, according to a research group's report published Tuesday. From a report: The Carnegie Endowment for International Peace says at least 75 countries are actively using AI tools such as facial recognition for surveillance. The index of countries where some form of AI surveillance is used includes liberal democracies such as the United States and France as well as more autocratic regimes. Relying on a survey of public records and media reports, the report says Chinese tech companies led by Huawei and Hikvision are supplying much of the AI surveillance technology to countries around the world. Other companies such as Japan's NEC and U.S.-based IBM, Palantir and Cisco are also major international providers of AI surveillance tools.

Hikvision declined comment Tuesday. The other companies mentioned in the report didn't immediately return requests for comment. The report encompasses a broad range of AI tools that have some public safety component. The group's index doesn't distinguish between legitimate public safety tools and unlawful or harmful uses such as spying on political opponents. "I hope citizens will ask tougher questions about how this type of technology is used and what type of impacts it will have," said the report's author, Steven Feldstein, a Carnegie Endowment fellow and associate professor at Boise State University. Many of the projects cited in Feldstein's report are "smart city" systems in which a municipal government installs an array of sensors, cameras and other internet-connected devices to gather information and communicate with one another.

Java

Java EE 'Goes All In' on Open Source with Jakarta EE 8 (zdnet.com) 54

An anonymous reader quotes ZDNet: While Sun open-sourced some of Java as long ago as November 2006, actually using Java in an open-source way was... troublesome. Just ask Google about Android and Java. But for Java in the enterprise things have changed. On September 10, The Eclipse Foundation announced the full open-source release of the Jakarta EE 8 Full Platform and Web Profile specifications and related Technology Compatibility Kits (TCKs).

This comes after Oracle let go of most of Java Enterprise Edition's (Java EE) intellectual property. Oracle retains the Java trademarks, though -- hence the name change from Java EE to Jakarta EE. But for practical programming and production purposes, Jakarta EE 8 is the next generation of enterprise Java.... Jakarta EE 8 also includes the same APIs and Javadoc, using the same programming model Java developers have always used. The Jakarta EE 8 TCKs are based on and fully compatible with the Java EE 8 TCKs. All of this means enterprise customers will be able to migrate to Jakarta EE 8 without any changes to their Java EE 8 applications.

Eclipse hasn't been doing this in a vacuum. Fujitsu, IBM, Oracle, Payara, Red Hat, Tomitribe, and other members of what was once the Java community have been working on Jakarta EE... All of the Jakarta EE Working Group vendors intend to certify their Java EE 8 compatible implementations as Jakarta EE 8 compatible. In other words, Jakarta is the future for Java EE.

Oracle is now working on delivering a Java EE 8 and Jakarta EE 8 compatible implementation of their WebLogic Server.

The Eclipse Foundation says Jakarta EE 8's release "provides a new baseline for the evolution and innovation of enterprise Java technologies under an open, vendor-neutral, community-driven process."
Businesses

51 Tech CEOs Send Open Letter To Congress Asking For a Federal Data Privacy Law (zdnet.com) 35

The chief executive officers (CEOs) of 51 tech companies have signed and sent an open letter to Congress leaders today, asking for a federal law on user data privacy to supersede the rising number of privacy laws cropping up at the state level. From a report: The open letter was sent on behalf of Business Roundtable, an association made up of the CEOs of America's largest companies. The CEOs of Amazon, AT&T, Dell, IBM, Qualcomm, SAP, Salesforce, Visa, Mastercard, JP Morgan Chase, State Farm and Walmart are just some of the execs who put their names on the dotted line. The CEOs blamed the patchwork of differing privacy regulations currently being passed in multiple US states, and by several US agencies, as one reason consumer privacy is a mess in the US.

This patchwork of privacy regulations is creating problems for their companies, which have to comply with an ever-increasing number of laws across different states and jurisdictions. Instead, the 51 CEOs would like one law that governs all user privacy and data protection across the US, which would simplify product design, compliance, and data management. "There is now widespread agreement among companies across all sectors of the economy, policymakers and consumer groups about the need for a comprehensive federal consumer data privacy law that provides strong, consistent protections for American consumers," the open letter said.

Businesses

Amazon Employees Are Walking Out Over the Company's Huge Carbon Footprint (vice.com) 89

An anonymous reader quotes a report from Motherboard: With less than two weeks until the global climate strikes, 930 tech and corporate Amazon employees have pledged to walk out of the company's offices on September 20, demanding zero emissions by 2030. The climate action, which follows strikes at Amazon warehouses, most recently on Amazon Prime Day, marks the first time white-collar Amazon employees have staged a walkout.

Workers with the group Amazon Employees for Climate Justice are demanding that the company adopt a resolution to eliminate its massive carbon footprint by 2030. Compared to other tech giants, Amazon, which ships billions of packages each year and controls a huge portion of the cloud computing market, has come under particular scrutiny for its carbon emissions. On September 20, the biggest day of the weeklong climate strike taking place in 117 countries, Amazon employees in Seattle will walk out of their offices at 11:30 a.m., gather at the giant glass spheres at the center of Amazon's corporate campus, then march to city hall to rally with youth climate activists. An internal call for action that began circulating among employees on September 4 received around 930 pledges as of September 8, Read said.
"Playing a significant role in helping to reduce the sources of human-induced climate change is an important commitment for Amazon," an Amazon spokesperson told Motherboard in response to news of the walkout. "We have dedicated sustainability teams who have been working for years on initiatives to reduce our environmental impact. Earlier this year, we announced Shipment Zero - Amazon's vision to make all Amazon shipments net zero carbon, with 50% of all shipments net zero by 2030."

"Over the past decade through our sustainable packaging programs, we've eliminated more than 244,000 tons of packaging materials and avoided 500 million shipping boxes," the Amazon spokesperson continued. "To track our progress on this journey and as part of an overall commitment to sharing our sustainability goals, we plan to share Amazon's company-wide carbon footprint, along with related goals and programs, later this year. This follows an extensive project over the past few years to develop an advanced scientific model to carefully map our carbon footprint to provide our business teams with detailed information helping them identify ways to reduce carbon use in their businesses."

Microsoft workers in Seattle will also participate in the walkout. Google workers could join too.
Supercomputing

University of Texas Announces Fastest Academic Supercomputer In the World (utexas.edu) 31

On Tuesday the University of Texas at Austin launched the fastest supercomputer at any academic facility in the world.

The computer -- named "Frontera" -- is also the fifth most-powerful supercomputer on earth. Slashdot reader aarondubrow quotes their announcement: The Texas Advanced Computing Center (TACC) at The University of Texas is also home to Stampede2, the second fastest supercomputer at any American university. The launch of Frontera solidifies UT Austin among the world's academic leaders in this realm...

Joined by representatives from the National Science Foundation (NSF) -- which funded the system with a $60 million award -- UT Austin, and technology partners Dell Technologies, Intel, Mellanox Technologies, DataDirect Networks, NVIDIA, IBM, CoolIT and Green Revolution Cooling, TACC inaugurated a new era of academic supercomputing with a resource that will help the nation's top researchers explore science at the largest scale and make the next generation of discoveries.

"Scientific challenges demand computing and data at the largest and most complex scales possible. That's what Frontera is all about," said Jim Kurose, assistant director for Computer and Information Science and Engineering at NSF. "Frontera's leadership-class computing capability will support the most computationally challenging science applications that U.S. scientists are working on today."

Frontera has been supporting science applications since June and has already enabled more than three dozen teams to conduct research on a range of topics from black hole physics to climate modeling to drug design, employing simulation, data analysis, and artificial intelligence at a scale not previously possible.

Here are more technical details from the announcement about just how fast this supercomputer really is.
Amiga

Ask Slashdot: What Would Computing Look Like Today If the Amiga Had Survived? 221

dryriver writes: The Amiga was a remarkable machine at the time it was released -- 1985. It had a multitasking capable GUI-driven OS and a mouse. It had a number of cleverly designed custom chips that gave the Amiga amazing graphics and sound capabilities far beyond the typical IBM/DOS PCs of its time. The Amiga was the multimedia beast of its time -- you could create animated and still 2D or 3D graphics on it, compose sophisticated electronic music, develop 2D or 3D 16-bit games, edit and process digital video (using Video Toaster), and of course, play some amazing games. And after the Amiga -- as well as the Atari ST, Archimedes and so on -- died, everybody pretty much had to migrate to either the PC or Mac platforms. If Commodore and the Amiga had survived and thrived, there might have been four major desktop platforms in use today: Windows, OSX, AmigaOS and Linux. And who knows what the custom chips (ASICs? FPGAs?) of an Amiga in 2019 might have been capable of -- Amiga could possibly have been the platform that makes nearly life-like games and VR/AR a reality, and given Nvidia and AMD's GPUs a run for their money.

What do you think the computing landscape in 2019 would have looked like if the Amiga and AmigaOS as a platform had survived? Would Macs be as popular with digital content creators as they are today? Would AAA games target Windows 7/8/10 by default or tilt more towards the Amiga? Could there have been an Amiga hardware-based game console? Might AmigaOS and Linux have had a symbiotic existence of sorts, with AmigaOS co-existing with Linux on many enthusiasts' Amigas, or even becoming compatible with each other over time?
Businesses

VMware Buys Carbon Black and Pivotal, Valued Together at $4.8 billion (cnbc.com) 12

Software company VMware on Thursday said it's acquiring Carbon Black at an enterprise value of $2.1 billion and Pivotal at an enterprise value of $2.7 billion. The deals are expected to close by the end of January 2020. From a report: These are VMware's largest acquisitions yet. The deals build on VMware's strength in helping companies run their software in their own data centers. They could help VMware compete better in the security market and hybrid-cloud infrastructure operations. VMware isn't talking about cost synergies that could come out of buying two other enterprise-focused companies. However, CEO Pat Gelsinger told CNBC the companies will be operating profitably under VMware next year. Gelsinger said that by year two, Carbon Black and Pivotal will have contributed more than $1 billion in revenue incrementally, which will mean VMware will have more than $3 billion in hybrid cloud and software-as-a-service revenue.

Carbon Black was founded in 2002 and debuted on the Nasdaq under the symbol "CBLK" in May 2018. The company provides anti-malware and endpoint protection products that can see into many of a company's devices and tell if they have been hacked. [...] Pivotal and VMware go way back: The company was created from assets spun out of VMware and Dell (VMware's controlling owner) in 2013. Its products help companies build and deploy their software across different server infrastructure, including public clouds. Competitors include IBM, Oracle and SAP, among others, as well as cloud providers such as Amazon and Microsoft. Pivotal's customers include Boeing, Citi, Ford and Home Depot, according to its website.

Security

Intel, Google, Microsoft, and Others Launch Confidential Computing Consortium for Data Security (venturebeat.com) 44

Major tech companies including Alibaba, Arm, Baidu, IBM, Intel, Google Cloud, Microsoft, and Red Hat today announced intent to form the Confidential Computing Consortium to improve security for data in use. From a report: Established by the Linux Foundation, the organization plans to bring together hardware vendors, developers, open source experts, and others to promote the use of confidential computing, advance common open source standards, and better protect data. "Confidential computing focuses on securing data in use. Current approaches to securing data often address data at rest (storage) and in transit (network), but encrypting data in use is possibly the most challenging step to providing a fully encrypted lifecycle for sensitive data," the Linux Foundation said today in a joint statement. "Confidential computing will enable encrypted data to be processed in memory without exposing it to the rest of the system and reduce exposure for sensitive data and provide greater control and transparency for users."

The consortium also said the group was formed because confidential computing will become more important as more enterprise organizations move between different compute environments like the public cloud, on-premises servers, or the edge. To get things started, companies made a series of open source project contributions, including the Intel Software Guard Extensions (SGX) SDK for code protection at the hardware layer.

IBM

IBM is Moving OpenPower Foundation To The Linux Foundation (techcrunch.com) 46

IBM makes the Power Series chips, and as part of that has open sourced some of the underlying technologies to encourage wider use of these chips. The open source pieces have been part of the OpenPower Foundation. Today, the company announced it was moving the foundation under The Linux Foundation, and while it was at it, announced it was open sourcing several other important bits. From a report: Ken King, general manager for OpenPower at IBM, says that at this point in his organization's evolution, they wanted to move it under the auspices of the Linux Foundation. But IBM didn't stop there. It also announced that it was open sourcing some of the technical underpinnings of the Power Series chip to make it easier for developers and engineers to build on top of the technology. Perhaps most importantly, the company is open sourcing the Power Instruction Set Architecture (ISA). These are "the definitions developers use for ensuring hardware and software work together on Power," the company explained.

King sees open sourcing this technology as an important step for a number of reasons around licensing and governance. "The first thing is that we are taking the ability to be able to implement what we're licensing, the ISA instruction set architecture, for others to be able to implement on top of that instruction set royalty free with patent rights," he explained. The company is also putting this under an open governance workgroup at the OpenPower Foundation.

Bitcoin

'Mining Bitcoin On a 1983 Apple II: a Highly Impractical Guide' (retroconnector.com) 42

option8 (Slashdot reader #16,509) writes: TL;DR: Mining Bitcoin on a 1MHz 8-bit processor will cost you more than the world's combined economies, and take roughly 256 trillion years.
"But it could happen tomorrow. It's a lottery, after all," explains the blog post (describing this mad scientist as a hardware hacker and "self-taught maker", determined to mine bitcoin "in what must be the slowest possible way. I call it 8BITCOIN....")

There's also a Twitch.TV stream, with some appropriate 8-bit music, and the blog post ends by including his own bitcoin address, "If you feel like you absolutely must throw some money at me and this project."

"Upon doing some research, I found that, not only were other 8-bit platforms being put to the task, but other, even more obscure and outdated hardware. An IBM 1401 from the 1960s, a rebuilt Apollo Guidance Computer, and even one deranged individual who demonstrated the hashing algorithm by hand. It turns out, those examples all come from the same deranged individual, Ken Shirriff."
Science

Chemists Make First-Ever Ring of Pure Carbon (nature.com) 37

A team of researchers has synthesized the first ring-shaped molecule of pure carbon -- a circle of 18 atoms. Nature reports: The chemists started with a triangular molecule of carbon and oxygen, which they manipulated with electric currents to create the carbon-18 ring. Initial studies of the properties of the molecule, called a cyclocarbon, suggest that it acts as a semiconductor, which could make similar straight carbon chains useful as molecular-scale electronic components.

Chemist Przemyslaw Gawel of the University of Oxford, UK, and his collaborators have now created and imaged the long-sought ring molecule carbon-18. Using standard 'wet' chemistry, his collaborator Lorel Scriven, an Oxford chemist, first synthesized molecules that included four-carbon squares coming off the ring, with oxygen atoms attached to the squares. The team then sent their samples to IBM laboratories in Zurich, Switzerland, where collaborators put the oxygen-carbon molecules on a layer of sodium chloride inside a high-vacuum chamber. They manipulated the rings one at a time with electric currents (using an atomic-force microscope that can also act as a scanning tunnelling microscope) to remove the extraneous, oxygen-containing parts. After much trial and error, micrograph scans revealed the 18-carbon structure. "I never thought I would see this," says Scriven. Alternating bond types are interesting because they are supposed to give carbon chains and rings the properties of semiconductors. The results suggest that long, straight carbon chains might be semiconductors, too, Gawel says, which could make them useful as components of future molecular-sized transistors.
The paper has been published in the journal Science.
AMD

AMD Poses 'Major Challenge' to Intel's Server Leadership (eweek.com) 75

Rob Enderle reports on the excitement at AMD's Epyc processor launch in San Francisco: I've been at a lot of AMD events, and up until this one, the general message was that AMD was almost as good as Intel but not as expensive. This year it is very different; Intel has stumbled badly, and AMD is moving to take the leadership role in the data center, so its message isn't that it is nearly as good but cheaper anymore; it is that it has better customer focus, better security and better performance. Intel's slip really was around trust, and as Intel seemed to abandon the processor segment, OEMs and customers lost faith, and AMD is capitalizing on that slip...

AMD has always been relatively conservative, but Lisa Su, AMD's CEO, stated that the company has broken 80 performance records and that this new processor is the highest-performing one in the segment. This is one thing Lisa's IBM training helps validate; I went through that training myself and, at IBM, you aren't allowed to make false claims. AMD isn't making a false claim here. The new Epyc 2 has 64 cores and 128 threads and, with PCIe generation 4, it has 128 lanes on top of its 7nm technology, which currently also appears to lead the market. Over the years the average performance for data center chips, according to Su, has improved around 15% per year. The last generation of Epyc exceeded this when it launched, but just slightly. This new generation blows the curve out; instead of 15% year-over-year improvement, it is closer to 100%...

Intel has had a number of dire security problems that it didn't disclose in timely fashion, making their largest customers very nervous. AMD is going after this vulnerability aggressively and pointing to how they've uniquely hardened Epyc 2 so that customers that use it have few, if any, of the concerns they've had surrounding Intel parts. Part of this is jumping to more than 500 unique encryption keys tied to the platform.

Besides Google and Twitter, AMD's event also included announcements from Hewlett Packard Enterprise, Dell, Cray, Lenovo, and Microsoft Azure. For example, Hewlett Packard Enterprise has three systems immediately available with AMD's new processor, the article reports, with plans to have nine more within the next 12 months. And their CTO told the audience that their new systems have already broken 37 world performance records, and "attested to the fact that some of the most powerful supercomputers coming to market will use this processor, because it is higher performing," calling them the most secure in the industry and the highest-performing.

"AMD came to play in San Francisco this week," Enderle writes. "I've never seen it go after Intel this aggressively and, to be frank, this would have failed had it not been for the massive third-party advocacy behind Epyc 2. I've been in this business since the mid-'80s, and I've never seen this level of advocacy for a new processor ever before. And it was critical that AMD set this new bar; I guess this was an extra record they set, but AMD can legitimately argue that it is the new market leader, at least in terms of both raw and price performance, in the HPC in the server segment.

"I think this also showcases how badly Intel is bleeding support after abandoning the IDF (Intel Developer Forum) conference."
Red Hat Software

Red Hat Joins the RISC-V Foundation (phoronix.com) 49

Red Hat has joined the RISC-V Foundation to help foster this open-source processor ISA. Phoronix reports: While we're still likely years away from seeing any serious RISC-V powered servers at least that can deliver meaningful performance, Red Hat has been active in promoting RISC-V as an open-source processor instruction set architecture and one of the most promising libre architectures we have seen over the years. Red Hat developers have already helped in working on Fedora's RISC-V support and now the IBM-owned company is helping out more and showing their commitment by joining the RISC-V Foundation. Red Hat joins the likes of Google, NVIDIA, Qualcomm, SiFive, Western Digital, IBM, and Samsung as among the many RISC-V members.
Security

With Warshipping, Hackers Ship Their Exploits Directly To Their Target's Mail Room (techcrunch.com) 79

Why break into a company's network when you can just walk right in -- literally? From a report: Gone could be the days of having to find a zero-day vulnerability in a target's website, or having to scramble for breached usernames and passwords to break through a company's login pages. And certainly there will be no need to park outside a building and brute-force the Wi-Fi network password. Just drop your exploit in the mail and let your friendly postal worker deliver it to your target's door. This newly named technique -- dubbed "warshipping" -- is not a new concept. Just think of the traditional Trojan horse rolling into the city of Troy, or when hackers drove up to TJX stores and stole customer data by breaking into the store's Wi-Fi network.

But security researchers at IBM's X-Force Red say it's a novel and effective way for an attacker to gain an initial foothold on a target's network. "It uses disposable, low cost and low power computers to remotely perform close-proximity attacks, regardless of the cyber criminal's location," wrote Charles Henderson, who heads up the IBM offensive operations unit.

Red Hat Software

Final Red Hat Enterprise Linux 7 Version Released (zdnet.com) 69

The last RHEL 7 release, RHEL 7.7, is now available for current Red Hat Enterprise Linux subscribers via the Red Hat Customer Portal. ZDNet reports on what's new: RHEL 7.7's most important updates are support for the latest generation of enterprise hardware and remediation for the recently disclosed ZombieLoad vulnerabilities. The latest RHEL 7 also includes network stack performance enhancements. With this release, you can offload virtual switching operations to network interface card (NIC) hardware. What that means for you is, if you're using virtual switching and network function virtualization (NFV), you'll see better network performance on cloud and container platforms such as Red Hat OpenStack Platform and Red Hat OpenShift.

RHEL 7.7 users can also use Red Hat's new predictive troubleshooting service: Red Hat Insights. This uses a software-as-a-service (SaaS)-based predictive analytics approach to spot, assess, and mitigate potential problems in their systems before they can cause trouble. For developers, RHEL 7.7 comes with the Python 3.6 interpreter, plus the pip and setuptools utilities. Previously, Python 3 versions were available only as part of Red Hat Software Collections. Moving on to the cloud, RHEL 7.7 Red Hat Image Builder is now supported. This feature, which is also in RHEL 8, enables you to easily create custom RHEL system images for cloud and virtualization platforms such as Amazon Web Services (AWS), VMware vSphere, and OpenStack. To help cloud-native developers, RHEL 7.7 includes full support for Red Hat's distributed-container toolkit -- buildah, podman, and skopeo -- on RHEL workstations. After building on the desktop, programmers can use the Red Hat Universal Base Image to build, run, and manage containerized applications across the hybrid cloud.

IBM

IBM Fired as Many as 100,000 in Recent Years, Lawsuit Shows (bloomberg.com) 117

International Business Machines (IBM) has fired as many as 100,000 employees in the last few years in an effort to boost its appeal to millennials and make it appear to be as "cool" and "trendy" as Amazon and Google, according to a deposition from a former vice president in an ongoing age discrimination lawsuit. From a report: The technology company is facing several lawsuits accusing it of firing older workers, including a class-action case in Manhattan and individual civil suits filed in California, Pennsylvania and Texas last year. "We have reinvented IBM in the past five years to target higher value opportunities for our clients," IBM said in a statement. "The company hires 50,000 employees each year." Big Blue has struggled with almost seven straight years of shrinking revenue. In the last decade, the company has fired thousands of people in the U.S., Canada and other high-wage jurisdictions in an effort to cut costs and retool its workforce after coming late to the cloud-computing and mobile-tech revolutions. The number of IBM employees has fallen to its lowest point in six years, with 350,600 global workers at the end of 2018 -- a 19% reduction since 2013.
Open Source

When Open Source Software Comes With a Few Catches (wired.com) 120

As open source software grows more popular, and important, developers face an existential question: How to make money from something you give away for free? An anonymous reader shares a report: The Open Source Initiative standards body says an open source license must allow users to view the underlying source code, modify it, and share it as they see fit. Independent developers and large companies alike now routinely release software under these licenses. Many coders believe open collaboration results in better software. Some companies open their code for marketing purposes. Open source software now underpins much technology, from smartphone operating systems to government websites.

Companies that release software under open source licenses generate revenue in different ways. Some sell support, including Red Hat, which IBM acquired for $34 billion earlier this month. Others, like cloud automation company HashiCorp, sell proprietary software based on the open source components. But with the rise of cloud computing, developers see their open source code being bundled into services and sold by other companies. Amazon, for example, sells a cloud-hosted service based on the popular open source database Redis, which competes with a similar cloud-hosted service offered by Redis Labs, the sponsor of the open source project. To protect against such scenarios, companies behind popular open source projects are restricting how others can use their software. Redis Labs started the trend last year when it relicensed several add-ons for its core product under terms that essentially prohibit offering those add-ons as part of a commercial cloud computing service.

That way, Amazon and other cloud providers can't use those add-ons in their competing Redis services. Companies that want the functionality provided by those add-ons need to develop those features themselves, or get permission from Redis Labs. [...] Analytics company Confluent and database maker CockroachDB added similar terms to their licenses, preventing cloud computing companies from using some or all of their code to build competing services. Taking a slightly different tack, MongoDB relicensed its flagship database product last year under a new "Server Side Public License" (SSPL) that requires companies that sell the database system as a cloud service also release the source code of any additional software they include.

AI

IBM Gives Cancer-Killing Drug AI Project To the Open Source Community 42

IBM has released three artificial intelligence (AI) projects tailored to take on the challenge of curing cancer to the open-source community. ZDNet reports: The first project, dubbed PaccMann -- not to be confused with the popular Pac-Man computer game -- is described as the "Prediction of anticancer compound sensitivity with Multi-modal attention-based neural networks." IBM is working on the PaccMann algorithm to automatically analyze chemical compounds and predict which are the most likely to fight cancer strains, which could potentially streamline this process. The ML algorithm exploits data on gene expression as well as the molecular structures of chemical compounds. IBM says that identifying potential anti-cancer compounds earlier in this way can cut the costs associated with drug development.
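The "multi-modal attention" idea behind PaccMann can be illustrated with a toy computation: a compound embedding attends over gene-expression features, so the model weights the genes most relevant to that compound before predicting drug sensitivity. The sketch below is illustrative only (the dimensions, vectors, and names are made up, and this is not IBM's PaccMann code):

```python
import numpy as np

rng = np.random.default_rng(0)

n_genes, d = 8, 4
gene_expr = rng.normal(size=(n_genes, d))  # hypothetical per-gene feature vectors
compound = rng.normal(size=(d,))           # hypothetical molecular-structure embedding

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Attention scores: how relevant each gene is to this particular compound
scores = gene_expr @ compound / np.sqrt(d)
weights = softmax(scores)  # non-negative, sums to 1 over genes

# Context vector: a compound-specific summary of the expression profile,
# which would feed a downstream sensitivity-prediction layer
context = weights @ gene_expr
```

A real model would learn the embeddings and attention parameters end to end; the point here is only the mechanism of conditioning one modality (gene expression) on the other (compound structure).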

The second project is called "Interaction Network infErence from vectoR representATions of words," otherwise known as INtERAcT. This tool automatically extracts data from scientific papers related to cancer research, aiming to make the academic side of research less of a burden. At the moment, the tool is being tested on extracting data related to protein-protein interactions -- interactions whose disruption has been linked to biological-process failures in diseases including cancer.
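The core idea of inferring interactions from "vector representations of words" can be sketched in a few lines: proteins that co-occur in similar textual contexts end up with similar word vectors, so embedding similarity serves as a proxy for an interaction. The gene names below are real, but the vectors are invented for illustration; this is not the INtERAcT codebase:

```python
import numpy as np

# Hypothetical embeddings that a word2vec-style model trained on a
# biomedical corpus might produce (values are made up)
embeddings = {
    "TP53": np.array([0.9, 0.1, 0.3]),
    "MDM2": np.array([0.8, 0.2, 0.4]),    # a known TP53 regulator; similar vector
    "ACTB": np.array([-0.5, 0.9, -0.2]),  # unrelated housekeeping gene
}

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Rank candidate protein pairs by embedding similarity as an interaction proxy
pairs = [("TP53", "MDM2"), ("TP53", "ACTB")]
scores = {p: cosine(embeddings[p[0]], embeddings[p[1]]) for p in pairs}
```

The published method is more involved (it builds interaction networks from context distributions), but similarity of learned text representations is the underlying signal.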

The third and final project is "pathway-induced multiple kernel learning," or PIMKL. This algorithm uses datasets of known molecular interactions to predict the progression of cancer and potential relapses in patients. PIMKL applies multiple kernel learning to identify the molecular pathways most important for classifying patients, giving healthcare professionals an opportunity to individualize and tailor treatment plans.
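Multiple kernel learning in this setting means building one kernel per molecular pathway (a subset of features) and combining them into a single patient-similarity kernel, where the learned mixing weights reveal which pathways matter. The following is a minimal sketch of that structure, assuming fixed weights, randomly generated data, and hypothetical pathway memberships (PIMKL itself learns the weights and uses real interaction data):

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 6))              # 40 patients, 6 molecular features
y = (X[:, 0] + X[:, 3] > 0).astype(int)   # toy outcome label

pathways = [[0, 1, 2], [3, 4, 5]]         # hypothetical pathway memberships
weights = [0.7, 0.3]                      # fixed mixing weights for illustration

# Combined kernel: weighted sum of per-pathway RBF kernels
K = sum(w * rbf_kernel(X[:, p], X[:, p]) for w, p in zip(weights, pathways))

# Classify patients using the precomputed combined kernel
clf = SVC(kernel="precomputed").fit(K, y)
train_acc = clf.score(K, y)
```

In the real algorithm, large learned weights on a pathway kernel indicate that pathway is informative for the patient grouping, which is what makes the approach interpretable to clinicians.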
