Businesses

VMware Buys Carbon Black and Pivotal, Valued Together at $4.8 Billion (cnbc.com) 12

Software company VMware on Thursday said it's acquiring Carbon Black at an enterprise value of $2.1 billion and Pivotal at an enterprise value of $2.7 billion. The deals are expected to close by the end of January 2020. From a report: These are VMware's largest acquisitions yet. The deals build on VMware's strength in helping companies run their software in their own data centers, and could help it compete better in the security and hybrid-cloud infrastructure markets. VMware isn't talking about cost synergies that could come from buying two other enterprise-focused companies. However, CEO Pat Gelsinger told CNBC the companies will be operating profitably under VMware next year. Gelsinger said that by year two, Carbon Black and Pivotal will have contributed more than $1 billion in incremental revenue, which will mean VMware will have more than $3 billion in hybrid-cloud and software-as-a-service revenue.

Carbon Black was founded in 2002 and debuted on the Nasdaq under the symbol "CBLK" in May 2018. The company provides anti-malware and endpoint protection products that can see into many of a company's devices and tell if they have been hacked. [...] Pivotal and VMware go way back: The company was created from assets spun out of VMware and Dell (VMware's controlling owner) in 2013. Its products help companies build and deploy their software across different server infrastructure, including public clouds. Competitors include IBM, Oracle and SAP, among others, as well as cloud providers such as Amazon and Microsoft. Pivotal's customers include Boeing, Citi, Ford and Home Depot, according to its website.

Security

Intel, Google, Microsoft, and Others Launch Confidential Computing Consortium for Data Security (venturebeat.com) 44

Major tech companies including Alibaba, Arm, Baidu, IBM, Intel, Google Cloud, Microsoft, and Red Hat today announced intent to form the Confidential Computing Consortium to improve security for data in use. From a report: Established by the Linux Foundation, the organization plans to bring together hardware vendors, developers, open source experts, and others to promote the use of confidential computing, advance common open source standards, and better protect data. "Confidential computing focuses on securing data in use. Current approaches to securing data often address data at rest (storage) and in transit (network), but encrypting data in use is possibly the most challenging step to providing a fully encrypted lifecycle for sensitive data," the Linux Foundation said today in a joint statement. "Confidential computing will enable encrypted data to be processed in memory without exposing it to the rest of the system and reduce exposure for sensitive data and provide greater control and transparency for users."

The group was formed, the consortium said, because confidential computing will become more important as enterprise organizations move among different compute environments such as the public cloud, on-premises servers, and the edge. To get things started, member companies made a series of open source contributions, including the Intel Software Guard Extensions (SGX) SDK, a toolkit for protecting code and data at the hardware layer.
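To see why "data in use" is the hard case, consider that standard tooling already covers the other two states: data is encrypted on disk and encrypted on the wire, but the moment a program computes on it, it sits decrypted in ordinary memory. A minimal Python sketch of that gap (using the third-party cryptography package; the record contents are made up for illustration):

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()
    f = Fernet(key)

    record = b"patient_id=123;diagnosis=..."
    at_rest = f.encrypt(record)   # encrypted on disk: data at rest, covered today
    in_transit = at_rest          # encrypted on the wire (e.g. under TLS): also covered

    # To compute on the record it must be decrypted, and in a conventional
    # process this plaintext sits in ordinary RAM, visible to a compromised
    # OS or hypervisor. Confidential computing keeps this step inside
    # hardware-protected memory (an enclave) instead.
    plaintext = f.decrypt(in_transit)
    print(b"diagnosis" in plaintext)

Hardware enclaves such as SGX are programmed against the contributed SDK in C/C++; the sketch above only illustrates the lifecycle gap the consortium is targeting, not enclave code itself.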

IBM

IBM is Moving OpenPower Foundation To The Linux Foundation (techcrunch.com) 46

IBM makes the Power Series chips, and as part of that has open sourced some of the underlying technologies to encourage wider use of these chips. The open source pieces have been part of the OpenPower Foundation. Today, the company announced it was moving the foundation under The Linux Foundation, and while it was at it, announced it was open sourcing several other important bits. From a report: Ken King, general manager for OpenPower at IBM, says that at this point in his organization's evolution, they wanted to move it under the auspices of the Linux Foundation. But IBM didn't stop there. It also announced that it was open sourcing some of the technical underpinnings of the Power Series chip to make it easier for developers and engineers to build on top of the technology. Perhaps most importantly, the company is open sourcing the Power Instruction Set Architecture (ISA). These are "the definitions developers use for ensuring hardware and software work together on Power," the company explained.

King sees open sourcing this technology as an important step for a number of reasons around licensing and governance. "The first thing is that we are taking the ability to be able to implement what we're licensing, the ISA instruction set architecture, for others to be able to implement on top of that instruction set royalty free with patent rights," he explained. The company is also putting this under an open governance workgroup at the OpenPower Foundation.

Bitcoin

'Mining Bitcoin On a 1983 Apple II: a Highly Impractical Guide' (retroconnector.com) 42

option8 (Slashdot reader #16,509) writes: TL;DR: Mining Bitcoin on a 1MHz 8-bit processor will cost you more than the world's combined economies, and take roughly 256 trillion years.
"But it could happen tomorrow. It's a lottery, after all," explains the blog post (describing this mad scientist as a hardware hacker and "self-taught maker", determined to mine bitcoin "in what must be the slowest possible way. I call it 8BITCOIN....")

There's also a Twitch.TV stream, with some appropriate 8-bit music, and the blog post ends by including his own bitcoin address, "If you feel like you absolutely must throw some money at me and this project."

"Upon doing some research, I found that, not only were other 8-bit platforms being put to the task, but other, even more obscure and outdated hardware. An IBM 1401 from the 1960s, a rebuilt Apollo Guidance Computer, and even one deranged individual who demonstrated the hashing algorithm by hand. It turns out, those examples all come from the same deranged individual, Ken Shirriff."
Science

Chemists Make First-Ever Ring of Pure Carbon (nature.com) 37

A team of researchers has synthesized the first ring-shaped molecule of pure carbon -- a circle of 18 atoms. Nature reports: The chemists started with a triangular molecule of carbon and oxygen, which they manipulated with electric currents to create the carbon-18 ring. Initial studies of the properties of the molecule, called a cyclocarbon, suggest that it acts as a semiconductor, which could make similar straight carbon chains useful as molecular-scale electronic components.

Chemist Przemyslaw Gawel of the University of Oxford, UK, and his collaborators have now created and imaged the long-sought ring molecule carbon-18. Using standard 'wet' chemistry, his collaborator Lorel Scriven, an Oxford chemist, first synthesized molecules that included four-carbon squares coming off the ring, with oxygen atoms attached to the squares. The team then sent their samples to IBM laboratories in Zurich, Switzerland, where collaborators put the oxygen-carbon molecules on a layer of sodium chloride inside a high-vacuum chamber. They manipulated the rings one at a time with electric currents (using an atomic-force microscope that can also act as a scanning tunnelling microscope) to remove the extraneous, oxygen-containing parts. After much trial-and-error, micrograph scans revealed the 18-carbon structure, its atoms joined by alternating single and triple bonds. "I never thought I would see this," says Scriven. Alternating bond types are interesting because they are supposed to give carbon chains and rings the properties of semiconductors. The results suggest that long, straight carbon chains might be semiconductors too, Gawel says, which could make them useful as components of future molecular-sized transistors.
The paper has been published in the journal Science.
AMD

AMD Poses 'Major Challenge' to Intel's Server Leadership (eweek.com) 75

Rob Enderle reports on the excitement at AMD's Epyc processor launch in San Francisco: I've been at a lot of AMD events, and up until this one, the general message was that AMD was almost as good as Intel but not as expensive. This year it is very different; Intel has stumbled badly, and AMD is moving to take the leadership role in the data center, so its message isn't that it is nearly as good but cheaper anymore; it is that it has better customer focus, better security and better performance. Intel's slip really was around trust: as Intel seemed to abandon the processor segment, OEMs and customers lost faith, and AMD is capitalizing on that slip...

AMD has always been relatively conservative, but Lisa Su, AMD's CEO, stated that the company has broken 80 performance records and that this new processor is the highest-performing one in the segment. This is one thing Lisa's IBM training helps validate; I went through that training myself and, at IBM, you aren't allowed to make false claims. AMD isn't making a false claim here. The new Epyc 2 has 64 cores and 128 threads, and with PCIe generation 4 it has 128 lanes, on top of its 7nm technology, which currently also appears to lead the market. Over the years the average performance of data center chips, according to Su, has improved around 15% per year. The last generation of Epyc exceeded this when it launched, but just slightly. This new generation blows the curve out; instead of a 15% year-over-year improvement, it is closer to 100%...

Intel has had a number of dire security problems that it didn't disclose in a timely fashion, making its largest customers very nervous. AMD is going after this vulnerability aggressively, pointing to how it has uniquely hardened Epyc 2 so that customers who use it have few, if any, of the concerns they've had surrounding Intel parts. Part of this is jumping to more than 500 unique encryption keys tied to the platform.

Besides Google and Twitter, AMD's event also included announcements from Hewlett Packard Enterprise, Dell, Cray, Lenovo, and Microsoft Azure. For example, Hewlett Packard Enterprise has three systems immediately available with AMD's new processor, the article reports, with plans for nine more within the next 12 months. Its CTO told the audience that the new systems have already broken 37 world performance records, and "attested to the fact that some of the most powerful supercomputers coming to market will use this processor, because it is higher performing," calling them the most secure in the industry and the highest-performing.

"AMD came to play in San Francisco this week," Enderle writes. "I've never seen it go after Intel this aggressively and, to be frank, this would have failed had it not been for the massive third-party advocacy behind Epyc 2. I've been in this business since the mid-'80s, and I've never seen this level of advocacy for a new processor ever before. And it was critical that AMD set this new bar; I guess this was an extra record they set, but AMD can legitimately argue that it is the new market leader, at least in terms of both raw and price performance, in the HPC in the server segment.

"I think this also showcases how badly Intel is bleeding support after abandoning the IDF (Intel Developer Forum) conference."
Red Hat Software

Red Hat Joins the RISC-V Foundation (phoronix.com) 49

Red Hat has joined the RISC-V Foundation to help foster the open-source processor ISA. Phoronix reports: While we're likely still years away from seeing serious RISC-V-powered servers, at least ones that can deliver meaningful performance, Red Hat has been active in promoting RISC-V as an open-source processor instruction set architecture, one of the most promising libre architectures we have seen over the years. Red Hat developers have already helped with Fedora's RISC-V support, and now the IBM-owned company is showing further commitment by joining the RISC-V Foundation. Red Hat joins the likes of Google, NVIDIA, Qualcomm, SiFive, Western Digital, IBM, and Samsung among the many RISC-V Foundation members.
Security

With Warshipping, Hackers Ship Their Exploits Directly To Their Target's Mail Room (techcrunch.com) 79

Why break into a company's network when you can just walk right in -- literally? From a report: Gone could be the days of having to find a zero-day vulnerability in a target's website, or having to scramble for breached usernames and passwords to break through a company's login pages. And certainly there will be no need to park outside a building and brute-force the Wi-Fi network password. Just drop your exploit in the mail and let your friendly postal worker deliver it to your target's door. This newly named technique -- dubbed "warshipping" -- is not a new concept. Just think of the traditional Trojan horse rolling into the city of Troy, or when hackers drove up to TJX stores and stole customer data by breaking into the store's Wi-Fi network.

But security researchers at IBM's X-Force Red say it's a novel and effective way for an attacker to gain an initial foothold on a target's network. "It uses disposable, low cost and low power computers to remotely perform close-proximity attacks, regardless of the cyber criminal's location," wrote Charles Henderson, who heads up the IBM offensive operations unit.

Red Hat Software

Final Red Hat Enterprise Linux 7 Version Released (zdnet.com) 69

The final RHEL 7 release, RHEL 7.7, is now available for current Red Hat Enterprise Linux subscribers via the Red Hat Customer Portal. ZDNet reports on what's new: RHEL 7.7's most important updates are support for the latest generation of enterprise hardware and remediation for the recently disclosed ZombieLoad vulnerabilities. The latest RHEL 7 also includes network stack performance enhancements. With this release, you can offload virtual switching operations to network interface card (NIC) hardware. What that means for you is, if you're using virtual switching and network function virtualization (NFV), you'll see better network performance on cloud and container platforms such as Red Hat OpenStack Platform and Red Hat OpenShift.

RHEL 7.7 users also get Red Hat's new predictive troubleshooter: Red Hat Insights. This uses a software-as-a-service (SaaS)-based predictive analytics approach to spot, assess, and mitigate potential problems in their systems before those problems cause trouble. For developers, RHEL 7.7 comes with the Python 3.6 interpreter and the pip and setuptools utilities. Previously, Python 3 versions were available only as part of Red Hat Software Collections. Moving on to the cloud, Red Hat Image Builder is now supported in RHEL 7.7. This feature, which is also in RHEL 8, enables you to easily create custom RHEL system images for cloud and virtualization platforms such as Amazon Web Services (AWS), VMware vSphere, and OpenStack. To help cloud-native developers, RHEL 7.7 includes full support for Red Hat's distributed-container toolkit -- buildah, podman, and skopeo -- on RHEL workstations. After building on the desktop, programmers can use the Red Hat Universal Base Image to build, run, and manage containerized applications across the hybrid cloud.

IBM

IBM Fired as Many as 100,000 in Recent Years, Lawsuit Shows (bloomberg.com) 117

International Business Machines (IBM) has fired as many as 100,000 employees in the last few years in an effort to boost its appeal to millennials and make it appear to be as "cool" and "trendy" as Amazon and Google, according to a deposition from a former vice president in an ongoing age discrimination lawsuit. From a report: The technology company is facing several lawsuits accusing it of firing older workers, including a class-action case in Manhattan and individual civil suits filed in California, Pennsylvania and Texas last year. "We have reinvented IBM in the past five years to target higher value opportunities for our clients," IBM said in a statement. "The company hires 50,000 employees each year." Big Blue has struggled with almost seven straight years of shrinking revenue. In the last decade, the company has fired thousands of people in the U.S., Canada and other high-wage jurisdictions in an effort to cut costs and retool its workforce after coming late to the cloud-computing and mobile-tech revolutions. The number of IBM employees has fallen to its lowest point in six years, with 350,600 global workers at the end of 2018 -- a 19% reduction since 2013.
Open Source

When Open Source Software Comes With a Few Catches (wired.com) 120

As open source software grows more popular, and important, developers face an existential question: How to make money from something you give away for free? An anonymous reader shares a report: The Open Source Initiative standards body says an open source license must allow users to view the underlying source code, modify it, and share it as they see fit. Independent developers and large companies alike now routinely release software under these licenses. Many coders believe open collaboration results in better software. Some companies open their code for marketing purposes. Open source software now underpins much technology, from smartphone operating systems to government websites.

Companies that release software under open source licenses generate revenue in different ways. Some sell support, including Red Hat, which IBM acquired for $34 billion earlier this month. Others, like cloud automation company HashiCorp, sell proprietary software based on the open source components. But with the rise of cloud computing, developers see their open source code being bundled into services and sold by other companies. Amazon, for example, sells a cloud-hosted service based on the popular open source database Redis, which competes with a similar cloud-hosted service offered by Redis Labs, the sponsor of the open source project. To protect against such scenarios, companies behind popular open source projects are restricting how others can use their software. Redis Labs started the trend last year when it relicensed several add-ons for its core product under terms that essentially prohibit offering those add-ons as part of a commercial cloud computing service.

That way, Amazon and other cloud providers can't use those add-ons in their competing Redis services. Companies that want the functionality provided by those add-ons need to develop those features themselves, or get permission from Redis Labs. [...] Analytics company Confluent and database maker CockroachDB added similar terms to their licenses, preventing cloud computing companies from using some or all of their code to build competing services. Taking a slightly different tack, MongoDB relicensed its flagship database product last year under a new "Server Side Public License" (SSPL) that requires companies that sell the database system as a cloud service also release the source code of any additional software they include.

AI

IBM Gives Cancer-Killing Drug AI Project To the Open Source Community (zdnet.com) 42

IBM has released to the open-source community three artificial intelligence (AI) projects tailored to take on the challenge of curing cancer. ZDNet reports: The first project, dubbed PaccMann -- not to be confused with the popular Pac-Man computer game -- is short for "Prediction of anticancer compound sensitivity with Multi-modal attention-based neural networks." IBM is working on the PaccMann algorithm to automatically analyze chemical compounds and predict which are the most likely to fight cancer strains, potentially streamlining drug discovery. The ML algorithm exploits data on gene expression as well as the molecular structures of chemical compounds. IBM says that identifying potential anti-cancer compounds earlier can cut the costs associated with drug development.

The second project is called "Interaction Network infErence from vectoR representATions of words," otherwise known as INtERAcT. This tool is a particularly interesting one: it aims to make the academic side of research less of a burden by automatically extracting information from valuable scientific papers related to our understanding of cancer. At the moment, the tool is being tested on extracting data related to protein-protein interactions -- an area of study that has been flagged as a potential source of disrupted biological processes in diseases including cancer.

The third and final project is "pathway-induced multiple kernel learning," or PIMKL. This algorithm utilizes datasets describing what we currently know when it comes to molecular interactions in order to predict the progression of cancer and potential relapses in patients. PIMKL uses what is known as multiple kernel learning to identify molecular pathways crucial for categorizing patients, giving healthcare professionals an opportunity to individualize and tailor treatment plans.
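The blurb above is all the detail given on PIMKL's internals, but the multiple kernel learning idea it builds on is standard: each molecular pathway contributes its own patient-to-patient similarity kernel, and a weighted sum of those kernels feeds a classifier, with the weights indicating which pathways matter for the prediction. A toy sketch under those assumptions, with entirely synthetic data and fixed weights (not IBM's implementation):

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Synthetic "patients": two feature blocks standing in for two molecular pathways.
    X_pathway_a = rng.normal(size=(100, 20))
    X_pathway_b = rng.normal(size=(100, 30))
    y = rng.integers(0, 2, size=100)  # e.g. relapse vs. no relapse

    def linear_kernel(X):
        return X @ X.T  # patient-by-patient similarity within one pathway

    # One kernel per pathway, combined with weights. Real MKL methods learn
    # these weights from the data; they are fixed here for illustration.
    weights = [0.7, 0.3]
    K = weights[0] * linear_kernel(X_pathway_a) + weights[1] * linear_kernel(X_pathway_b)

    clf = SVC(kernel="precomputed").fit(K, y)
    print("training accuracy:", clf.score(K, y))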
United States

Trump Says He's Looking Into a Pentagon Cloud Contract For Amazon or Microsoft (cnbc.com) 121

President Donald Trump said on Thursday that he's seriously considering looking into a Pentagon contract -- said to be worth up to $10 billion -- that will go to either Microsoft or Amazon. From a report: "I never had something where more people are complaining," Trump said, adding that he's going to take a close look at it. "We're getting tremendous complaints from other companies," Trump said in a press pool at the White House during a meeting with the prime minister of The Netherlands. "Some of the greatest companies in the world are complaining about it." He named Microsoft, Oracle and IBM.

Since April, Microsoft and Amazon have been the only remaining competitors for the contract after IBM and Oracle were ruled out by the Defense Department. The contract, known as JEDI, is viewed as a marquee deal for the company that ultimately wins it, particularly as Microsoft and Amazon are aggressively pursuing government work for their expanding cloud units. While Trump didn't cite Amazon CEO Jeff Bezos by name on Thursday, the billionaire executive has been a constant source of frustration for the president. Bezos owns the Washington Post, which Trump regularly criticizes for its coverage of his administration. Trump also has gone after Amazon repeatedly for, as he claims, not paying its fair share of taxes and ripping off the U.S. Post Office.

China

How America's Tech Giants Are Helping Build China's Surveillance State (theintercept.com) 147

"An American organization founded by tech giants Google and IBM is working with a company that is helping China's authoritarian government conduct mass surveillance against its citizens," the Intercept reports.

The OpenPower Foundation -- a nonprofit led by Google and IBM executives with the aim of trying to "drive innovation" -- has set up a collaboration between IBM, Chinese company Semptian, and U.S. chip manufacturer Xilinx. Together, they have worked to advance a breed of microprocessors that enable computers to analyze vast amounts of data more efficiently. Shenzhen-based Semptian is using the devices to enhance the capabilities of internet surveillance and censorship technology it provides to human rights-abusing security agencies in China, according to sources and documents. A company employee said that its technology is being used to covertly monitor the internet activity of 200 million people...

Semptian presents itself publicly as a "big data" analysis company that works with internet providers and educational institutes. However, a substantial portion of the Chinese firm's business is in fact generated through a front company named iNext, which sells the internet surveillance and censorship tools to governments. iNext operates out of the same offices in China as Semptian, with both companies on the eighth floor of a tower in Shenzhen's busy Nanshan District. Semptian and iNext also share the same 200 employees and the same founder, Chen Longsen. [The company's] Aegis equipment has been placed within China's phone and internet networks, enabling the country's government to secretly collect people's email records, phone calls, text messages, cellphone locations, and web browsing histories, according to two sources familiar with Semptian's work.

Promotional documents obtained from the company promise "location information for everyone in the country." One company representative even told the Intercept they were processing "thousands of terabits per second," and -- not knowing they were talking to a reporter -- forwarded a 16-minute video detailing their technology. "If a government operative enters a person's cellphone number, Aegis can show where the device has been over a given period of time: the last three days, the last week, the last month, or longer," the Intercept reports.

Joss Wright, a senior research fellow at the Oxford Internet Institute, told the Intercept that "by any meaningful definition, this is a vast surveillance effort."

The Intercept's report also includes the U.S. companies' responses about their involvement with Chinese surveillance technology.
The Media

Craigslist Founder: Most Online Outrage is Faked For Profit (theguardian.com) 208

The Guardian profiles 66-year-old Craigslist founder (and former IBM programmer) Craig Newmark, calling him "a survivor from the era of internet optimism."

He's now investing "significant sums" to protect the future of the news industry -- and rejects the idea his website helped cause journalism's financial crisis: [H]e firmly rejects any notion that all the philanthropy -- an estimated $50m in the past year, including to New York Public Radio, new publication the Markup, and local journalism efforts such as the American Journalism Project -- is an attempt to assuage guilt, a reach for atonement. "That takes an active imagination that I don't understand. I have very little imagination...."

Newmark, by his own admission not a journalist, says: "I had great hopes for citizen journalism 10, 15 years ago. It hasn't worked out. One reason is that journalism is a profession. You have to know how to write well. You have to fact-check. You have to know how to develop sources, often over years. You have to have specialised knowledge on a beat like disinformation or crime or birds. Citizen journalists can complement what's going on and, sometimes, citizens come to journalism with skills... Now I think more: what are the practical problems of professional journalism? For example, we've seen a couple of cases where bad actors will try to really hurt a publication by engaging in lengthy, frivolous lawsuits. There is a great need for shared risk pool insurance, media insurance in the US, and I talk to people about that...."

Social media fights, he insists, get attention but are not representative of what is really going on. Much of it is manufactured. "Americans are much more reasonable and moderate than what you might guess when you see a little Twitter war. But I'm guessing that the purpose of many Twitter wars is to polarise people and, in fact, we've seen that happen because you can often trace some of the fighting groups to the same location. Outrage is profitable. Most of the outrage I've seen in the online world -- I would guess 80% -- someone's faking it for profit..."

Indeed, he remains convinced that the internet is still a positive for humanity. "It allows people of goodwill to get together and work together for common good...."

The Guardian notes that during their interview, Craig also "cheerfully admits he is 'simulating' social skills."
IBM

IBM Closes Its $34 Billion Acquisition of Red Hat (cnbc.com) 95

IBM closed its $34 billion acquisition of Red Hat, the companies announced Tuesday. From a report: The deal was originally announced in October, when the companies said IBM would buy all shares in Red Hat for $190 each in cash. The acquisition of Red Hat, an open-source enterprise software maker, marks the close of IBM's largest deal ever and one of the biggest in U.S. tech history. Excluding the AOL-Time Warner merger, it follows the $67 billion deal between Dell and EMC in 2016 and JDS Uniphase's $41 billion acquisition of optical-component supplier SDL in 2000. Under the deal, Red Hat will now be a unit of IBM's hybrid cloud division, according to the original announcement. The companies said Red Hat's CEO, Jim Whitehurst, would join IBM's senior management team and report to CEO Ginni Rometty. IBM previously said it hoped the acquisition would help it do more work in the cloud, one of its four key growth drivers, the others being social, mobile, and analytics. The company lags behind Amazon and Microsoft in the cloud infrastructure business, and has seen three consecutive quarters of declining year-over-year revenue. But some analysts are hopeful about the Red Hat deal's opportunity to bring in new business.
Moon

What You Didn't Know About the Apollo 11 Mission (smithsonianmag.com) 133

"From JFK's real motives to the Soviets' secret plot to land on the Moon at the same time, a new behind-the-scenes view of an unlikely triumph 50 years ago," writes schwit1 sharing a new article from Smithsonian magazine titled "What You Didn't Know About the Apollo 11 Mission."

It's an excerpt from the recently released book ONE GIANT LEAP: The Impossible Mission That Flew Us to the Moon. From the excerpt: The Moon has a smell. It has no air, but it has a smell... All the astronauts who walked on the Moon noticed it, and many commented on it to Mission Control.... Cornell University astrophysicist Thomas Gold warned NASA that the dust had been isolated from oxygen for so long that it might well be highly chemically reactive. If too much dust was carried inside the lunar module's cabin, the moment the astronauts repressurized it with air and the dust came into contact with oxygen, it might start burning, or even cause an explosion. (Gold, who correctly predicted early on that the Moon's surface would be covered with powdery dust, also had warned NASA that the dust might be so deep that the lunar module and the astronauts themselves could sink irretrievably into it.) Among the thousands of things they were keeping in mind while flying to the Moon, Armstrong and Aldrin had been briefed about the very small possibility that the lunar dust could ignite....

The Apollo spacecraft ended up with what was, for its time, the smallest, fastest and most nimble computer in a single package anywhere in the world. That computer navigated through space and helped the astronauts operate the ship. But the astronauts also traveled to the Moon with paper star charts so they could use a sextant to take star sightings -- like 18th-century explorers on the deck of a ship -- and cross-check their computer's navigation. The software of the computer was stitched together by women sitting at specialized looms -- using wire instead of thread. In fact, an arresting amount of work across Apollo was done by hand: The heat shield was applied to the spaceship by hand with a fancy caulking gun; the parachutes were sewn by hand, and then folded by hand. The only three staff members in the country who were trained and licensed to fold and pack the Apollo parachutes were considered so indispensable that NASA officials forbade them to ever ride in the same car, to avoid their all being injured in a single accident. Despite its high-tech aura, we have lost sight of the extent to which the lunar mission was handmade...

The space program in the 1960s did two things to lay the foundation of the digital revolution. First, NASA used integrated circuits -- the first computer chips -- in the computers that flew the Apollo command module and the Apollo lunar module. Except for the U.S. Air Force, NASA was the first significant customer for integrated circuits. Microchips power the world now, of course, but in 1962 they were little more than three years old, and for Apollo they were a brilliant if controversial bet. Even IBM decided against using them in the company's computers in the early 1960s. NASA's demand for integrated circuits, and its insistence on their near-flawless manufacture, helped create the world market for the chips and helped cut the price by 90 percent in five years. NASA was the first organization of any kind -- company or government agency -- anywhere in the world to give computer chips responsibility for human life. If the chips could be depended on to fly astronauts safely to the Moon, they were probably good enough for computers that would run chemical plants or analyze advertising data.

The article also notes that three times as many people worked on Apollo as on the Manhattan Project to create the atomic bomb.
Science

Scientists Took an MRI Scan of an Atom (nytimes.com) 49

The hospital technology, typically used to identify human ailments, captured perhaps the world's smallest magnetic resonance image. weiserfireman shares a report: Different microscopy techniques allow scientists to see everything from the nucleotide-by-nucleotide genetic sequences in cells down to features a couple of atoms across, as seen in an atomic force microscopy image. But scientists at the IBM Almaden Research Center in San Jose, Calif., and the Institute for Basic Science in Seoul have taken imaging a step further, developing a new magnetic resonance imaging technique that provides unprecedented detail, right down to the individual atoms of a sample [Editor's note: the link may be paywalled; alternative source]. The technique relies on the same basic physics behind the M.R.I. scans that are done in hospitals. When doctors want to detect tumors, measure brain function or visualize the structure of joints, they employ huge M.R.I. machines, which apply a magnetic field across the human body. This temporarily disrupts the protons spinning in the nucleus of every atom in every cell. A subsequent, brief pulse of radio-frequency energy causes the protons to spin perpendicular to the pulse. Afterward, the protons return to their normal state, releasing energy that can be measured by sensors and made into an image.

But to gather enough diagnostic data, traditional hospital M.R.I.s must scan billions and billions of protons in a person's body, said Christopher Lutz, a physicist at IBM. So he and his colleagues decided to pack the power of an M.R.I. machine into the tip of another specialized instrument known as a scanning tunneling microscope to see if they could image individual atoms. The tip of a scanning tunneling microscope is just a few atoms wide. As it moves along the surface of a sample, it picks up details about the size and conformation of molecules. The researchers attached magnetized iron atoms to the tip, effectively combining scanning-tunneling microscope and M.R.I. technologies.
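The resonance condition underlying both the hospital scanner and this single-atom variant is the standard Larmor relation: a spin in a magnetic field B_0 precesses at

    \omega_0 = \gamma B_0

where gamma is the gyromagnetic ratio of the spin (about 42.58 MHz per tesla for protons, so a 3-tesla clinical magnet drives protons near 128 MHz). Only a radio-frequency pulse matching this frequency tips the spins, which is why the returned signal can be tied to a specific kind of spin sitting in a specific local field.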

China

Eight of the World's Biggest Technology Service Providers Were Hacked by Chinese Cyber Spies in an Elaborate and Years-Long Invasion (reuters.com) 99

The invasion exploited weaknesses in those companies, their customers, and the Western system of technological defense, Reuters reported on Wednesday. From the report: Hacked by suspected Chinese cyber spies five times from 2014 to 2017, security staff at Swedish telecoms equipment giant Ericsson had taken to naming their response efforts after different types of wine. Pinot Noir began in September 2016. After successfully repelling a wave of attacks a year earlier, Ericsson discovered the intruders were back. And this time, the company's cybersecurity team could see exactly how they got in: through a connection to information-technology services supplier Hewlett Packard Enterprise. Teams of hackers connected to the Chinese Ministry of State Security had penetrated HPE's cloud computing service and used it as a launchpad to attack customers, plundering reams of corporate and government secrets for years in what U.S. prosecutors say was an effort to boost Chinese economic interests.

The hacking campaign, known as "Cloud Hopper," was the subject of a U.S. indictment in December that accused two Chinese nationals of identity theft and fraud. Prosecutors described an elaborate operation that victimized multiple Western companies but stopped short of naming them. A Reuters report at the time identified two: Hewlett Packard Enterprise and IBM. Yet the campaign ensnared at least six more major technology firms, touching five of the world's 10 biggest tech service providers. Also compromised by Cloud Hopper, Reuters has found: Fujitsu, Tata Consultancy Services, NTT Data, Dimension Data, Computer Sciences Corporation and DXC Technology. HPE spun off its services arm in a merger with Computer Sciences Corporation in 2017 to create DXC.

Science

How To Evaluate Computers That Don't Quite Exist (sciencemag.org) 27

sciencehabit writes: To gauge the performance of a supercomputer, computer scientists turn to a standard tool: a set of algorithms called LINPACK that tests how fast the machine solves problems with huge numbers of variables. For quantum computers, which might one day solve certain problems that overwhelm conventional computers, no such benchmarking standard exists. One reason is that the computers, which aim to harness the laws of quantum mechanics to accelerate certain computations, are still rudimentary, with radically different designs contending. In some, the quantum bits, or qubits, needed for computation are embodied in the spin of strings of trapped ions, whereas others rely on patches of superconducting metal resonating with microwaves. Comparing the embryonic architectures "is sort of like visiting a nursery school to decide which of the toddlers will become basketball stars," says Scott Aaronson, a computer scientist at the University of Texas in Austin.
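For a sense of what a LINPACK-style measurement looks like in practice, here is a minimal Python sketch: time a dense linear solve and convert the known operation count into a rate. (The matrix size is arbitrary, and real LINPACK runs are far more carefully tuned; this illustrates the idea rather than reproducing the benchmark.)

    import time
    import numpy as np

    n = 2000
    rng = np.random.default_rng(1)
    A = rng.normal(size=(n, n))
    b = rng.normal(size=n)

    start = time.perf_counter()
    x = np.linalg.solve(A, b)  # LU factorization plus triangular solves
    elapsed = time.perf_counter() - start

    flops = (2 / 3) * n ** 3   # dominant cost of LU on an n-by-n matrix
    print(f"residual: {np.linalg.norm(A @ x - b):.2e}")
    print(f"~{flops / elapsed / 1e9:.1f} GFLOP/s")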

Yet researchers are making some of their first attempts to take the measure of quantum computers. Last week, Margaret Martonosi, a computer scientist at Princeton University, and colleagues presented a head-to-head comparison of quantum computers from IBM, Rigetti Computing in Berkeley, California, and the University of Maryland (UMD) in College Park. The UMD machine, which uses trapped ions, ran a majority of 12 test algorithms more accurately than the superconducting machines, the team reported at the International Symposium on Computer Architecture in Phoenix. Christopher Monroe, a UMD physicist and founder of the company IonQ, predicts such comparisons will become the standard. "These toy algorithms give you a simple answer -- did it work or not?" But even Martonosi warns against making too much of the tests. In fact, the analysis underscores how hard it is to compare quantum computers -- which leaves room for designers to choose metrics that put their machines in a favorable light.
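To make "did it work or not?" concrete, here is a minimal state-vector sketch of the kind of toy test such comparisons run: prepare a two-qubit Bell state and inspect the output distribution. An ideal machine splits the probability 50/50 between |00> and |11>; a benchmark scores real hardware by how far its measured counts deviate from that. (Pure numpy, simulating the ideal case; this is not the Martonosi team's actual test suite.)

    import numpy as np

    # Single-qubit Hadamard and two-qubit CNOT (control = qubit 0).
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.zeros(4)
    state[0] = 1.0                          # start in |00>
    state = CNOT @ (np.kron(H, I) @ state)  # H on qubit 0, then CNOT

    probs = np.abs(state) ** 2
    for basis, p in zip(["00", "01", "10", "11"], probs):
        print(f"|{basis}>: {p:.3f}")        # ideal: 0.500, 0.000, 0.000, 0.500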
