Australia

Chinese Intelligence Compiles 'Vast Database' About Millions Around the World (abc.net.au) 75

Australia's national public broadcaster ABC reports: A Chinese company with links to Beijing's military and intelligence networks has been amassing a vast database of detailed personal information on thousands of Australians, including prominent and influential figures. A database of 2.4 million people, including more than 35,000 Australians, has been leaked from the Shenzhen company Zhenhua Data, which is believed to be used by China's intelligence service, the Ministry of State Security. Zhenhua counts the People's Liberation Army and the Chinese Communist Party among its main clients.

Information collected includes dates of birth, addresses, marital status, along with photographs, political associations, relatives and social media IDs. It collates Twitter, Facebook, LinkedIn, Instagram and even TikTok accounts, as well as news stories, criminal records and corporate misdemeanours. While much of the information has been "scraped," some profiles have information which appears to have been sourced from confidential bank records, job applications and psychological profiles.

The company is believed to have sourced some of its information from the so-called "dark web". One intelligence analyst said the database was "Cambridge Analytica on steroids", referring to the trove of personal information sourced from Facebook profiles in the lead up to the 2016 US election campaign. But this data dump goes much further, suggesting a complex global operation using artificial intelligence to trawl publicly available data to create intricate profiles of individuals and organisations, potentially probing for compromise opportunities.

Zhenhua Data's chief executive Wang Xuefeng, a former IBM employee, has used Chinese social media app WeChat to endorse waging "hybrid warfare" through manipulation of public opinion and "psychological warfare"....

The database was leaked to a US academic, who worked with Canberra cyber security company Internet 2.0 and "was able to restore 10 per cent of the 2.4 million records for individuals...

"Of the 250,000 records recovered, there are 52,000 on Americans, 35,000 Australians, 10,000 Indian, 9,700 British, 5,000 Canadians, 2,100 Indonesians, 1,400 Malaysia and 138 from Papua New Guinea."
Java

Oracle's Plan to Keep Java Developers from Leaving for Rust and Kotlin (zdnet.com) 90

ZDNet reports: Oracle has released version 15 of Java, the language created 25 years ago by James Gosling at Sun Microsystems, which Oracle snapped up in 2009 for about $7.4bn to gain what it called the "most important software Oracle has ever acquired". Java 15, or Oracle Java Development Kit (JDK) 15, brings the Edwards-Curve digital signature algorithm, hidden classes, and formerly previewed features that are now final, including text blocks and the Z Garbage Collector, while sealed classes arrive as a preview and pattern matching and records return for a second preview...
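For a sense of what these additions look like in practice, here is a minimal sketch of a few of them: text blocks, which are final in JDK 15, plus records, sealed interfaces, and pattern matching for instanceof, which in JDK 15 still require the --enable-preview flag. The type names are invented for illustration.

```java
// Demo.java: illustrative only. On JDK 15, compile and run with:
//   javac --enable-preview --release 15 Demo.java
//   java  --enable-preview Demo
// (records, sealed types, and pattern matching for instanceof are preview features in 15).

// A sealed interface restricts which types may implement it.
sealed interface Shape permits Circle, Square {}

// Records are compact, immutable data carriers.
record Circle(double radius) implements Shape {}
record Square(double side) implements Shape {}

public class Demo {
    static double area(Shape s) {
        // Pattern matching for instanceof binds the cast result to a variable.
        if (s instanceof Circle c) return Math.PI * c.radius() * c.radius();
        if (s instanceof Square sq) return sq.side() * sq.side();
        throw new IllegalArgumentException("unknown shape");
    }

    public static void main(String[] args) {
        // Text blocks (finalized in JDK 15) make multi-line strings readable.
        String banner = """
                Java 15 feature sketch:
                sealed types, records, pattern matching, text blocks
                """;
        System.out.println(banner);
        System.out.println("circle area = " + area(new Circle(2.0)));
        System.out.println("square area = " + area(new Square(3.0)));
    }
}
```

Text blocks are the only one of these that runs without preview flags on JDK 15; records and pattern matching for instanceof became standard in JDK 16, and sealed classes in JDK 17.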

In July, Java fell out of RedMonk's top two positions for the first time since 2012 and now resides behind JavaScript and Python in terms of popularity. Tiobe in September ranked Java in second position, behind C and ahead of Python.... But Java is still hugely popular and widely used in the enterprise, according to Oracle, which notes it is used by over 69% of full-time developers worldwide... It counts Arm, Amazon, IBM, Intel, NTT Data, Red Hat, SAP and Tencent among its list of notable contributors to JDK 15. Oracle also gave a special mention to Microsoft and cloud system monitoring service DataDog for fixes...

As part of Java's 25th anniversary, Oracle commissioned analyst firm Omdia to assess its six-month release strategy for Java and whether it would be enough to keep millions of Java developers away from memory-safe alternatives such as Kotlin, the language Google has endorsed for Android development, and Rust, a systems programming language created at Mozilla. "In Omdia's opinion, the work Oracle began a few years ago in moving to a six-month update cycle and introducing a new level of modularity, puts the vendor in good stead with its constituency of approximately 12 million developers," Oracle said in its report on Omdia's analysis.

"However, Oracle and the Java programming language need an ongoing series of innovative, must-have, and 'delightful' features that make the language even more user friendly and cloud capable. These will keep existing Java developers happy while steering potential Java developers away from newer languages like Rust and Kotlin."

IBM

IBM Publishes its Quantum Roadmap, Says it Will Have a 1,000-qubit Machine in 2023 (techcrunch.com) 35

IBM today, for the first time, published its road map for the future of its quantum computing hardware. There is a lot to digest here, but the most important news in the short term is that the company believes it is on its way to building a quantum processor with more than 1,000 qubits -- and somewhere between 10 and 50 logical qubits -- by the end of 2023. From a report: Currently, the company's quantum processors top out at 65 qubits. It plans to launch a 127-qubit processor next year and a 433-qubit machine in 2022. To get to this point, IBM is also building a completely new dilution refrigerator to house these larger chips, as well as the technology to connect multiple such units into a system akin to today's multi-core architectures in classical chips. IBM's Dario Gil tells me that the company made a deliberate choice in announcing this road map, likening the moment to the birth of the semiconductor industry.

"If you look at the difference of what it takes to build an industry as opposed to doing a project or doing scientific experiments and moving a field forward, we have had a philosophy that what we needed to do is to build a team that did three things well, in terms of cultures that have to come together. And that was a culture of science, a culture of the road map, and a culture of agile," Gil said. He argues that to reach the ultimate goal of the quantum industry, that is, to build a large-scale, fault-tolerant quantum computer, the company could've taken two different paths. The first would be more like the Apollo program, where everybody comes together, works on a problem for a decade and then all the different pieces come together for this one breakthrough moment.

IBM

IBM Will Feed Four Children For a Day For Every Student Who Masters the Mainframe (ibm.com) 151

This week brings a special event honoring the IBM Z line of mainframes, writes long-time Slashdot reader theodp: As part of this week's IBM Z Day event, looking-for-young-blood IBM is teaming up with tech-backed K-12 CS nonprofits Code.org and CSforALL and calling on students 14-and-up to Master The Mainframe during the 24-hour code-a-thon to open doors to new opportunities with Fortune 500 companies.

"The rewards for participants are substantial," explains Big Blue. "For every student who finishes Level 1, IBM will donate to the UN World Food Programme #ShareTheMeal... In celebration of IBM Z day, we will double the donation for all students that complete Master the Mainframe Level 1 between Sept 15 — 30 2020. Just 1 hour of your time will feed 4 children for a day."
"Through three interactive Levels, you will access a mainframe and get skilled up on the foundations of Mainframe," according to IBM's announcement at MasterTheMainframe.com, "including JCL, Ansible, Python, Unix, COBOL, REXX, all through VS Code. Round it all out with a grand challenge where you craft your own fully-equipped Mainframe creation."

"One mainframe is equivalent to 1,500 x86 servers," the site notes. It also points out that mainframes handle 30 billion transactions every day, "more than the number of Google searches every day" — including 87% of all credit card transactions, nearly $8 trillion payments a year.
IBM

The US Equal Employment Opportunity Commission Confirms a Pattern of Age Discrimination at IBM (propublica.org) 62

The U.S. Equal Employment Opportunity Commission has issued a sweeping decision concluding that IBM engaged in systematic age discrimination between 2013 and 2018, when it shed thousands of older workers in the United States. ProPublica reports: The EEOC finding, contained in an Aug. 31 letter to a group of ex-employees, comes more than two years after ProPublica reported that the company regularly flouted or outflanked laws intended to protect older workers from bias in hiring and firing. The letter says a nationwide EEOC investigation "uncovered top-down messaging from (IBM's) highest ranks directing managers to engage in an aggressive approach to significantly reduce the headcount of older workers to make room for" younger ones.

Employment law experts said the agency's finding could result in IBM facing millions of dollars in settlement payments or a federal lawsuit at a time when the company is under continued competitive pressure and in the midst of a management changeover. Lawyers for some former workers said EEOC investigators have told them the agency decision may apply to more than 6,000 ex-IBM employees, a number that could grow considerably if, as experts say is likely, the agency's finding prompts new, private age discrimination lawsuits.

Wireless Networking

Smart Dust Is Coming. Are You Ready? (forbes.com) 101

"Imagine a world where wireless devices are as small as a grain of salt," writes futurist Bernard Marr in Forbes, describing a technology being researched by companies like IBM, General Electric, and Cisco. "These miniaturized devices have sensors, cameras and communication mechanisms to transmit the data they collect back to a base in order to process.

"Today, you no longer have to imagine it: microelectromechanical systems (MEMS), often called motes, are real and they very well could be coming to a neighborhood near you. Whether this fact excites or strikes fear in you it's good to know what it's all about." Outfitted with miniature sensors, MEMS can detect everything from light to vibrations to temperature. With an incredible amount of power packed into its small size, MEMS combine sensing, an autonomous power supply, computing and wireless communication in a space that is typically only a few millimeters in volume. With such a small size, these devices can stay suspended in an environment just like a particle of dust. They can:

- Collect data including acceleration, stress, pressure, humidity, sound and more from sensors

- Process the data with what amounts to an onboard computer system

- Store the data in memory

- Wirelessly communicate the data to the cloud, a base or other MEMS
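To make the sense-process-store-transmit cycle above concrete, here is a small, purely conceptual sketch; the class and method names are invented, the sensor and radio are simulated, and a real mote would run far more constrained firmware rather than a JVM.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Random;

// Conceptual sketch of a smart-dust mote's duty cycle; all names are invented
// and the "sensor" and "radio" are simulated here.
public class MoteSketch {
    record Reading(long timestampMillis, double temperatureC, double accelerationG) {}

    private static final Random SENSOR = new Random();               // stands in for MEMS sensors
    private static final Deque<Reading> MEMORY = new ArrayDeque<>(); // tiny onboard buffer

    static Reading sense() {
        // 1. Collect data (temperature, acceleration, ...) from sensors.
        return new Reading(System.currentTimeMillis(),
                20 + SENSOR.nextGaussian(), SENSOR.nextGaussian() * 0.1);
    }

    static boolean process(Reading r) {
        // 2. Process on the onboard computer: keep only "interesting" readings.
        return Math.abs(r.accelerationG()) > 0.15;
    }

    static void store(Reading r) {
        // 3. Store in memory, discarding the oldest entry when the buffer is full.
        if (MEMORY.size() >= 8) MEMORY.removeFirst();
        MEMORY.addLast(r);
    }

    static void transmit() {
        // 4. Wirelessly communicate buffered data to a base station or the cloud
        //    (here we just print; a real mote would use a low-power radio).
        MEMORY.forEach(r -> System.out.println("uplink: " + r));
        MEMORY.clear();
    }

    public static void main(String[] args) {
        for (int cycle = 0; cycle < 50; cycle++) {
            Reading r = sense();
            if (process(r)) store(r);
            if (cycle % 10 == 9) transmit();   // batch uplinks to save power
        }
    }
}
```

Batching the uplinks, as in the loop above, mirrors how real motes conserve their tiny power budget by keeping the radio off most of the time.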

Since the components that make up these devices are 3D printed as one piece on a commercially available 3D printer, an incredible amount of complexity can be handled, and some manufacturing barriers that previously restricted how small devices could be made have been overcome. The optical lenses created for these miniaturized sensors can achieve very fine image quality.

The potential of smart dust to collect information about any environment in incredible detail could impact plenty of things in a variety of industries from safety to compliance to productivity. It's like multiplying the internet of things technology millions or billions of times over.

Google

To Assuage Fears of Google Domination, Istio Restructures Its Steering Committee (thenewstack.io) 10

An anonymous reader quotes The New Stack: While there are some who may never get over the fact that the Istio service mesh, originally created by Google and IBM, will not be handed over to the Cloud Native Computing Foundation, the project took a big step this past week to assuage those who critiqued the project for being under a Google-majority control: Istio has introduced a new Istio steering committee.

According to the blog post, the new steering committee will consist of 13 seats, with four "elected Community Seats" and nine "proportionally allocated Contribution Seats," a change they say "solidifies our commitment to open governance, ensuring that the community around the project will always be able to steer its direction, and that no one company has majority voting control over the project." This final point is the key to the announcement; the post later states explicitly that "no single vendor, no matter how large their contribution, has majority voting control over the Istio project." To this end, they write, they have "implemented a cap on the number of seats a company can hold, such that they can neither unanimously win a vote, or veto a decision of the rest of the committee."

As for how those seats are allocated, the four Community Seats will consist of four representatives from four different organizations and will be chosen in an annual election. The nine Contribution Seats will be assigned to a minimum of three different companies "in proportion to contributions made to Istio in the previous 12 months," with this year's metric being merged pull requests.
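The announcement does not spell out the allocation mechanics, but as a rough illustration, the sketch below hands out contribution seats in proportion to merged pull requests with a per-company cap, using a D'Hondt-style rule. The company names, PR counts, cap value, and the method itself are assumptions for illustration only, not Istio's actual procedure.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: a D'Hondt-style proportional allocation of contribution seats
// with a per-company cap. Companies, PR counts, cap, and method are assumed values;
// Istio's actual allocation rules are defined by the project itself.
public class SeatAllocationSketch {
    public static void main(String[] args) {
        Map<String, Integer> mergedPrs = new LinkedHashMap<>();
        mergedPrs.put("CompanyA", 900);   // hypothetical numbers
        mergedPrs.put("CompanyB", 300);
        mergedPrs.put("CompanyC", 150);
        mergedPrs.put("CompanyD", 50);

        int totalSeats = 9;   // Contribution Seats
        int cap = 5;          // per-company cap (assumed value)

        Map<String, Integer> seats = new LinkedHashMap<>();
        mergedPrs.keySet().forEach(c -> seats.put(c, 0));

        for (int s = 0; s < totalSeats; s++) {
            String best = null;
            double bestScore = -1;
            for (Map.Entry<String, Integer> e : mergedPrs.entrySet()) {
                if (seats.get(e.getKey()) >= cap) continue;                   // enforce the cap
                double score = e.getValue() / (seats.get(e.getKey()) + 1.0);  // D'Hondt quotient
                if (score > bestScore) { bestScore = score; best = e.getKey(); }
            }
            seats.merge(best, 1, Integer::sum);
        }
        // Prints {CompanyA=5, CompanyB=3, CompanyC=1, CompanyD=0} for these inputs.
        System.out.println(seats);
    }
}
```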

But not everyone was satisfied. On Twitter, AWS engineer Matthew S. Wilson called it "a crappy way to build a community," objecting to the way it recognizes and rewards open source contributions by company rather than by individual contributors.

And Knative co-founder Matt Moore called it "what you get when a company wants to 'play community', but treat its employees as interchangeable cogs."

Open Source

'The Future of American Industry Depends On Open Source Tech' (wired.com) 45

An anonymous reader shares an opinion piece from Wired, written by Kevin Xu and Jordan Schneider. Xu is the author of Interconnected, an investor and advisor of open source startups at OSS Capital, and served in the Obama White House. Schneider is the author of the ChinaTalk newsletter and host of the ChinaTalk podcast, posted on Lawfare. From the piece: Open source is a technology development and distribution methodology, where the codebase and all development -- from setting a roadmap to building new features, fixing bugs, and writing documentation -- is done in public. A governing body (a group of hobbyists, a company, or a foundation) publicly manages this work, which is most often done in a public repository on either GitHub or GitLab. Open source has two important, and somewhat counterintuitive, advantages: speed and security. These practices lead to faster technological development, because a built-in global community of developers helps a project mature, especially if the technology is solving a real problem. Top engineers also prefer to work with and on open source projects. Wrongly cast as secretive automatons, they are more often like artists, who prefer to learn, work, collaborate, and showcase what they've built in public, even when they are barely compensated for that work.

But doesn't keeping a technology's codebase open make it more vulnerable to attack? In fact, exposing the codebase publicly for security experts and hackers to easily access and test is the best way to keep the technology secure and build trust with end users for the long haul. Sunlight is the best disinfectant, and open source is that sunlight in technology. Linux, the operating system, and Kubernetes, the cloud container orchestration system, are two of the most prominent examples. [...] Using open source technology is now the fastest way new products get built and legacy technologies get replaced. Yet as US policymakers develop their industrial policy to compete with China, open source is conspicuously absent.

By leaning on the advantages of open source, policymakers can pursue an industrial policy to help the US compete in the 21st century in line with our broader values. The alternative is to continue a top-down process that picks winners and losers based not just on technology but also on political influence, which only helps individual firms secure market share rather than sparking innovation more broadly. A few billion more dollars won't save Intel from its technical woes, but a healthier ecosystem leveraging open source technology and community would put the US in a better position for the future. Open source technology allows for vendor neutrality. Whether you're a country or a company, if you use open source, you're not locked into another company's technical stack, roadmap, or licensing agreements. After Linux was first created in 1991, it was widely adopted by large companies like Dell and IBM as a vendor-neutral alternative to Microsoft's Windows operating system. In the future, with RISC-V, chip designers won't be locked into Intel or ARM. With OpenRAN, 5G network builders won't be forced to buy from Huawei, Nokia, or Ericsson. [...] By doubling down on open source, America not only can address some of our most pressing technological challenges faster and more securely, but also revive relationships with our allies and deepen productive collaborations with the tech sector.

Open Source

Remembering the Golden Age of Computer User Groups (arstechnica.com) 55

Slashdot reader #16,185 wrote regularly for the newsletter of a small-town computer users group. Now they've written an article for Ars Technica reminding readers that "The Homebrew Computer Club where the Apple I got its start is deservedly famous — but it's far from tech history's only community gathering centered on CPUs." Throughout the 70s and into the 90s, groups around the world helped hapless users figure out their computer systems, learn about technology trends, and discover the latest whiz-bang applications. And these groups didn't stick to Slacks, email threads, or forums; the meetings often happened in real life. But to my dismay, many young technically-inclined whippersnappers are completely unaware of computer user groups' existence and their importance in the personal computer's development. That's a damned shame... Computer groups celebrated the industry's fundamental values: a delight in technology's capabilities, a willingness to share knowledge, and a tacit understanding that we're all here to help one another...

Two things primarily made user groups disappear: first was the Internet — and the BBSes that preceded them. If you could articulate a question, you could find a website with the answer. But computers also became easier to use. Once personal computers went mainstream, troubleshooting them stopped being an esoteric endeavor.

The typical computer user group is gone now. For the exceptions, you can find an incomplete and mostly out-of-date list via the Association of PC User Groups, though online exploration may lead you to more options. For example, the Toronto PET Users Group (TPUG) is the longest continually operating Commodore user group. Washington Apple Pi is still going strong, as is the Triangle Linux Users Group. IBM's user group, SHARE, began in the 1950s and continues to support enterprise users, though it's primarily a conference these days...

Hopefully tech will continue to inspire ways to get together with other people who share your enthusiasm, whether it's Raspberry Pi meetups, Maker days, or open source conferences such as Drupalcon or PyCon. You also continue the computer user group ethos by finding ways to help other tech enthusiasts locally. For example, Hack Club aims to teach skills to high school students. Hack Clubs are already in two percent of US high schools across 35 states and 17 countries, with about 10,000 students attending clubs and hackathons each year.

So even if computer user groups largely are a thing of the past, their benefits live on. User groups were the precursor to the open source community, based on the values of sharing knowledge and helping one another. And who knows, without user groups promoting a cooperative viewpoint, the open source community might never have taken off like it did.

The article includes photographs of the OS/2 community's magazine Extended Attributes, the M.A.C.E. Journal (for Atari users), the Commodore Eight Bit Boosters newsletter, and the 1979 publication Prog/80 ("dedicated to the serious programmer.")

And it also includes video of a 1981 visit to the Boston Computer Society by a 25-year-old Bill Gates.

United Kingdom

Bletchley Park Museum To Lay Off a Third of Its Staff (theguardian.com) 48

Long-time Slashdot reader simpz writes: The Guardian is reporting that Bletchley Park Museum is planning to make a third of its staff redundant. This is, of course, the museum of the British wartime codebreakers, including, most famously, Alan Turing.

I personally think Google, Amazon, Microsoft, IBM, Facebook, Twitter etc should all chip in to stop this from happening. Without Alan Turing and others, they wouldn't have a business, and to these companies this is small change.

Privacy

Weather Channel App To Change Practices After LA Lawsuit (apnews.com) 32

The operator of The Weather Channel mobile app has agreed to change how it informs users about its location-tracking practices and sale of personal data as part of a settlement with the Los Angeles city attorney's office, officials said Wednesday. From a report: City Attorney Mike Feuer alleged in a 2019 lawsuit that app users were misled when they agreed to share their location information in exchange for personalized forecasts and alerts. Instead, the lawsuit claimed, users were unaware they had surrendered personal privacy when the company sold their data to third parties. Feuer announced the settlement Wednesday with the app's operator, TWC Product and Technology LLC, and owner IBM. The app's disclosure screens were initially revised after the lawsuit was filed, and further changes, which will be monitored by the city attorney's office, are planned.

"Users will now clearly know that they have the choice to provide access to their locations," Feuer said at a news conference, adding he hopes other companies will follow the app's model for transparency. "It shows that we don't have to sacrifice our privacy for things of value." IBM bought the app along with the digital assets of The Weather Company in 2015 for $2 billion but did not acquire The Weather Channel seen on TV, which is owned by another company.

Programming

'Real' Programming Is an Elitist Myth (wired.com) 283

When people build a database to manage reading lists or feed their neighbors, that's coding -- and culture. From an essay: We are past the New York City Covid-19 peak. Things have started to reopen, but our neighborhood is in trouble, and people are hungry. There's a church that's opened space for a food pantry, a restaurant owner who has given herself to feeding the neighborhood, and lots of volunteers. [...] It's a complex data model. It involves date fields, text fields, integers, notes. You need lots of people to log in, but you need to protect private data too. You'd think their planning conversations would be about making lots of rice. But that is just a data point. The tool the mutual aid group has settled on to track everything is Airtable, a database-as-a-service program. You log in and there's your database. There are a host of tools like this now, "low-code" or "no-code" software with names like Zapier or Coda or Appy Pie. At first glance these tools look like flowcharts married to spreadsheets, but they're powerful ways to build little data-management apps. Airtable in particular keeps showing up everywhere for managing office supplies or scheduling appointments or tracking who at WIRED has their fingers on this column. The more features you use, the more they charge for it, and it can add up quickly. I know because I see the invoices at my company; we use it to track projects.
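To make "a complex data model" concrete, here is a small, hypothetical sketch of the kind of record such a tracking base might hold, written out as code. The field names and types are invented for illustration; the essay describes the real base only loosely.

```java
import java.time.LocalDate;
import java.util.List;

// Illustrative only: what a no-code "deliveries" table might look like if written as code.
// Field names are invented; the real mutual-aid base is described only loosely in the essay.
public class PantrySketch {
    enum Status { REQUESTED, PACKED, DELIVERED }

    // Date fields, text fields, integers, notes: the same primitives a low-code tool exposes.
    record DeliveryRequest(
            LocalDate requestedOn,   // date field
            String householdName,    // text field (private data in the real base)
            int mealsNeeded,         // integer field
            String notes,            // long-text field
            Status status) {}

    public static void main(String[] args) {
        List<DeliveryRequest> table = List.of(
                new DeliveryRequest(LocalDate.of(2020, 9, 14), "Household A", 4,
                        "no-contact dropoff", Status.PACKED),
                new DeliveryRequest(LocalDate.of(2020, 9, 15), "Household B", 2,
                        "vegetarian", Status.REQUESTED));

        // A "view" in a low-code tool is just a filtered, sorted query over the table.
        table.stream()
                .filter(r -> r.status() != Status.DELIVERED)
                .forEach(r -> System.out.println(r.requestedOn() + "  " + r.mealsNeeded()
                        + " meals  " + r.householdName()));
    }
}
```

The point of the essay is that a low-code tool lets someone express exactly this model, its fields, types, and views, without ever writing the class above.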

"Real" coders in my experience have often sneered at this kind of software, even back when it was just FileMaker and Microsoft Access managing the flower shop or tracking the cats at the animal shelter. It's not hard to see why. These tools are just databases with a form-making interface on top, and with no code in between. It reduces software development, in all its complexity and immense profitability, to a set of simple data types and form elements. You wouldn't build a banking system in it or a game. It lacks the features of big, grown-up databases like Oracle or IBM's Db2 or PostgreSQL. And since it is for amateurs, the end result ends up looking amateur. But it sure does work. I've noticed that when software lets nonprogrammers do programmer things, it makes the programmers nervous. Suddenly they stop smiling indulgently and start talking about what "real programming" is. This has been the history of the World Wide Web, for example. Go ahead and tweet "HTML is real programming," and watch programmers show up in your mentions to go, "As if." Except when you write a web page in HTML, you are creating a data model that will be interpreted by the browser. This is what programming is. Code culture can be solipsistic and exhausting. Programmers fight over semicolon placement and the right way to be object-oriented or functional or whatever else will let them feel in control and smarter and more economically safe, and always I want to shout back: Code isn't enough on its own. We throw code away when it runs out its clock; we migrate data to new databases, so as not to lose one precious bit. Code is a story we tell about data.

IBM

IBM Unveils Power10 Processor for Big Data Analytics and AI Workloads (venturebeat.com) 63

At the Hot Chips 2020 conference, which was held virtually this year, IBM announced the IBM Power10. From a report: It's the successor to the Power9 and represents the next generation of the company's processor family. IBM claims that the Power10 delivers up to three times greater efficiency than its predecessor while at the same time delivering higher workload capacity and container density. The Power10 was designed over five years and has the distinction of being IBM's first commercial 7-nanometer processor. (In 2015, IBM, Samsung, and other members of IBM's Research Alliance produced the first test chips as part of a $3 billion R&D investment.) There will be multiple configurations, and while the specifics aren't yet being disclosed, the maximum single-chip-module offering won't exceed 15 SMT8 cores and the dual-chip-module offering won't exceed 30 SMT8 cores, according to IBM distinguished engineer and Power10 architect William Starke.

Open Source

Red Hat, Google, Microsoft, GitHub, and Others Launch the Open Source Security Foundation (infoq.com) 20

InfoQ reports on a new security group that launched last week: Supported by The Linux Foundation, the Open Source Security Foundation (OpenSSF) aims to create a cross-industry forum for a collaborative effort to improve open source software security. The list of initial members includes Google, Microsoft, GitHub, IBM, Red Hat, and more.

"As open source has become more pervasive, its security has become a key consideration for building and maintaining critical infrastructure that supports mission-critical systems throughout our society. It is more important than ever that we bring the industry together in a collaborative and focused effort to advance the state of open source security. The world's technology infrastructure depends on it."

Microsoft CTO for Azure Mark Russinovich explained clearly why open source security must be a community effort:

"Open-source software is inherently community-driven and as such, there is no central authority responsible for quality and maintenance. [...] Open-source software is also vulnerable to attacks against the very nature of the community, such as attackers becoming maintainers of projects and introducing malware. Given the complexity and communal nature of open source software, building better security must also be a community-driven process."

Also joining the group are Intel, IBM, Uber, and VMware, according to the Foundation's inaugural announcement, which promises its governance and decisions "will be transparent, and any specifications and projects developed will be vendor agnostic."

Education

University of Michigan Study Advocates Ban of Facial Recognition in Schools (venturebeat.com) 18

University of Michigan researchers recently published a study showing facial recognition technology in schools has limited efficacy and presents a number of serious problems. From a report: The research was led by Shobita Parthasarathy, director of the university's Science, Technology, and Public Policy (STPP) program, and finds that the technology isn't just ill-suited to security purposes; it can actively promote racial discrimination, normalize surveillance, and erode privacy while marginalizing gender-nonconforming students. The study follows the New York legislature's passage of a moratorium on the use of facial recognition and other forms of biometric identification in schools until 2022. The bill, a response to the Lockport City School District launching a facial recognition system, was among the first in the nation to explicitly regulate or ban use of the technology in schools. That development came after companies including Amazon, IBM, and Microsoft halted or ended the sale of facial recognition products in response to the first wave of Black Lives Matter protests in the U.S.

IT

The Workforce Is About to Change Dramatically (theatlantic.com) 106

"For the first time ever, the world's largest companies are telling hundreds of thousands of workers to stay away from the office for a full year, or longer," notes the Atlantic.

"If, in five years, these edicts have no lingering effects on office culture, that would be awfully strange..." Ambitious engineers, media makers, marketers, PR people, and others may be more inclined to strike out on their own, in part because they will, at some point, look around at their living room and realize: I am alone, and I might as well monetize the fact of my independence. A new era of entrepreneurship may be born in America, supercharged by a dash of social-existential angst.

Or, you know, maybe not. If companies find that remote work is a mess, they might decide to prematurely scrap the experiment, like IBM and Yahoo famously did. It is certainly curious that the most prestigious tech companies now proclaiming the future of working from home were, just seven months ago, outfitting their offices with the finest sushi bars, yoga rooms, and massage rooms...

Nothing is certain, and every new trend incurs a backlash. Telepresence could crush some downtown businesses; but cheaper downtown real estate could also lead to a resurgence in interesting new restaurants. Working from home could lead to more free-agent entrepreneurship; but if companies notice that they're bleeding talent, they'll haul their workforces back to headquarters. Still, even a moderate increase in remote work could lead to fundamental changes in our labor force, economy, and politics. Remote workers will spend more money and time inside their houses; they will spend more time with online communities than with colleagues; and many will distribute themselves across the country, rather than feel it necessary to cluster near semi-optional headquarters.

IBM

Frances E. Allen, the First Woman To Win the Turing Award, Dies At 88 (ibm.com) 19

Frances "Fran" Allen, a pioneer in the world of computing, the first female IBM Fellow and the first woman to win the Turing Award, died on August 4, 2020, the day of her 88th birthday. IBM writes in a blog post remembering Allen: As a pioneer in compiler organization and optimization algorithms, Fran made seminal contributions to the world of computing. Her work on inter-procedural analysis and automatic parallelization continues to be on the leading edge of compiler research. She successfully reduced this science to practice through the transfer of this technology to products such as the STRETCH HARVEST Compiler, the COBOL Compiler, and the Parallel FORTRAN Product. As much as Fran will be remembered for her technical vision and her foundational work in computing, she will equally be remembered for her passion to inspire and mentor others, fostering an environment of perseverance and hard work throughout the IBM community.

Starting as a programmer, Fran's first assignment at IBM was to teach the research community FORTRAN, a complex new language IBM had announced just three months before. This was the start of Fran's career-long focus on compilers for high-performance computing. Following FORTRAN, Fran became one of three designers for IBM's Stretch-Harvest project in the late 1950s and early 1960s. As the language liaison with IBM's client, the National Security Agency (NSA), Fran helped design and build Alpha, a very high-level code-breaking language which featured the ability to create new alphabets beyond the system-defined alphabets.

An Experimental Compiler for IBM's Advanced Computing System (ACS) became her next project. Fran designed and built the machine-independent, language-independent optimizing component of the compiler. The result was a tool to help drive the hardware design and a new way to analyze and transform programs. This work led to Fran's seminal paper on Program Optimization, first published in 1966, describing a robust new framework for implementing program analysis and optimization as well as a powerful set of new algorithms. Fran's 1970 paper on Control Flow analysis introduced the notion of "intervals" and node dominance relations, important improvements over the control flow abstractions given in her earlier paper. Her 1972 paper, "A Catalog of Optimizing Transformations," identified and discussed many of the transformations commonly used today.
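To give a flavor of the control-flow analysis described here, below is a small, illustrative sketch that computes dominance relations for a toy flow graph by straightforward fixed-point iteration. It is not Allen's interval-based method, just a textbook formulation of the same dominance relation, and the example graph is invented.

```java
import java.util.ArrayList;
import java.util.BitSet;
import java.util.List;

// Illustrative only: iterative computation of dominator sets on a tiny control-flow graph.
public class DominanceSketch {
    public static void main(String[] args) {
        // CFG given as predecessor lists; node 0 is the entry.
        // Shape: 0 -> 1, 1 -> 2, 1 -> 3, 2 -> 4, 3 -> 4, 4 -> 1 (a loop back to 1).
        int n = 5;
        List<List<Integer>> preds = List.of(
                List.<Integer>of(), // 0: entry
                List.of(0, 4),      // 1
                List.of(1),         // 2
                List.of(1),         // 3
                List.of(2, 3)       // 4
        );

        // dom(v) starts as "all nodes", except the entry, which dominates only itself.
        List<BitSet> dom = new ArrayList<>();
        for (int v = 0; v < n; v++) {
            BitSet s = new BitSet(n);
            if (v == 0) s.set(0); else s.set(0, n);
            dom.add(s);
        }

        // Iterate to a fixed point: dom(v) = {v} union (intersection of dom(p) over preds p).
        boolean changed = true;
        while (changed) {
            changed = false;
            for (int v = 1; v < n; v++) {
                BitSet next = new BitSet(n);
                next.set(0, n);
                for (int p : preds.get(v)) next.and(dom.get(p));
                next.set(v);
                if (!next.equals(dom.get(v))) { dom.set(v, next); changed = true; }
            }
        }

        for (int v = 0; v < n; v++) System.out.println("dom(" + v + ") = " + dom.get(v));
    }
}
```

Running it prints, for example, dom(4) = {0, 1, 4}: every path from the entry to node 4 passes through nodes 0 and 1, which is exactly the dominance relation a compiler uses to reason about loops and optimization safety.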

United States

White House Unveils Partnership To Boost Quantum Science Education (reuters.com) 67

The White House Office of Science and Technology Policy said on Wednesday the Trump administration is launching a national education partnership to expand access to K-12 quantum information science (QIS) education with major companies and research institutions. From a report: The public-private initiative with the National Science Foundation includes Amazon's Amazon Web Services, Boeing, Alphabet's Google, IBM Corp, Lockheed Martin, Microsoft, the University of Illinois and University of Chicago. The National Science Foundation is also awarding $1 million to QIS education. The initiative is designed in part to help introduce students to quantum information themes before college.

Data Storage

Researchers Use DNA to Store 'The Wizard of Oz' - Translated Into Esperanto (popularmechanics.com) 74

"DNA is millions of times more efficient at storing data than your laptop's magnetic hard drive," reports Popular Mechanics.

"Since DNA can store data far more densely than silicon, you could squeeze all of the data in the world inside just a few grams of it." In a new paper published this week in the journal Proceedings of the National Academy of Sciences, Ilya Finkelstein, an associate professor of molecular biosciences at the University of Texas at Austin and company detail their new error correction method... They were able to store the entirety of The Wizard of Oz, translated into Esperanto, with more accuracy than prior DNA storage methods ever could have. We're on the yellow brick road toward the future of data storage.

Researchers at the University of Texas at Austin are certainly not the first to have encoded a work of art onto strands of DNA... [A] team of researchers from Microsoft and the University of Washington fit 200 megabytes of data onto lengths of DNA, including the entirety of War and Peace. In March 2019, they even came up with the first automated system for storing and retrieving data in the manufactured genetic material. Today, other major technology firms are also working in the space, including both IBM and Google. The ultra-secretive U.S. Intelligence Advanced Research Projects Activity — the government's version of DARPA, but for spies — is even invested in the work. These researchers envision a future where some of the most precious, but rarely accessed data, can be stored in vials of DNA, only pulled down from the cool, dark storage of the lab, as needed....

Because there are four building blocks in DNA, rather than the binary 1s and 0s in magnetic hard drives, the genetic storage method is far more dense, explains John Hawkins, another co-author of the new paper. "A teaspoon of DNA contains so much data it would require about 10 Walmart Supercenter-sized data centers to store using current technology," he tells Popular Mechanics. "Or, as some people like to put it, you could fit the entire internet in a shoe box." Not only that, but DNA is future-proof. Hawkins recalls when CDs were the dominant storage method, back in the 1990s, and they held the promise that their storage could last forever, because plastic does (but scratches can be devastating). Data stored on DNA, on the other hand, can last for hundreds of thousands of years. In fact, there is a whole field of science called archaeogenetics that explores the longevity of DNA to understand the ancient past... DNA storage doesn't require any energy, either — just a cool, dark place to hang out until someone decides to access it. But the greatest advantage, Hawkins says, is that our ability to read and write DNA will never become obsolete....
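The density argument comes from each base carrying two bits of information instead of one. Below is a purely illustrative sketch using a naive mapping (00 to A, 01 to C, 10 to G, 11 to T); real DNA storage schemes, including the error-corrected method in the new paper, use far more elaborate encodings to avoid problematic sequences and survive synthesis and sequencing errors.

```java
import java.nio.charset.StandardCharsets;

// Illustrative only: a naive 2-bits-per-base mapping (00->A, 01->C, 10->G, 11->T).
// Real DNA storage codecs add error correction and avoid homopolymer runs; this is
// just to show why four symbols pack data more densely than two.
public class DnaCodecSketch {
    private static final char[] BASES = {'A', 'C', 'G', 'T'};

    static String encode(byte[] data) {
        StringBuilder dna = new StringBuilder(data.length * 4);
        for (byte b : data) {
            // Each byte yields four bases, most significant bit pair first.
            for (int shift = 6; shift >= 0; shift -= 2) {
                dna.append(BASES[(b >> shift) & 0b11]);
            }
        }
        return dna.toString();
    }

    static byte[] decode(String dna) {
        byte[] out = new byte[dna.length() / 4];
        for (int i = 0; i < dna.length(); i++) {
            int twoBits = "ACGT".indexOf(dna.charAt(i));
            out[i / 4] = (byte) ((out[i / 4] << 2) | twoBits);
        }
        return out;
    }

    public static void main(String[] args) {
        String text = "There's no place like home.";   // a nod to The Wizard of Oz
        String strand = encode(text.getBytes(StandardCharsets.UTF_8));
        System.out.println(strand);
        System.out.println(new String(decode(strand), StandardCharsets.UTF_8));
    }
}
```

At two bits per base, even this naive mapping stores the 27-byte sentence in 108 bases; the striking density figures in the article come from how little physical volume those bases occupy.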

But like all data storage methods, DNA has a few shortcomings as well. The most significant upfront hurdle is cost. Hawkins says that current methods are at roughly the same price point as an Apple Hard Disk 20 in the mid-1980s. Back then, about 20 megabytes of storage — roughly the amount of data in a 15-minute video download — went for about $1,500.

SuSE

SUSE Releases Major Linux Update (zdnet.com) 27

SUSE has released the next version of its flagship operating system, SUSE Linux Enterprise (SLE) 15 Service Pack 2, along with its latest infrastructure management program, SUSE Manager 4.1. ZDNet reports: SLE 15 SP2 is available on the x86-64, Arm, IBM POWER, IBM Z, and LinuxONE hardware architectures. This new Linux server edition is based on the Linux 5.3 kernel, which brings upstream features such as utilization clamping support in the task scheduler and power-efficient userspace waiting. Other new and noteworthy features include:

- Support for migration from openSUSE Leap to SUSE Linux Enterprise Server (SLES). With this, you can try the free, community openSUSE Linux distro, and then, if you find it's a good choice for your business, upgrade to SLES.
- Extended Package Search. Using a new option for Zypper, SUSE's command-line package manager (zypper search-packages), sysadmins can now search across all SUSE repositories, even ones that are not enabled. This makes it easier for administrators to find required software packages.
- SLE Software Development Kit (SDK) is now integrated into SLE. Development packages are packaged alongside regular packages.
- Python 3: SLE 15 offers full support for Python 3 development. SLE still supports Python 2 for the time being.
- 389 Directory Server replaces OpenLDAP as the LDAP directory service.
- Repository Mirroring Tool (RMT) replaces Subscription Management Tool (SMT). RMT allows mirroring SUSE repositories and custom repositories. You can then register systems directly with RMT. In environments with tightened security, RMT can also proxy other RMT servers.
- Better business continuity with improved SLE Live Patching. SUSE claims Live Patching increases system uptime by up to 12 months. SLE Live Patching is also now available for IBM Z and LinuxONE mainframe architectures.

As for SUSE Manager 4.1, this is an improved open-source infrastructure management and automation solution that lowers costs, identifies risk, enhances availability, and reduces complexity in edge, cloud, and data center environments. With SUSE Manager you can keep servers, VMs, containers, and clusters secure, healthy, compliant, and low maintenance, whether in private, public, or hybrid clouds. That's especially important these days, given the IT staffing disruptions caused by the coronavirus pandemic. SUSE Manager 4.1 can also be used with the Salt DevOps program. Its vertical-market sibling, SUSE Manager for Retail 4.1, is optimized and tailored specifically for retail. This release comes with enhancements for small store operations, enhanced offline capabilities and image management over Wi-Fi, and enhanced virtual machine management and monitoring capabilities. At the same time, it can scale retail environments to tens of thousands of end-point devices and help modernize point-of-service rollouts.
