Cloud

Report: Google Plans to Relinquish Control of Open-Source Project Knative (siliconangle.com) 7

"Google LLC is reportedly planning to relinquish direct control over its open-source Knative project to a five-seat steering committee that will have rules to prevent any single organization from having more than two seats," reports SiliconANGLE.

"The plan is designed to stymie criticism that Google is secretly planning to retain control over key open-source projects it has developed, according to a report today on the tech news website The Protocol." Knative is an open-source project first developed by Google that provides components for deploying, running and managing serverless, cloud-native applications on top of Kubernetes, a container management platform that was also built by Google and open-sourced in 2015... Google is planning to make some major changes to Knative's governance structure, according to the report. Seats on the committee will now be held by individuals rather than specific companies, and elections will be held later this year to select two new members. In addition, the report said, Google is considering eventually expanding the committee to seven members as a way to include representatives from Knative's user community.

The plan comes just a few months after Google angered some members of the open-source software community when it reneged on a promise to hand over control of another project, Istio, to the Cloud Native Computing Foundation, a Linux Foundation project that was founded in 2015 to help advance container technology. In July Google said that, instead of transferring Istio to the CNCF, it would create a neutral organization called Open Usage Commons to manage its trademark policies, while control would remain with the project's steering committee.

That decision upset many of Google's partners, most notably IBM Corp., which has also contributed greatly to the development of Istio.

Cloud

IBM To Split Into Two Companies By End of 2021 (arstechnica.com) 88

IBM announced this morning that the company would be spinning off some of its lower-margin lines of business into a new company and focusing on higher-margin cloud services. Ars Technica reports: During an investor call, CEO Arvind Krishna acknowledged that the move was a "significant shift" in how IBM will work, but he positioned it as the latest in a decades-long series of strategic divestments. "We divested networking back in the '90s, we divested PCs back in the 2000s, we divested semiconductors about five years ago because all of them didn't necessarily play into the integrated value proposition," he said. Krishna became CEO in April 2020, replacing former CEO Ginni Rometty (who is now IBM's executive chairman), but the spin-off is the capstone of a multi-year effort to apply some kind of focus to the company's sprawling business model.

The new spin-off doesn't have a formal name yet and is referred to as "NewCo" in IBM's marketing and investor relations material. Under the spin-off plan, the press release claims IBM "will focus on its open hybrid cloud platform, which represents a $1 trillion market opportunity," while NewCo "will immediately be the world's leading managed infrastructure services provider." (This is because NewCo will start life owning the entirety of IBM Global Technology Services' existing managed infrastructure clients, which means about 4,600 accounts, including about 75 percent of the Fortune 100.)

See also: Cringely Predicts IBM 'Disappears Into Red Hat'
AI

Many Top AI Researchers Get Financial Backing From Big Tech (wired.com) 10

A study finds that 58 percent of faculty at four prominent universities have received grants, fellowships, or other financial support from 14 tech firms. From a report: A paper published in July by researchers from the University of Rochester and China's Cheung Kong Graduate School of Business found that Google, DeepMind, Amazon, and Microsoft hired 52 tenure-track professors between 2004 and 2018. It concluded that this "brain drain" has coincided with a drop in the number of students starting AI companies. The growing reach and power of Big Tech prompted Mohamed Abdalla, a graduate student at the University of Toronto, to question how it influences his field in more subtle ways. Together with his brother, also a graduate student, Abdalla looked at how many AI researchers at Stanford, MIT, UC Berkeley, and the University of Toronto have received funding from Big Tech over their careers.

The Abdallas examined the CVs of 135 computer science faculty who work on AI at the four schools, looking for indications that the researcher had received funding from one or more tech companies. For 52 of those, they couldn't make a determination. Of the remaining 83 faculty, they found that 48, or 58 percent, had received funding such as a grant or a fellowship from one of 14 large technology companies: Alphabet, Amazon, Facebook, Microsoft, Apple, Nvidia, Intel, IBM, Huawei, Samsung, Uber, Alibaba, Element AI, or OpenAI. Among a smaller group of faculty who work on AI ethics, they also found that 58 percent had been funded by Big Tech. When any source of funding was counted, including dual appointments, internships, and sabbaticals, 32 out of 33, or 97 percent, had financial ties to tech companies. "There are very few people that don't have some sort of connection to Big Tech," Abdalla says.

The Internet

Are We Headed For 200 Separate Nationally-Controlled Internets? (thehill.com) 80

Roger Cochetti directed internet public policy for IBM from 1994 through 2000 and later served as Senior Vice-President & Chief Policy Officer for VeriSign and Group Policy Director for CompTIA. This week he warned about signs "that the once open, global internet is slowly being replaced by 200, nationally-controlled, separate internets." And, while these separate American, Chinese, Russian, Australian, European, British, and other "internets" may decide to have some things in common with each other, the laws of political gravity will slowly pull them further apart as interest groups in each country lobby for their own concerns within their own country. Moreover, we will probably see the emergence of a global alternate internet before long...

As background, it's important to recognize that — by almost any measure — the global internet is controlled by businesses and non-profits subject to the jurisdiction of the United States government. Within a roughly 1,000-mile strip of land stretching from San Diego to Seattle lie most major internet businesses and network control or standards bodies (and those that aren't there likely lie elsewhere in the United States). So — as the governments of China, Russia and Iran never tire of explaining — while Americans constitute around 310 million out of the world's 4.3 billion internet users (around 8%), the U.S. government exercises influence or control over more than 70% of the internet's controls and services... China's ability to control the internet experience within its borders between roughly 2005 and 2018 taught many other countries that doing so, even if costly, is possible. This lesson was not lost on Russia, Iran, Australia, Turkey, Saudi Arabia, the EU and many other countries, which began developing legal (and sometimes technical) means to control internet content within their borders. This legal/technical nationalization over the past decade was significantly boosted by the realization that it was actually not very difficult for a government to substantially shut down the internet within a territory...

The first major step in the introduction of a new, China-centric internet may have taken place last year when China introduced to the UN's International Telecommunication Union a proposal for a new type of protocol that would connect networks in a way comparable to, but different from, the way that today's internet protocols do. This was quickly dubbed China's New IP, and it has been the subject of major controversy as nations and companies decide how to react. Whether a new China-centric internet is based on a new series of protocols or is simply based on a new set of internet domain names and numbers, it seems likely that this alternate internet will give national governments quite a bit more control over what happens within their territories than does the global, open internet. This feature will attract quite a few national governments to join in — not least Russia, Iran and perhaps Turkey and India.

The combined market power of those participating countries would make it difficult for any global internet business to avoid such a new medium. The likely result would be two parallel, global computer inter-networking systems... which is pretty much what former Google CEO Eric Schmidt predicted.

Government

Singapore Becomes First Country To Use Facial Verification For a National ID Service (bbc.com) 18

"Singapore will be the first country in the world to use facial verification in its national identity scheme," reports the BBC: The biometric check will give Singaporeans secure access to both private and government services. The government's technology agency says it will be "fundamental" to the country's digital economy. It has been trialled with a bank and is now being rolled out nationwide. It not only identifies a person but ensures they are genuinely present. "You have to make sure that the person is genuinely present when they authenticate, that you're not looking at a photograph or a video or a replayed recording or a deepfake," said Andrew Bud, founder and chief executive of iProov, the UK company that is providing the technology...

"Face recognition has all sorts of social implications. Face verification is extremely benign," said Mr Bud. Privacy advocates, however, contend that consent is a low threshold when dealing with sensitive biometric data. "Consent does not work when there is an imbalance of power between controllers and data subjects, such as the one observed in citizen-state relationships," said Ioannis Kouvakas, legal officer with London-based Privacy International....

GovTech Singapore thinks the technology will be good for businesses, because they can use it without having to build the infrastructure themselves. Additionally, Kwok Quek Sin, senior director of national digital identity at GovTech Singapore, said it is better for privacy because companies won't need to collect any biometric data. In fact, they would only see a score indicating how close the scan is to the image the government has on file.

In 1993 William Gibson called Singapore "Disneyland with the death penalty... a relentlessly G-rated experience, micromanaged by a state that has the look and feel of a very large corporation. If IBM had ever bothered to actually possess a physical country, that country might have had a lot in common with Singapore."
Australia

Chinese Intelligence Compiles 'Vast Database' About Millions Around the World (abc.net.au) 75

Australia's national public broadcaster ABC reports: A Chinese company with links to Beijing's military and intelligence networks has been amassing a vast database of detailed personal information on thousands of Australians, including prominent and influential figures. A database of 2.4 million people, including more than 35,000 Australians, has been leaked from the Shenzhen company Zhenhua Data, which is believed to be used by China's intelligence service, the Ministry of State Security. Zhenhua has the People's Liberation Army and the Chinese Communist Party among its main clients.

Information collected includes dates of birth, addresses, marital status, along with photographs, political associations, relatives and social media IDs. It collates Twitter, Facebook, LinkedIn, Instagram and even TikTok accounts, as well as news stories, criminal records and corporate misdemeanours. While much of the information has been "scraped," some profiles have information which appears to have been sourced from confidential bank records, job applications and psychological profiles.

The company is believed to have sourced some of its information from the so-called "dark web". One intelligence analyst said the database was "Cambridge Analytica on steroids", referring to the trove of personal information sourced from Facebook profiles in the lead up to the 2016 US election campaign. But this data dump goes much further, suggesting a complex global operation using artificial intelligence to trawl publicly available data to create intricate profiles of individuals and organisations, potentially probing for compromise opportunities.

Zhenhua Data's chief executive Wang Xuefeng, a former IBM employee, has used Chinese social media app WeChat to endorse waging "hybrid warfare" through manipulation of public opinion and "psychological warfare"....

The database was leaked to a US academic, who worked with Canberra cyber security company Internet 2.0 and "was able to restore 10 per cent of the 2.4 million records for individuals..."

"Of the 250,000 records recovered, there are 52,000 on Americans, 35,000 Australians, 10,000 Indian, 9,700 British, 5,000 Canadians, 2,100 Indonesians, 1,400 Malaysia and 138 from Papua New Guinea."
Java

Oracle's Plan to Keep Java Developers from Leaving for Rust and Kotlin (zdnet.com) 90

ZDNet reports: Oracle has released version 15 of Java, the language created 25 years ago by James Gosling at Sun Microsystems, which Oracle snapped up in 2009 for about $7.4bn to gain what it said was the "most important software Oracle has ever acquired". Java 15, or Oracle Java Development Kit (JDK) 15, brings the Edwards-curve Digital Signature Algorithm, hidden classes, and previously previewed or experimental features that are now final, including text blocks and the Z Garbage Collector, while sealed classes arrive as a preview and pattern matching and records emerge in a second preview...
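
For readers who want to see what those language changes look like in practice, here is a minimal, illustrative sketch (not taken from the ZDNet piece) combining text blocks, records, a sealed interface, and instanceof pattern matching. In JDK 15 the last three are still preview features, so the example needs the preview flag: javac --release 15 --enable-preview Shapes.java, then java --enable-preview Shapes.

```java
// Minimal demo of JDK 15 language features. Records, sealed types, and
// pattern matching for instanceof are preview features in JDK 15, so this
// requires --enable-preview; text blocks are final in this release.
public class Shapes {

    // A sealed interface restricts which types may implement it.
    sealed interface Shape permits Circle, Square {}

    // Records are compact, immutable data carriers (second preview in JDK 15).
    record Circle(double radius) implements Shape {}
    record Square(double side) implements Shape {}

    static double area(Shape s) {
        // Pattern matching for instanceof binds the narrowed value to a variable.
        if (s instanceof Circle c) {
            return Math.PI * c.radius() * c.radius();
        } else if (s instanceof Square sq) {
            return sq.side() * sq.side();
        }
        throw new IllegalArgumentException("unknown shape");
    }

    public static void main(String[] args) {
        // Text blocks (finalized in JDK 15) make multi-line strings readable.
        String banner = """
                JDK 15 feature demo:
                  records, sealed types, pattern matching, text blocks
                """;
        System.out.print(banner);
        System.out.println("circle area = " + area(new Circle(1.0)));
        System.out.println("square area = " + area(new Square(2.0)));
    }
}
```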

In July, Java fell out of RedMonk's top two positions for the first time since 2012 and now resides behind JavaScript and Python in terms of popularity. Tiobe in September ranked Java in second position, behind C and ahead of Python.... But Java is still hugely popular and widely used in the enterprise, according to Oracle, which notes it is used by over 69% of full-time developers worldwide... It counts Arm, Amazon, IBM, Intel, NTT Data, Red Hat, SAP and Tencent among its list of notable contributors to JDK 15. Oracle also gave a special mention to Microsoft and cloud system monitoring service DataDog for fixes...

As part of Java's 25th anniversary, Oracle commissioned analyst firm Omdia to assess its six-month release strategy for Java and whether it would be enough to keep millions of Java developers away from memory-safe alternatives such as Kotlin, the language Google has endorsed for Android development, and Rust, a system programming language that was created at Mozilla. "In Omdia's opinion, the work Oracle began a few years ago in moving to a six-month update cycle and introducing a new level of modularity, puts the vendor in good stead with its constituency of approximately 12 million developers," Oracle said in its report on Omdia's analysis.

"However, Oracle and the Java programming language need an ongoing series of innovative, must-have, and 'delightful' features that make the language even more user friendly and cloud capable. These will keep existing Java developers happy while steering potential Java developers away from newer languages like Rust and Kotlin."

IBM

IBM Publishes its Quantum Roadmap, Says it Will Have a 1,000-qubit Machine in 2023 (techcrunch.com) 35

IBM today, for the first time, published its road map for the future of its quantum computing hardware. There is a lot to digest here, but the most important news in the short term is that the company believes it is on its way to building a quantum processor with more than 1,000 qubits -- and somewhere between 10 and 50 logical qubits -- by the end of 2023. From a report: Currently, the company's quantum processors top out at 65 qubits. It plans to launch a 127-qubit processor next year and a 433-qubit machine in 2022. To get to this point, IBM is also building a completely new dilution refrigerator to house these larger chips, as well as the technology to connect multiple of these units to build a system akin to today's multi-core architectures in classical chips. IBM's Dario Gil tells me that the company made a deliberate choice in announcing this road map and he likened it to the birth of the semiconductor industry.

"If you look at the difference of what it takes to build an industry as opposed to doing a project or doing scientific experiments and moving a field forward, we have had a philosophy that what we needed to do is to build a team that did three things well, in terms of cultures that have to come together. And that was a culture of science, a culture of the road map, and a culture of agile," Gil said. He argues that to reach the ultimate goal of the quantum industry, that is, to build a large-scale, fault-tolerant quantum computer, the company could've taken two different paths. The first would be more like the Apollo program, where everybody comes together, works on a problem for a decade and then all the different pieces come together for this one breakthrough moment.

IBM

IBM Will Feed Four Children For a Day For Every Student Who Masters the Mainframe (ibm.com) 151

This week brings a special event honoring the IBM Z line of mainframes, writes long-time Slashdot reader theodp: As part of this week's IBM Z Day event, looking-for-young-blood IBM is teaming up with tech-backed K-12 CS nonprofits Code.org and CSforALL and calling on students 14-and-up to Master The Mainframe during the 24-hour code-a-thon to open doors to new opportunities with Fortune 500 companies.

"The rewards for participants are substantial," explains Big Blue. "For every student who finishes Level 1, IBM will donate to the UN World Food Programme #ShareTheMeal... In celebration of IBM Z day, we will double the donation for all students that complete Master the Mainframe Level 1 between Sept 15 — 30 2020. Just 1 hour of your time will feed 4 children for a day."
"Through three interactive Levels, you will access a mainframe and get skilled up on the foundations of Mainframe," according to IBM's announcement at MasterTheMainframe.com, "including JCL, Ansible, Python, Unix, COBOL, REXX, all through VS Code. Round it all out with a grand challenge where you craft your own fully-equipped Mainframe creation."

"One mainframe is equivalent to 1,500 x86 servers," the site notes. It also points out that mainframes handle 30 billion transactions every day, "more than the number of Google searches every day" — including 87% of all credit card transactions, nearly $8 trillion payments a year.
IBM

The US Equal Employment Opportunity Commission Confirms a Pattern of Age Discrimination at IBM (propublica.org) 62

The U.S. Equal Employment Opportunity Commission has issued a sweeping decision concluding that IBM engaged in systematic age discrimination between 2013 and 2018, when it shed thousands of older workers in the United States. ProPublica reports: The EEOC finding, contained in an Aug. 31 letter to a group of ex-employees, comes more than two years after ProPublica reported that the company regularly flouted or outflanked laws intended to protect older workers from bias in hiring and firing. The letter says a nationwide EEOC investigation "uncovered top-down messaging from (IBM's) highest ranks directing managers to engage in an aggressive approach to significantly reduce the headcount of older workers to make room for" younger ones.

Employment law experts said the agency's finding could result in IBM facing millions of dollars in settlement payments or a federal lawsuit at a time when the company is under continued competitive pressure and in the midst of a management changeover. Lawyers for some former workers said EEOC investigators have told them the agency decision may apply to more than 6,000 ex-IBM employees, a number that could grow considerably if, as experts say is likely, the agency's finding prompts new, private age discrimination lawsuits.

Wireless Networking

Smart Dust Is Coming. Are You Ready? (forbes.com) 101

"Imagine a world where wireless devices are as small as a grain of salt," writes futurist Bernard Marr in Forbes, describing a technology being researched by companies like IBM, General Electric, and Cisco. "These miniaturized devices have sensors, cameras and communication mechanisms to transmit the data they collect back to a base in order to process.

"Today, you no longer have to imagine it: microelectromechanical systems (MEMS), often called motes, are real and they very well could be coming to a neighborhood near you. Whether this fact excites or strikes fear in you it's good to know what it's all about." Outfitted with miniature sensors, MEMS can detect everything from light to vibrations to temperature. With an incredible amount of power packed into its small size, MEMS combine sensing, an autonomous power supply, computing and wireless communication in a space that is typically only a few millimeters in volume. With such a small size, these devices can stay suspended in an environment just like a particle of dust. They can:

- Collect data including acceleration, stress, pressure, humidity, sound and more from sensors

- Process the data with what amounts to an onboard computer system

- Store the data in memory

- Wirelessly communicate the data to the cloud, a base or other MEMS

Since the components that make up these devices are 3D printed as one piece on a commercially available 3D printer, an incredible amount of complexity can be handled and some previous manufacturing barriers that restricted how small you can make things were overcome. The optical lenses that are created for these miniaturized sensors can achieve the finest quality images.

The potential of smart dust to collect information about any environment in incredible detail could impact plenty of things in a variety of industries from safety to compliance to productivity. It's like multiplying the internet of things technology millions or billions of times over.
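
As a purely hypothetical illustration of that collect, process, store, and communicate cycle, the following self-contained simulation sketches what a mote's duty loop might look like. The sensor values, the anomaly threshold, and the "uplink" are all invented for the example and do not correspond to any real MEMS hardware or API.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Random;

// Purely illustrative simulation of a mote's sense -> process -> store -> transmit
// cycle. Nothing here maps to real smart-dust hardware; readings are random numbers.
public class MoteSimulation {

    static final Deque<String> localMemory = new ArrayDeque<>(); // stands in for on-mote storage
    static final Random sensor = new Random(42);                 // stands in for MEMS sensors

    public static void main(String[] args) {
        for (int cycle = 0; cycle < 5; cycle++) {
            // 1. Collect: sample pretend acceleration and temperature readings.
            double acceleration = sensor.nextGaussian();
            double temperature = 20 + sensor.nextGaussian() * 2;

            // 2. Process: a trivial on-board decision, e.g. flag unusual vibration.
            boolean anomaly = Math.abs(acceleration) > 1.5;

            // 3. Store: keep a compact record in local memory.
            localMemory.add(String.format("cycle=%d accel=%.2f temp=%.1f anomaly=%b",
                    cycle, acceleration, temperature, anomaly));

            // 4. Communicate: only "transmit" (print) when something is worth sending,
            //    to conserve the mote's tiny power budget.
            if (anomaly) {
                System.out.println("uplink -> " + localMemory.peekLast());
            }
        }
        System.out.println("stored records: " + localMemory.size());
    }
}
```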

Google

To Assuage Fears of Google Domination, Istio Restructures Its Steering Committee (thenewstack.io) 10

An anonymous reader quotes The New Stack: While there are some who may never get over the fact that the Istio service mesh, originally created by Google and IBM, will not be handed over to the Cloud Native Computing Foundation, the project took a big step this past week to assuage those who critiqued it for being under Google-majority control: Istio has introduced a new steering committee.

According to the blog post, the new steering committee will consist of 13 seats, with four "elected Community Seats" and nine "proportionally allocated Contribution Seats," a change they say "solidifies our commitment to open governance, ensuring that the community around the project will always be able to steer its direction, and that no one company has majority voting control over the project." This final point is really the key to the announcement here, with them further and more explicitly clarifying later that "no single vendor, no matter how large their contribution, has majority voting control over the Istio project." To this end, they write, they have "implemented a cap on the number of seats a company can hold, such that they can neither unanimously win a vote, or veto a decision of the rest of the committee."

As for how those seats are allocated, the four Community Seats will consist of four representatives from four different organizations and will be chosen in an annual election. The nine Contribution Seats will be assigned to a minimum of three different companies "in proportion to contributions made to Istio in the previous 12 months," with this year's metric being merged pull requests.
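
To make the "proportional but capped" idea concrete, here is a rough sketch of one way such an allocation could be computed, using largest-remainder apportionment with a per-company ceiling. This is not Istio's actual procedure (that is defined in the project's governance documents), and the company names, merged-PR counts, and cap value below are invented for illustration.

```java
import java.util.*;

// Hypothetical proportional seat allocation with a per-company cap, in the
// spirit of the scheme described above. All inputs are made up.
public class SeatAllocation {

    static Map<String, Integer> allocate(Map<String, Integer> mergedPrs,
                                         int totalSeats, int capPerCompany) {
        int totalPrs = mergedPrs.values().stream().mapToInt(Integer::intValue).sum();
        Map<String, Integer> seats = new LinkedHashMap<>();
        Map<String, Double> remainders = new LinkedHashMap<>();

        // First pass: floor of each company's proportional share, clamped to the cap.
        int assigned = 0;
        for (var e : mergedPrs.entrySet()) {
            double exact = (double) totalSeats * e.getValue() / totalPrs;
            int floor = Math.min((int) exact, capPerCompany);
            seats.put(e.getKey(), floor);
            remainders.put(e.getKey(), exact - floor);
            assigned += floor;
        }

        // Second pass: hand out remaining seats by largest remainder, skipping
        // companies already at the cap. (A single pass suffices for this sketch.)
        List<String> byRemainder = new ArrayList<>(remainders.keySet());
        byRemainder.sort((a, b) -> Double.compare(remainders.get(b), remainders.get(a)));
        for (String company : byRemainder) {
            if (assigned >= totalSeats) break;
            if (seats.get(company) < capPerCompany) {
                seats.put(company, seats.get(company) + 1);
                assigned++;
            }
        }
        return seats;
    }

    public static void main(String[] args) {
        // Hypothetical merged-PR counts for the previous 12 months.
        Map<String, Integer> prs = new LinkedHashMap<>();
        prs.put("CompanyA", 600);
        prs.put("CompanyB", 250);
        prs.put("CompanyC", 150);
        // 9 Contribution Seats, cap of 4 per company: {CompanyA=4, CompanyB=3, CompanyC=3}
        System.out.println(allocate(prs, 9, 4));
    }
}
```

With these made-up numbers the dominant contributor is held to the cap and the leftover seats flow to the smaller contributors, which is the behavior the governance change is aiming for.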

But not everyone was satisfied. On Twitter, AWS engineer Matthew S. Wilson called it "a crappy way to build a community," objecting to the way it recognizes and rewards open source contributions by company rather than by individual contributors.

And Knative co-founder Matt Moore called it "what you get when a company wants to 'play community', but treat its employees as interchangeable cogs."
Open Source

'The Future of American Industry Depends On Open Source Tech' (wired.com) 45

An anonymous reader shares an opinion piece from Wired, written by Kevin Xu and Jordan Schneider. Xu is the author of Interconnected, investor and advisor of open source startups at OSS Capital, and served in the Obama White House. Schneider is the author of the ChinaTalk newsletter and host of the ChinaTalk podcast, posted on Lawfare. From the report: Open source is a technology development and distribution methodology, where the codebase and all development -- from setting a roadmap to building new features, fixing bugs, and writing documentation -- is done in public. A governing body (a group of hobbyists, a company, or a foundation) publicly manages this work, which is most often done in a public repository on either GitHub or GitLab. Open source has two important, and somewhat counterintuitive, advantages: speed and security. These practices lead to faster technological developments, because a built-in global community of developers help them mature, especially if the technology is solving a real problem. Top engineers also prefer to work with and on open source projects. Wrongly cast as secretive automatons, they are more often like artists, who prefer to learn, work, collaborate, and showcase what they've built in public, even when they are barely compensated for that work.

But doesn't keeping a technology's codebase open make it more vulnerable to attack? In fact, exposing the codebase publicly for security experts and hackers to easily access and test is the best way to keep the technology secure and build trust with end users for the long haul. Sunlight is the best disinfectant, and open source is that sunlight in technology. Linux, the operating system, and Kubernetes, the cloud container orchestration system, are two of the most prominent examples. [...] Using open source technology is now the fastest way new products get built and legacy technologies get replaced. Yet as US policymakers develop their industrial policy to compete with China, open source is conspicuously absent.

By leaning on the advantages of open source, policymakers can pursue an industrial policy to help the US compete in the 21st century in line with our broader values. The alternative is to continue a top-down process that picks winners and losers based on not just technology but also political influence, which only helps individual firms secure market share, not sparking innovation more broadly. A few billion more dollars won't save Intel from its technical woes, but a healthier ecosystem leveraging open source technology and community would put the US in a better position for the future. Open source technology allows for vendor-neutrality. Whether you're a country or a company, if you use open source, you're not locked in to another company's technical stack, roadmap, or licensing agreements. After Linux was first created in 1991, it was widely adopted by large companies like Dell and IBM as a vendor neutral alternative to Microsoft's Windows operating system. In the future, chip designers won't be locked into Intel or ARM with RISC-V. With OpenRAN, 5G network builders won't be forced to buy from Huawei, Nokia, or Ericsson. [...] By doubling down on open source, America not only can address some of our most pressing technological challenges faster and more securely, but also revive relationships with our allies and deepen productive collaborations with the tech sector.

Open Source

Remembering the Golden Age of Computer User Groups (arstechnica.com) 55

Slashdot reader #16,185 wrote regularly for the newsletter of a small-town computer users group. Now they've written an article for Ars Technica reminding readers that "The Homebrew Computer Club where the Apple I got its start is deservedly famous — but it's far from tech history's only community gathering centered on CPUs." Throughout the 70s and into the 90s, groups around the world helped hapless users figure out their computer systems, learn about technology trends, and discover the latest whiz-bang applications. And these groups didn't stick to Slacks, email threads, or forums; the meetings often happened in real life. But to my dismay, many young technically-inclined whippersnappers are completely unaware of computer user groups' existence and their importance in the personal computer's development. That's a damned shame... Computer groups celebrated the industry's fundamental values: a delight in technology's capabilities, a willingness to share knowledge, and a tacit understanding that we're all here to help one another...

Two things primarily made user groups disappear: first was the Internet — and the BBSes that preceded them. If you could articulate a question, you could find a website with the answer. But computers also became easier to use. Once personal computers went mainstream, troubleshooting them stopped being an esoteric endeavor.

The typical computer user group is gone now. For the exceptions, you can find an incomplete and mostly out-of-date list via the Association of PC User Groups, though online exploration may lead you to more options. For example, the Toronto PET Users Group (TPUG) is the longest continually operating Commodore user group. Washington Apple Pi is still going strong, as is the Triangle Linux Users Group. IBM's user group, SHARE, began in the 1950s and continues to support enterprise users, though it's primarily a conference these days...

Hopefully tech will continue to inspire ways to get together with other people who share your enthusiasm, whether it's Raspberry Pi meetups, Maker days, or open source conferences such as Drupalcon or PyCon. You also continue the computer user group ethos by finding ways to help other tech enthusiasts locally. For example, Hack Club aims to teach skills to high school students. Hack Clubs are already in two percent of US high schools across 35 states and 17 countries, with about 10,000 students attending clubs and hackathons each year.

So even if computer user groups largely are a thing of the past, their benefits live on. User groups were the precursor to the open source community, based on the values of sharing knowledge and helping one another. And who knows, without user groups promoting a cooperative viewpoint, the open source community might never have taken off like it did.

The article includes photographs of the OS/2 community's magazine Extended Attributes, the M.A.C.E. Journal (for Atari users), the Commodore Eight Bit Boosters newsletter, and the 1979 publication Prog/80 ("dedicated to the serious programmer.")

And it also includes video of a 1981 visit to the Boston Computer Society by a 25-year-old Bill Gates.
United Kingdom

Bletchley Park Museum To Lay Off a Third of Its Staff (theguardian.com) 48

Long-time Slashdot reader simpz writes: The Guardian is reporting that Bletchley Park Museum is planning to make a third of its staff redundant. This is, of course, the museum of the British wartime codebreakers, including, most famously, Alan Turing.

I personally think Google, Amazon, Microsoft, IBM, Facebook, Twitter etc should all chip in to stop this from happening. Without Alan Turing and others, they wouldn't have a business, and to these companies this is small change.

Privacy

Weather Channel App To Change Practices After LA Lawsuit (apnews.com) 32

The operator of The Weather Channel mobile app has agreed to change how it informs users about its location-tracking practices and sale of personal data as part of a settlement with the Los Angeles city attorney's office, officials said Wednesday. From a report: City Attorney Mike Feuer alleged in a 2019 lawsuit that app users were misled when they agreed to share their location information in exchange for personalized forecasts and alerts. Instead, the lawsuit claimed users were unaware they had surrendered personal privacy when the company sold their data to third parties. Feuer announced the settlement Wednesday with the app's operator, TWC Product and Technology LLC, and owner IBM. The app's disclosure screens were initially revised after the lawsuit was filed, and further changes, which will be monitored by the city attorney's office, are planned.

"Users will now clearly know that they have the choice to provide access to their locations," Feuer said at a news conference, adding he hopes other companies will follow the app's model for transparency. "It shows that we don't have to sacrifice our privacy for things of value." IBM bought the app along with the digital assets of The Weather Company in 2015 for $2 billion but did not acquire The Weather Channel seen on TV, which is owned by another company.

Programming

'Real' Programming Is an Elitist Myth (wired.com) 283

When people build a database to manage reading lists or feed their neighbors, that's coding -- and culture. From an essay: We are past the New York City Covid-19 peak. Things have started to reopen, but our neighborhood is in trouble, and people are hungry. There's a church that's opened space for a food pantry, a restaurant owner who has given herself to feeding the neighborhood, and lots of volunteers. [...] It's a complex data model. It involves date fields, text fields, integers, notes. You need lots of people to log in, but you need to protect private data too. You'd think their planning conversations would be about making lots of rice. But that is just a data point. The tool the mutual aid group has settled on to track everything is Airtable, a database-as-a-service program. You log in and there's your database. There are a host of tools like this now, "low-code" or "no-code" software with names like Zapier or Coda or Appy Pie. At first glance these tools look like flowcharts married to spreadsheets, but they're powerful ways to build little data-management apps. Airtable in particular keeps showing up everywhere for managing office supplies or scheduling appointments or tracking who at WIRED has their fingers on this column. The more features you use, the more they charge for it, and it can add up quickly. I know because I see the invoices at my company; we use it to track projects.

"Real" coders in my experience have often sneered at this kind of software, even back when it was just FileMaker and Microsoft Access managing the flower shop or tracking the cats at the animal shelter. It's not hard to see why. These tools are just databases with a form-making interface on top, and with no code in between. It reduces software development, in all its complexity and immense profitability, to a set of simple data types and form elements. You wouldn't build a banking system in it or a game. It lacks the features of big, grown-up databases like Oracle or IBM's Db2 or PostgreSQL. And since it is for amateurs, the end result ends up looking amateur. But it sure does work. I've noticed that when software lets nonprogrammers do programmer things, it makes the programmers nervous. Suddenly they stop smiling indulgently and start talking about what "real programming" is. This has been the history of the World Wide Web, for example. Go ahead and tweet "HTML is real programming," and watch programmers show up in your mentions to go, "As if." Except when you write a web page in HTML, you are creating a data model that will be interpreted by the browser. This is what programming is. Code culture can be solipsistic and exhausting. Programmers fight over semicolon placement and the right way to be object-oriented or functional or whatever else will let them feel in control and smarter and more economically safe, and always I want to shout back: Code isn't enough on its own. We throw code away when it runs out its clock; we migrate data to new databases, so as not to lose one precious bit. Code is a story we tell about data.

IBM

IBM Unveils Power10 Processor for Big Data Analytics and AI Workloads (venturebeat.com) 63

At the Hot Chips 2020 conference, which was held virtually this year, IBM announced the IBM Power10. From a report: It's the successor to the Power9 and represents the next generation of the company's processor family. IBM claims that the Power10 delivers up to three times greater efficiency than its predecessor while at the same time delivering higher workload capacity and container density. The Power10 was designed over five years and has the distinction of being IBM's first commercial 7-nanometer processor. (In 2015, IBM, Samsung, and other members of IBM's Research Alliance produced the first test chips as part of a $3 billion R&D investment.) There will be multiple configurations, and while the specifics aren't yet being disclosed, the maximum single-chip-module offering won't exceed 15 SMT8 cores and the dual-chip-module offering won't exceed 30 SMT8 cores, according to IBM distinguished engineer and Power10 architect William Starke.
Open Source

Red Hat, Google, Microsoft, GitHub, and Others Launch the Open Source Security Foundation (infoq.com) 20

InfoQ reports on a new security group that launched last week: Supported by The Linux Foundation, the Open Source Security Foundation (OpenSSF) aims to create a cross-industry forum for a collaborative effort to improve open source software security. The list of initial members includes Google, Microsoft, GitHub, IBM, Red Hat, and more.

"As open source has become more pervasive, its security has become a key consideration for building and maintaining critical infrastructure that supports mission-critical systems throughout our society. It is more important than ever that we bring the industry together in a collaborative and focused effort to advance the state of open source security. The world's technology infrastructure depends on it."

Microsoft CTO for Azure Mark Russinovich explained clearly why open source security must be a community effort:

"Open-source software is inherently community-driven and as such, there is no central authority responsible for quality and maintenance. [...] Open-source software is also vulnerable to attacks against the very nature of the community, such as attackers becoming maintainers of projects and introducing malware. Given the complexity and communal nature of open source software, building better security must also be a community-driven process."

Also joining the group are Intel, IBM, Uber, and VMware, according to the Foundation's inaugural announcement, which promises its governance and decisions "will be transparent, and any specifications and projects developed will be vendor agnostic."
Education

University of Michigan Study Advocates Ban of Facial Recognition in Schools (venturebeat.com) 18

University of Michigan researchers recently published a study showing facial recognition technology in schools has limited efficacy and presents a number of serious problems. From a report: The research was led by Shobita Parthasarathy, director of the university's Science, Technology, and Public Policy (STPP) program, and finds the technology isn't just ill-suited to security purposes, it can actively promote racial discrimination, normalize surveillance, and erode privacy while marginalizing gender nonconforming students. The study follows the New York legislature's passage of a moratorium on the use of facial recognition and other forms of biometric identification in schools until 2022. The bill, a response to the Lockport City School District launching a facial recognition system, was among the first in the nation to explicitly regulate or ban use of the technology in schools. That development came after companies including Amazon, IBM, and Microsoft halted or ended the sale of facial recognition products in response to the first wave of Black Lives Matter protests in the U.S.
