IT

The Workforce Is About to Change Dramatically (theatlantic.com) 106

"For the first time ever, the world's largest companies are telling hundreds of thousands of workers to stay away from the office for a full year, or longer," notes the Atlantic.

"If, in five years, these edicts have no lingering effects on office culture, that would be awfully strange..." Ambitious engineers, media makers, marketers, PR people, and others may be more inclined to strike out on their own, in part because they will, at some point, look around at their living room and realize: I am alone, and I might as well monetize the fact of my independence. A new era of entrepreneurship may be born in America, supercharged by a dash of social-existential angst.

Or, you know, maybe not. If companies find that remote work is a mess, they might decide to prematurely scrap the experiment, like IBM and Yahoo famously did. It is certainly curious that the most prestigious tech companies now proclaiming the future of working from home were, just seven months ago, outfitting their offices with the finest sushi bars, yoga rooms, and massage rooms...

Nothing is certain, and every new trend incurs a backlash. Telepresence could crush some downtown businesses; but cheaper downtown real estate could also lead to a resurgence in interesting new restaurants. Working from home could lead to more free-agent entrepreneurship; but if companies notice that they're bleeding talent, they'll haul their workforces back to headquarters. Still, even a moderate increase in remote work could lead to fundamental changes in our labor force, economy, and politics. Remote workers will spend more money and time inside their houses; they will spend more time with online communities than with colleagues; and many will distribute themselves across the country, rather than feel it necessary to cluster near semi-optional headquarters.

IBM

Frances E. Allen, the First Woman To Win the Turing Award, Dies At 88 (ibm.com) 19

Frances "Fran" Allen, a pioneer in the world of computing, the first female IBM Fellow and the first woman to win the Turing Award, died on August 4, 2020, the day of her 88th birthday. IBM writes in a blog post remembering Allen: As a pioneer in compiler organization and optimization algorithms, Fran made seminal contributions to the world of computing. Her work on inter-procedural analysis and automatic parallelization continues to be on the leading edge of compiler research. She successfully reduced this science to practice through the transfer of this technology to products such as the STRETCH HARVEST Compiler, the COBOL Compiler, and the Parallel FORTRAN Product. As much as Fran will be remembered for her technical vision and her foundational work in computing, she will equally be remembered for her passion to inspire and mentor others, fostering an environment of perseverance and hard work throughout the IBM community.

Starting as a programmer, Fran's first assignment at IBM was to teach the research community FORTRAN, a complex new language IBM had announced just three months before. This was the start of Fran's career-long focus on compilers for high-performance computing. Following FORTRAN, Fran became one of three designers for IBM's Stretch-Harvest project in the late 1950s and early 1960s. As the language liaison with IBM's client, the National Security Agency (NSA), Fran helped design and build Alpha, a very high-level code-breaking language which featured the ability to create new alphabets beyond the system-defined alphabets.

An Experimental Compiler for IBM's Advanced Computing System (ACS) became her next project. Fran designed and built the machine-independent, language-independent optimizing component of the compiler. The result was a tool to help drive the hardware design and a new way to analyze and transform programs. This work led to Fran's seminal paper on Program Optimization, first published in 1966, describing a robust new framework for implementing program analysis and optimization as well as a powerful set of new algorithms. Fran's 1970 paper on Control Flow analysis introduced the notion of "intervals" and node dominance relations, important improvements over the control flow abstractions given in her earlier paper. Her 1972 paper, "A Catalog of Optimizing Transformations," identified and discussed many of the transformations commonly used today.
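Allen's dominance relations remain central to how compilers reason about control flow: a node d dominates a node n if every path from the entry to n passes through d. As a toy illustration (our own Python sketch of the classic iterative computation, not Allen's original formulation), it fits in a few lines:

```python
def dominators(cfg, entry):
    """Iteratively compute, for each node, the set of nodes that dominate it."""
    nodes = set(cfg)
    dom = {n: set(nodes) for n in nodes}  # start from the full node set...
    dom[entry] = {entry}                  # ...except the entry, which only dominates itself
    changed = True
    while changed:
        changed = False
        for n in nodes - {entry}:
            preds = [p for p in cfg if n in cfg[p]]
            new = {n} | set.intersection(*(dom[p] for p in preds))
            if new != dom[n]:
                dom[n], changed = new, True
    return dom

# A diamond-shaped control-flow graph: A branches to B and C, which rejoin at D.
cfg = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(dominators(cfg, "A"))  # D is dominated only by A and itself, since B and C can each be bypassed
```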

United States

White House Unveils Partnership To Boost Quantum Science Education (reuters.com) 67

The White House Office of Science and Technology Policy said on Wednesday the Trump administration is launching a national education partnership to expand access to K-12 quantum information science (QIS) education with major companies and research institutions. From a report: The public-private initiative with the National Science Foundation includes Amazon's Amazon Web Services, Boeing, Alphabet's Google, IBM Corp, Lockheed Martin, Microsoft, the University of Illinois and University of Chicago. The National Science Foundation is also awarding $1 million to QIS education. The initiative is designed in part to help introduce students to quantum information themes before college.
Data Storage

Researchers Use DNA to Store 'The Wizard of Oz' - Translated Into Esperanto (popularmechanics.com) 74

"DNA is millions of times more efficient at storing data than your laptop's magnetic hard drive," reports Popular Mechanics.

"Since DNA can store data far more densely than silicon, you could squeeze all of the data in the world inside just a few grams of it." In a new paper published this week in the journal Proceedings of the National Academy of Sciences, Ilya Finkelstein, an associate professor of molecular biosciences at the University of Texas at Austin and company detail their new error correction method... They were able to store the entirety of The Wizard of Oz, translated into Esperanto, with more accuracy than prior DNA storage methods ever could have. We're on the yellow brick road toward the future of data storage.

Researchers at the University of Texas at Austin are certainly not the first to have encoded a work of art onto strands of DNA... [A] team of researchers from Microsoft and the University of Washington fit 200 megabytes of data onto lengths of DNA, including the entirety of War and Peace. In March 2019, they even came up with the first automated system for storing and retrieving data in the manufactured genetic material. Today, other major technology firms are also working in the space, including both IBM and Google. The ultra-secretive U.S. Intelligence Advanced Research Projects Activity — the government's version of DARPA, but for spies — is even invested in the work. These researchers envision a future where some of the most precious, but rarely accessed, data can be stored in vials of DNA, only pulled down from the cool, dark storage of the lab as needed....

Because there are four building blocks in DNA, rather than the binary 1s and 0s in magnetic hard drives, the genetic storage method is far denser, explains John Hawkins, another co-author of the new paper. "A teaspoon of DNA contains so much data it would require about 10 Walmart Supercenter-sized data centers to store using current technology," he tells Popular Mechanics. "Or, as some people like to put it, you could fit the entire internet in a shoe box." Not only that, but DNA is future-proof. Hawkins recalls when CDs were the dominant storage method back in the 1990s: they held the promise of lasting forever, because plastic does (though scratches can be devastating). Data stored on DNA, on the other hand, can last for hundreds of thousands of years. In fact, there is a whole field of science called archaeogenetics that explores the longevity of DNA to understand the ancient past... DNA storage doesn't require any energy, either — just a cool, dark place to hang out until someone decides to access it. But the greatest advantage, Hawkins says, is that our ability to read and write DNA will never become obsolete....
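To make the density idea concrete, here's a toy Python codec (purely illustrative; real DNA storage schemes, including the one in the new paper, add the redundancy and error correction that make the medium practical) that packs two bits into each of the four bases:

```python
# Map each 2-bit value to one of DNA's four bases: one byte -> four bases.
BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {b: v for v, b in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA strand, two bits per base."""
    return "".join(BASE_FOR_BITS[(byte >> shift) & 0b11]
                   for byte in data for shift in (6, 4, 2, 0))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand produced by encode()."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

assert decode(encode(b"Oz")) == b"Oz"
print(encode(b"Oz"))  # prints "CATTCTGG"
```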

But like all data storage methods, DNA has a few shortcomings as well. The most significant upfront hurdle is cost. Hawkins says current methods cost roughly what an Apple Hard Disk 20 did when it debuted in 1985. Back then, about 20 megabytes of storage — roughly the amount of data in a 15-minute video — went for about $1,500.

SuSE

SUSE Releases Major Linux Update (zdnet.com) 27

SUSE has released the next version of its flagship operating system, SUSE Linux Enterprise (SLE) 15 Service Pack 2, along with the latest release of its infrastructure management program, SUSE Manager 4.1. ZDNet reports: SLE 15 SP2 is available on the x86-64, Arm, IBM POWER, IBM Z, and LinuxONE hardware architectures. The new server edition is based on the Linux 5.3 kernel, which brings upstream features such as utilization clamping support in the task scheduler and power-efficient userspace waiting. Other new and noteworthy features include:

- Support for migration from openSUSE Leap to SUSE Linux Enterprise Server (SLES). With this, you can try the free, community openSUSE Linux distro, and then, if you find it's a good choice for your business, upgrade to SLES.
- Extended Package Search. Using a new option in Zypper, SUSE's command-line package manager -- zypper search-packages -- sysadmins can now search across all SUSE repositories, even ones that aren't enabled. This makes it easier for administrators to find required software packages.
- SLE Software Development Kit (SDK) is now integrated into SLE. Development packages are packaged alongside regular packages.
- Python 3: SLE 15 offers full support for Python 3 development. SLE still supports Python 2 for the time being.
- 389 Directory Server replaces OpenLDAP as the LDAP directory service.
- Repository Mirroring Tool (RMT) replaces Subscription Management Tool (SMT). RMT allows mirroring SUSE repositories and custom repositories. You can then register systems directly with RMT. In environments with tightened security, RMT can also proxy other RMT servers.
- Better business continuity with improved SLE Live Patching. SUSE claims Live Patching can extend system uptime by up to 12 months between reboots. SLE Live Patching is also now available for IBM Z and LinuxONE mainframe architectures.

As for SUSE Manager 4.1, this is an improved open-source infrastructure management and automation solution that lowers costs, identifies risk, enhances availability, and reduces complexity in edge, cloud, and data center environments. With SUSE Manager you can keep servers, VMs, containers, and clusters secure, healthy, compliant, and low maintenance, whether in private, public, or hybrid clouds. That's especially important these days, given the IT staffing disruptions caused by the coronavirus pandemic. SUSE Manager 4.1 can also be used with the Salt DevOps program. Its vertical-market sibling, SUSE Manager for Retail 4.1, is optimized and tailored specifically for retail. This release comes with enhancements for small store operations, enhanced offline capabilities, image management over Wi-Fi, and enhanced virtual machine management and monitoring capabilities. At the same time, it can scale retail environments to tens of thousands of end-point devices and help modernize point-of-service rollouts.

Open Source

Source Code of Covid Tracker Ireland App Goes Global With Linux Project (siliconrepublic.com) 30

The Linux Foundation Public Health initiative has chosen the Covid Tracker Ireland app as one of its first two open-source Covid-19 projects. From a report: Since its launch, more than 1.3m people have downloaded the Covid Tracker Ireland app, which was developed to help track the future spread of the coronavirus. Now, the app has been chosen as one of the first two open-source contact-tracing projects by the newly established Linux Foundation Public Health (LFPH) initiative. Nearform, the Waterford-based company that developed the app with the HSE, has been made one of the initiative's seven premier members, along with Cisco, Doc.ai, Geometer, IBM, Tencent and VMware. Under the project name 'Covid Green', the source code of the Irish app is being made available for other public health authorities and their developers across the world to use and customise. As part of the agreement, Nearform will manage the source code repository on GitHub. In its announcement, the LFPH pointed to the "extraordinarily high" adoption rate of the Covid Tracker Ireland app.
Open Source

Open Source Proponents React to Google's 'Open Usage Commons' (diginomica.com) 11

Google's announcement of a new open source initiative called the Open Usage Commons "caused some consternation among other open source proponents," according to Diginomica: IBM's reaction is typical. In a statement, the company said that "the creation of the Open Usage Commons (OUC) is disappointing because it doesn't live up to the community's expectation for open governance... Without this vendor-neutral approach to project governance, there will be friction within the community of Kubernetes-related projects...."

Google's underlying reason was that the management of trademarks was an area for legal specialists — something beyond the competence of open source project maintainers. According to Google, the new initiative would address this knowledge gap. "The Open Usage Commons is therefore dedicated to creating a model where everyone in the open source chain — from project maintainers to downstream users to ecosystem companies — has peace of mind around trademark usage and management. The projects in the Open Usage Commons will receive support specific to trademark protection and management, usage guidelines, and conformance testing...."

The Linux Foundation's response... "When trademarks of an open source project are owned by a single company within a community, there is an imbalance of control... The reservation of this exclusive right to exercise such control necessarily undermines the level playing field that is the basis for open governance. This is especially the case where the trademark is used in association with commercial products or solutions."

RedMonk analyst James Governor says that while Google's actions can be seen as provocative, it has gone down an interesting route. "The CNCF community is perhaps justifiably upset given expectations about Google's direction of travel for open source projects, but the creation of a trademark commons is an interesting one. We shall have to see how this plays out. There could be a fork ahead, or it all might be a storm in a teacup."

Security

Iranian Spies Accidentally Leaked Videos of Themselves Hacking (wired.com) 41

An anonymous reader quotes a report from Wired: Researchers at IBM's X-Force security team revealed today that they've obtained roughly five hours of video footage that appears to have been recorded directly from the screens of hackers working for a group IBM calls ITG18, and which other security firms refer to as APT35 or Charming Kitten. It's one of the most active state-sponsored espionage teams linked to the government of Iran. The leaked videos were found among 40 gigabytes of data that the hackers had apparently stolen from victim accounts, including U.S. and Greek military personnel. Other clues in the data suggest that the hackers targeted U.S. State Department staff and an unnamed Iranian-American philanthropist.

The IBM researchers say they found the videos exposed due to a misconfiguration of security settings on a virtual private cloud server they'd observed in previous APT35 activity. The files were all uploaded to the exposed server over a few days in May, just as IBM was monitoring the machine. The videos appear to be training demonstrations the Iran-backed hackers made to show junior team members how to handle hacked accounts. They show the hackers accessing compromised Gmail and Yahoo Mail accounts to download their contents, as well as exfiltrating other Google-hosted data from victims. This sort of data exfiltration and management of hacked accounts is hardly sophisticated hacking. It's more the kind of labor-intensive but relatively simple work that's necessary in a large-scale phishing operation. But the videos nonetheless represent a rare artifact, showing a first-hand view of state-sponsored cyberspying that's almost never seen outside of an intelligence agency.

IBM

IBM Job Ad Calls For a Minimum 12 Years' Experience With Kubernetes -- Which is Six Years Old (theregister.com) 42

IBM's Global Technology Services has posted a job ad calling for candidates with a "minimum 12+ years' experience in Kubernetes administration and management." From a report: Which is a little odd because the first GitHub commit for the project was made on June 7, 2014. And the feature freeze for version 1.0 was announced on May 22, 2015. Sharp-minded Reg readers will have recognised that -- absent time travel -- it is therefore not possible for anyone to have 12 years' experience with Kubernetes. The ad is sadly silent on just how IBM expects candidates will have found the time to accumulate a dozen years' experience in a six-year-old project.
Supercomputing

A Volunteer Supercomputer Team is Hunting for Covid Clues (defenseone.com) 91

The world's fastest computer is now part of "a vast supercomputer-powered search for new findings pertaining to the novel coronavirus' spread" and "how to effectively treat and mitigate it," according to an emerging tech journalist at Nextgov.

It's part of a consortium currently facilitating over 65 active research projects, for which "Dozens of national and international members are volunteering free compute time...providing at least 485 petaflops of capacity and steadily growing, to more rapidly generate new solutions against COVID-19."

"What started as a simple concept has grown to span three continents with over 40 supercomputer providers," Dario Gil, director of IBM Research and consortium co-chair, told Nextgov last week. "In the face of a global pandemic like COVID-19, hopefully a once-in-a-lifetime event, the speed at which researchers can drive discovery is a critical factor in the search for a cure and it is essential that we combine forces...."

[I]ts resources have been used to sort through billions of molecules to identify promising compounds that can be manufactured quickly and tested for potency against the novel coronavirus, produce large data sets to study variations in patient responses, perform airflow simulations on a new device that will allow doctors to use one ventilator to support multiple patients — and more. The complex systems are powering calculations, simulations and results in a matter of days that several scientists have noted would take months on traditional computers.

The Undersecretary for Science at America's Energy Department said "What's really interesting about this from an organizational point of view is that it's basically a volunteer organization."

The article identifies some of the notable participants:
  • IBM was part of the joint launch with America's Office of Science and Technology Policy and its Energy Department.
  • The chief of NASA's Advanced Supercomputing says they're "making the full reserve portion of NASA supercomputing resources available to researchers working on the COVID-19 response, along with providing our expertise and support to port and run their applications on NASA systems."
  • Amazon Web Services "saw a clear opportunity to bring the benefits of cloud... to bear in the race for treatments and a vaccine," according to a company executive.
  • Japan's Fugaku — "which surpassed leading U.S. machines on the Top 500 list of global supercomputers in late June" — also joined the consortium in June.

Other consortium members:

  • Google Cloud
  • Microsoft
  • Massachusetts Institute of Technology
  • Rensselaer Polytechnic Institute
  • The National Science Foundation
  • Argonne, Lawrence Livermore, Los Alamos, Oak Ridge and Sandia National Laboratories
  • National Center for Atmospheric Research's Wyoming Supercomputing Center
  • AMD
  • NVIDIA
  • Dell Technologies. ("The company is now donating cycles from the Zenith supercomputer and other resources.")

The Military

Thousands of Contracts Highlight Quiet Ties Between Big Tech and US Military (nbcnews.com) 42

Over the past two years, thousands of tech company employees have taken a stand: they do not want their labor and technical expertise to be used for projects with the military or law enforcement agencies. Knowledge of such contracts, however, hasn't been easy for tech workers to come by. From a report: On Wednesday, newly published research from the technology accountability nonprofit Tech Inquiry revealed that the Department of Defense and federal law enforcement agencies including Immigration and Customs Enforcement, the FBI, the Drug Enforcement Administration and the Federal Bureau of Prisons have secured thousands of deals with Google, Amazon, Microsoft, Dell, IBM, Hewlett Packard and even Facebook that have not been previously reported. The report offers a new window into the relationship between tech companies and the U.S. government, as well as an important detail about why such contracts are often difficult to find.

Tech Inquiry's research was led by Jack Poulson, a former Google research scientist who quit the company in 2018 after months of internal campaigning to get clarity about plans to deploy a censored version of its search engine in China called Project Dragonfly. Poulson has publicly opposed collaborations between American technology companies and the U.S. and foreign governments that aid efforts to track immigrants and dissenters and to bolster military activity. Poulson analyzed more than 30 million government contracts signed or modified in the past five years. The Department of Defense and federal law enforcement agencies accounted for the largest share of those contracts, with tech companies accounting for a fraction of the total number of contracts.

Data Storage

400 TB Storage Drives In Our Future: Fujifilm (anandtech.com) 51

One of the two leading manufacturers of tape cartridge storage, Fujifilm, claims that it has a technology roadmap through 2030 that builds on the current magnetic tape paradigm to enable 400 TB per tape. AnandTech reports: As reported by Chris Mellor of Blocks and Files, Fujifilm points to using Strontium Ferrite grains to enable an areal data density on tape of 224 Gbit per square inch, which would enable 400 TB cartridges. IBM and Sony have already demonstrated 201 Gbit-per-square-inch technology in 2017, with a potential release of the technology for high-volume production in 2026. Current tapes are over an order of magnitude less dense, at 8 Gbit per square inch; the delay between research and mass production is quite significant.

Strontium Ferrite would replace Barium Ferrite in current LTO cartridges. Strontium sits one row above Barium in the periodic table, indicating a much smaller atom. This allows much smaller particles to be packed into tracks, and thankfully, according to Fujifilm, Strontium Ferrite exhibits properties along the same lines as Barium Ferrite, but more so, enabling higher performance while simultaneously increasing particle density. [...] Fujifilm states that 400 TB is the limit of Strontium Ferrite, indicating that new materials would be needed to go beyond it. That said, we are talking about only 224 Gbit per square inch for storage; compared with mechanical hard disks, which exceed 1000 Gbit per square inch today, there would appear to be plenty of room at the top if the technologies could converge.

AI

Universities and Tech Giants Back National Cloud Computing Project (nytimes.com) 27

Leading universities and major technology companies agreed on Tuesday to back a new project intended to give academics and other scientists access to the computing resources now available mainly to a few tech giants. From a report: The initiative, the National Research Cloud, has received bipartisan support in both the House and the Senate. Lawmakers in both houses have proposed bills that would create a task force of government science leaders, academics and industry representatives to outline a plan to create and fund a national research cloud. This program would give academic scientists access to the cloud data centers of the tech giants, and to public data sets for research. Several universities, including Stanford, Carnegie Mellon and Ohio State, and tech companies including Google, Amazon and IBM backed the idea as well on Tuesday. The organizations declared their support for the creation of a research cloud and their willingness to participate in the project.

The research cloud, though a conceptual blueprint at this stage, is another sign of the largely effective campaign by universities and tech companies to persuade the American government to increase its backing for research into artificial intelligence. The Trump administration, while cutting research elsewhere, has proposed doubling federal spending on A.I. research by 2022. Fueling the increased government backing is the recognition that A.I. technology is essential to national security and economic competitiveness. The national cloud legislation will be proposed as an amendment to this year's defense budget authorization. "We have a real challenge in our country from China in terms of what they are doing with A.I.," said Representative Anna G. Eshoo, Democrat of California, a sponsor of the bill.

IBM

IBM's New Differential Privacy Library Works With Just a Single Line of Code (ibm.com) 45

Friday IBM Research updated their open source "IBM Differential Privacy Library," a suite of new lightweight tools offering "an array of functionality to extract insight and knowledge from data with robust privacy guarantees."

"Most tasks can be run with only a single line of code," brags a new blog post (shared by Slashdot reader IBMResearch), explaining how it works: This year for the first time in its 230-year history the U.S. Census will use differential privacy to keep the responses of its citizens confidential when the data is made available. But how does it work? Differential privacy uses mathematical noise to preserve individuals' privacy and confidentiality while allowing population statistics to be observed.

This concept has a natural extension to machine learning, where we can protect models against privacy attacks, while maintaining overall accuracy. For example, if you want to know my age (32) I can pick a random number out of a hat, say ±7 — you will only learn that I could be between 25 and 39. I've added a little bit of noise to the data to protect my age and the US Census will do something similar.
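A minimal sketch of that idea in Python (our illustration of the Laplace mechanism, not the library's internals; the epsilon value here is arbitrary):

```python
import numpy as np

rng = np.random.default_rng()

def privatize(value: float, epsilon: float, sensitivity: float = 1.0) -> float:
    """Laplace mechanism: smaller epsilon means more noise and stronger privacy."""
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

print(privatize(32.0, epsilon=0.5))  # the true age, masked by calibrated random noise
```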

While the US government built its own differential privacy tool, IBM has been working on its own open source version and today we are publishing our latest release v0.3. The IBM Differential Privacy Library boasts a suite of tools for machine learning and data analytics tasks, all with built-in privacy guarantees. Unlike other libraries, ours gives scientists and developers access to lightweight, user-friendly tools for data analytics and machine learning in a familiar environment... What also sets our library apart is that its machine learning functionality enables organisations to publish and share their data with rigorous guarantees on user privacy like never before...

Also included is a collection of fundamental tools for data exploration and analytics. All the details for getting started with the library can be found at IBM's GitHub repository.
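The "single line of code" claim refers to the library's drop-in, scikit-learn-style models. A sketch along the lines of the example in the project's README (the epsilon value here is an arbitrary choice):

```python
from sklearn import datasets
from sklearn.model_selection import train_test_split
from diffprivlib.models import GaussianNB  # pip install diffprivlib

X, y = datasets.load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# Swapping this class in for scikit-learn's GaussianNB is the one-line change;
# training then satisfies differential privacy with budget epsilon. (Without
# explicit data bounds, the library infers them and warns about the leak.)
clf = GaussianNB(epsilon=1.0)
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))
```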

Java

Jakarta EE 9 Specification Release 'Marks the Final Transition Away From javax Namespace' (adtmag.com) 13

An anonymous reader quotes ADTmag: The Eclipse Foundation this week announced Jakarta EE 9 Milestone 1, the final version of the enterprise Java specification before the first Release Candidate (RC). The Jakarta EE 9 release marks the final transition away from the javax.* namespace (which Oracle refused to give up) to Eclipse's jakarta.*. This release updates all the APIs to use jakarta.* in package names. In fact, Mike Milinkovich, executive director of the Eclipse Foundation, says that transition is really what this release is all about.

"The main purpose...is to provide a release that is very similar to Java EE 8," Milinkovich told ADTmag, "with everything converted to the jakarta.* namespace. We're providing a stable technical conversion platform, so all the tools and frameworks in the ecosystem that are using, say, javax.servlet, can make the change with confidence." Giving the ecosystem solid footing for the transition from the Java EE coffee cup to the Jakarta EE sailboat is the Foundation's way of setting the stage for rapid innovation, Milinkovich said, once the transition is largely complete.

"These technologies have been around for an awfully long time," he added, "and we had to provide folks with a stable platform for the conversion. At the same time, thanks to a contribution from IBM, we have the Eclipse Transformer Project, which is going to provide runtime enablement. If someone has an application they don't want to recompile, and that application is using the javax.* namespace, they will be able to run it on top of a Jakarta-compatible app server. That's going to provide binary compatibility for apps, going forward..."

Japan

ARM-Based Japanese Supercomputer is Now the Fastest in the World (theverge.com) 72

A Japanese supercomputer has taken the top spot in the biannual Top500 supercomputer speed ranking. Fugaku, a computer in Kobe co-developed by Riken and Fujitsu, makes use of Fujitsu's 48-core A64FX system-on-chip. It's the first time a computer based on ARM processors has topped the list. From a report: Fugaku turned in a Top500 HPL result of 415.5 petaflops, 2.8 times as fast as IBM's Summit, the nearest competitor. Fugaku also attained top spots in other rankings that test computers on different workloads, including Graph 500, HPL-AI, and HPCG. No previous supercomputer has ever led all four rankings at once. While fastest supercomputer rankings normally bounce between American- and Chinese-made systems, this is the first Japanese system to lead the Top500 in nine years; the last was Fugaku's predecessor, Riken's K computer. Overall there are 226 Chinese supercomputers on the list, 114 from America, and 30 from Japan. US-based systems contribute the most aggregate performance with 644 petaflops.
AI

'Biologically Plausible' Deep Learning Neurons Predict the Chords of Bach (ibm.com) 24

IBM's research blog shares an article about "polyphonic music prediction using the Johann Sebastian Bach chorales dataset" achieved by using "biologically plausible neurons," a new approach to deep learning "that incorporates biologically-inspired neural dynamics and enables in-memory acceleration, bringing it closer to the way in which the human brain works." At IBM Research Europe we have been investigating both Spiking Neural Networks (SNNs) and Artificial Neural Networks (ANNs) for more than a decade, and one day we were struck with the thought: "Could we combine the characteristics of the neural dynamics of a spiking neuron and an ANN?" The answer is yes, we could. More specifically, we have modelled a spiking neuron using a construct comprising two recurrently-connected artificial neurons — we call it a spiking neural unit (SNU)... It enables a reuse of architectures, frameworks, training algorithms and infrastructure. From a theoretical perspective, the unique biologically-realistic dynamics of SNNs become available for the deep learning community...
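In rough terms, the unit keeps a membrane-like state that integrates input, leaks over time, and resets after emitting a spike. A minimal sketch of that recurrence (our simplification, with illustrative decay and threshold values; the published SNU is trained with standard deep learning tooling):

```python
import numpy as np

def snu_step(x, s_prev, y_prev, w, decay=0.8, bias=-1.0):
    """One time step of a spiking neural unit built from two coupled ANN neurons."""
    # State neuron: integrate weighted input, leak, and reset after a spike.
    s = np.maximum(w @ x + decay * s_prev * (1.0 - y_prev), 0.0)
    # Output neuron: a step activation emits a binary spike once s crosses the threshold.
    y = (s + bias > 0).astype(float)
    return s, y

# One neuron under constant drive: it charges up, spikes, resets, and repeats.
w, s, y = np.array([[0.3]]), np.zeros(1), np.zeros(1)
for t in range(8):
    s, y = snu_step(np.array([1.0]), s, y, w)
    print(t, round(float(s[0]), 3), int(y[0]))
```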

Furthermore, a spiking neural unit lends itself to efficient implementation in artificial neural network accelerators and is particularly well-suited for applications using in-memory computing. In-memory computing is a promising new approach for AI hardware that takes inspiration from the architecture of the brain, in which memory and computations are combined in the neurons. It avoids the energy cost of shuffling data back and forth between separate memory and processors by performing computations in memory. Phase-change memory, a technology that is well understood and on its way to commercialization in the coming years, is a promising candidate for such an implementation. Our work includes an experimental demonstration of an in-memory spiking neural unit implementation that exhibits a robustness to hardware imperfections superior to that of other state-of-the-art artificial neural network units...

The task of polyphonic music prediction on the Johann Sebastian Bach dataset was to predict at each time step the set of notes, i.e. a chord, to be played in the following time step. We used an SNU-based architecture with an output layer of sigmoidal neurons, which allows a direct comparison of the obtained loss values to those from ANNs. The SNU-based network achieved an average loss of 8.72 and set the SNN state-of-the-art performance for the Bach chorales dataset. An sSNU-based network further reduced the average loss to 8.39 and surpassed corresponding architectures using state-of-the-art ANN units.

Slashdot reader IBMResearch notes that besides being energy-efficient, the results "point towards the broad adoption of more biologically-realistic deep learning for applications in artificial intelligence."
Microsoft

Microsoft Won't Sell Police Its Facial-Recognition Technology (washingtonpost.com) 48

Microsoft joined the list of tech giants that have decided to limit the use of their facial-recognition systems, announcing that it will not sell the controversial technology to police departments until there is a federal law regulating it. From a report: The move, announced by Microsoft president Brad Smith at a Post Live event Thursday morning, follows similar decisions by Amazon and IBM, and comes as protesters across the nation press for an end to police brutality and racial profiling. Smith said that Microsoft has not sold its facial-recognition technology to law enforcement. "We will not sell facial-recognition technology to police departments in the United States until we have a national law in place, grounded in human rights that will govern this technology," Smith said. The company also plans to put in place "review factors" that Smith said would "go even beyond what we already have" to determine the use of the technology beyond law enforcement.
Software

Amazon Pauses Police Use of Facial Recognition Tech For a Year (reuters.com) 32

An anonymous reader quotes a report from Reuters: Amazon on Wednesday said it was implementing a one-year moratorium on police use of its facial recognition software, reversing its long-time support of selling the technology to law enforcement. Civil liberties activists have voiced concern that facial recognition could lead to unjust arrests during demonstrations against police brutality, racial injustice and the death of George Floyd. Critics also have questioned the software's accuracy, pointing to a past study showing Amazon's "Rekognition" service struggled to identify the gender of individuals with darker skin tones. Amazon has taken issue with that research.

The company, which sells cloud computing technology via its Amazon Web Services division, said in a statement it has pushed for regulations to ensure the software was used ethically. "We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested," Amazon said. The company said it would continue to permit use of its technology by customers Thorn and Marinus Analytics to help law enforcement find human trafficking victims.
The decision follows IBM's announcement earlier this week that it was getting out of the facial recognition business entirely. IBM is also urging Congress to enact reforms to advance racial justice and combat systemic racism.
Privacy

IBM Gets Out of Facial Recognition Business, Calls On Congress To Advance Policies Tackling Racial Injustice (cnbc.com) 70

IBM CEO Arvind Krishna called on Congress Monday to enact reforms to advance racial justice and combat systemic racism while announcing the company was getting out of the facial recognition business. CNBC reports: "IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency," Krishna wrote in the letter delivered to members of Congress late Monday. "We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies."

IBM decided to shut down its facial recognition products and announce its decision as the death of George Floyd brought the topic of police reform and racial inequity into the forefront of the national conversation, a person familiar with the situation told CNBC. IBM's facial recognition business did not generate significant revenue for the company, the person familiar with the situation said, but the decision remains notable for a technology giant that counts the U.S. government as a major customer. The decision was both a business and an ethical one, the person familiar with the situation said. The company heard in the past few weeks concerns from many constituencies, including employees, about its use of the technology, the person added.

"Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe. But vendors and users of Al systems have a shared responsibility to ensure that Al is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported," Krishna wrote. The letter was addressed to sponsors and co-sponsors of a sweeping police reform bill unveiled by Democrats Monday -- Black Caucus Chair Rep. Karen Bass (D-CA), House Judiciary Committee Chair Rep. Jerry Nadler (D-NY), Sen. Corey Booker (D-NJ), Sen. Kamala Harris (D-CA) and Rep. Hakeem Jeffries (D-NY).
