IBM

Robert Cringely Predicted 'The Death of IT' in 2020. Was He Right? (cringely.com) 54

Yesterday long-time tech pundit Robert Cringely reviewed the predictions he'd made at the beginning of last year. "Having done this for over 20 years, historically I'm correct about 70 percent of the time, but this year could be a disappointment given that I'm pretty sure I didn't predict 370,000 deaths and an economy in free-fall.

"We'll just have to see whether I was vague enough to get a couple right."

Here are some of the highlights: I predicted that IBM would dump a big division and essentially remake itself as Red Hat, its Linux company. Well yes and no. IBM did announce a major restructuring, spinning off Global Technology Services just as I predicted (score one for me), but it has all happened slowly because everything slows down during a pandemic. The resulting company won't be called Red Hat (yet), but the rest of it was correct so I'm going to claim this one, not that anybody cares about IBM anymore...

I predicted that working from home would accelerate a trend I identified as the end of IT, by which I meant the kind of business IT provided and maintained by kids from that office in the basement. By working from home, we'd all become our own IT guys and that would lead to acceleration in the transition of certain technologies, especially SD-WAN and Secure Access Service Edge (SASE)... "That's the end-game if there is one — everything in the cloud with your device strictly for input and output, painting screens compressed with HTML5. It's the end of IT because your device will no longer contain anything, so it can be simply replaced via Amazon if it is damaged or lost, with the IT kid in the white shirt becoming an Uber driver (if any of those survive)."

It was a no-brainer, really, and I was correct: Internet-connected hardware sales surged, SASE took over whether you even knew it or not, and hardly any working from home was enabled by technology owned by the business itself. It's key here that the operant term for working from home became "Zooming" — a third-party public brand built solely in the cloud.

Finally, I predicted that COVID-19 would accelerate the demise of not just traditional IT, but also IT contractors, because the more things that could be done in the cloud the less people would be required to do them. So what actually happened? Well I was right about the trend but wrong about the extent. IT consulting dropped in 2020 by about 19 percent, from $160 billion to $140 billion. That's a huge impact, but I said "kill" and 19 percent isn't even close to dead. So I was wrong.

Communications

IBM Will Pay $24.25 Million To Resolve FCC Probes (reuters.com) 13

IBM has agreed to pay $24.25 million to resolve a pair of investigations by the Federal Communications Commission (FCC) over subsidies awarded to connect schools and libraries to broadband. From a report: IBM's payment will resolve two FCC investigations that have spanned nearly 15 years over its alleged violations of "E-Rate" program rules in connection with New York City and El Paso school districts. Under the agreement, IBM agreed to return $24.25 million to the Universal Service Fund that funds the E-Rate program, but did not admit wrongdoing. The FCC said its investigations found IBM had not satisfied the competitive bidding rules in New York for 2005-2008 and provided ineligible equipment and services in El Paso for 2001. IBM said in a statement it believed it "acted appropriately in its support of the E-rate program, but in the interest of amicably resolving a longstanding matter we are pleased to have reached this settlement." IBM added it had provided internet connectivity to thousands of U.S. schools and libraries and millions of American students through the E-rate program.
Technology

Quantum Computer Makers Like Their Odds For Big Progress (cnet.com) 31

For years, quantum computing has been the preserve of academics. New advances, however, are pushing this potentially revolutionary technology toward practical applications. From a report: At the Q2B conference this month, quantum computer makers Google, IBM, Honeywell, IonQ and Xanadu detailed specific steps they expect by 2024 that will push their machines further down the road of commercial practicality. Those achievements include increasing quantum computers' scale, performance and reliability. Private sector spending on quantum computing products and services will likely more than triple to $830 million in 2024, up from $250 million in 2019, according to a forecast from Hyperion Research.

"We're in the early industrial era of quantum computing," said Seth Lloyd, an MIT professor who helped found the field in the 1990s. He says the "huge advances" are comparable to the early use of steam engines to power factories, ships and trains. One buzzworthy breakthrough is progress toward error correction, which should let quantum computers perform sustained calculations instead of fleeting spurts of work. That improvement comes through overcoming a fundamental limit with qubits, the basic elements for storing and processing data in a quantum computer. Qubits are easily perturbed by outside forces, but error correction is designed to overcome the finickiness of individual qubits. It'll require bigger machines with many more qubits, but quantum computer makers see progress there, too.

Data Storage

Fujifilm and IBM Set World Record With 580TB Magnetic Tapes (pcmag.com) 53

An anonymous reader quotes a report from PCMag.com: Mark Lantz, Manager of Advanced Tape Technologies at IBM Research, explains how researchers at IBM and Fujifilm brought together more than 15 years of work to set a new world record in tape storage. What they achieved is an areal density of 317Gbpsi (gigabits per square inch), which translates to a single tape capable of storing 580 terabytes of data. In order to achieve such a high areal density, the research team had to develop a brand new tape and created Strontium Ferrite (SrFe) in the process. Existing magnetic tapes rely on Barium Ferrite (BaFe), but SrFe offers the potential for higher density storage in the same amount of tape. Alongside that, the team also "developed a family of new servo-mechanical technologies including a new servo pattern that is pre-recorded in the servo tracks, a prototype head actuator and a set of servo controllers." The end result is a very high capacity tape that can be read while moving at a speed of 15 km/h (9.3 mph).
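As a rough sanity check on how areal density translates into cartridge capacity, the Python sketch below works backwards from the quoted 317 Gb/in² and 580 TB figures to the recording area and tape length they imply. The half-inch tape width is an assumption for illustration (standard tape media width), not a figure from the announcement.

```python
# Back-of-the-envelope check: what recording area and tape length do
# 317 Gb/in^2 and 580 TB imply? The 0.5 in tape width is an assumption.
GBIT = 1e9
TB = 1e12  # decimal terabytes, as used in storage marketing

areal_density_bits = 317 * GBIT   # bits per square inch
capacity_bits = 580 * TB * 8      # 580 TB expressed in bits

area_sq_in = capacity_bits / areal_density_bits
tape_width_in = 0.5               # assumed half-inch tape
length_m = (area_sq_in / tape_width_in) * 0.0254

print(f"implied recording area: {area_sq_in:,.0f} in^2")
print(f"implied tape length:    {length_m:,.0f} m at {tape_width_in} in width")
# Roughly 14,600 in^2 and ~740 m of tape, i.e. a plausible single cartridge.
```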
The Almighty Buck

'Evil Mobile Emulator Farms' Used To Steal Millions From US and EU Banks (arstechnica.com) 59

An anonymous reader quotes a report from Ars Technica: Researchers from IBM Trusteer say they've uncovered a massive fraud operation that used a network of mobile device emulators to drain millions of dollars from online bank accounts in a matter of days. The scale of the operation was unlike anything the researchers have seen before. In one case, crooks used about 20 emulators to mimic more than 16,000 phones belonging to customers whose mobile bank accounts had been compromised. In a separate case, a single emulator was able to spoof more than 8,100 devices.

The thieves then entered usernames and passwords into banking apps running on the emulators and initiated fraudulent money orders that siphoned funds out of the compromised accounts. Emulators are used by legitimate developers and researchers to test how apps run on a variety of different mobile devices. To bypass protections banks use to block such attacks, the crooks used device identifiers corresponding to each compromised account holder and spoofed GPS locations the device was known to use. The device IDs were likely obtained from the holders' hacked devices, although in some cases, the fraudsters gave the appearance they were customers who were accessing their accounts from new phones. The attackers were also able to bypass multi-factor authentication by accessing SMS messages.

Security

Mysterious Phishing Campaign Targets Organizations in COVID-19 Vaccine Cold Chain (zdnet.com) 20

IBM's cyber-security division says that hackers are targeting companies associated with the storage and transportation of COVID-19 vaccines using temperature-controlled environments -- also known as the COVID-19 vaccine cold chain. From a report: The attacks consisted of spear-phishing emails seeking to collect credentials for a target's internal email and applications. While IBM X-Force analysts weren't able to link the attacks to a particular threat actor, they said the phishing campaign showed the typical "hallmarks of nation-state tradecraft." Targets of the attacks included a wide variety of companies, sectors, and government organizations alike.
IBM

IBM Planning 10,000 Job Cuts in Europe Ahead of Unit Sale (bloomberg.com) 44

International Business Machines is planning to cut about 10,000 jobs in Europe in an attempt to lower costs at its slow-growth services unit and prepare the business for a spinoff. From a report: The wide-ranging losses will affect about 20% of staff in the region, according to people familiar with the matter. The U.K. and Germany are set to be most impacted, with cuts also planned in Poland, Slovakia, Italy and Belgium. IBM announced the job cuts in Europe earlier in November during a meeting with European labor representatives, according to a union officer briefed on proceedings. The person asked not to be identified because the talks are private. IBM shares fell 1.6% at 9:37 a.m. in New York. They've declined 8.6% this year.
Desktops (Apple)

Linus Torvalds Would Like To Use An M1 Mac For Linux, But... (zdnet.com) 246

Yes, Torvalds said he'd love to have one of the new M1-powered Apple laptops, but it won't run Linux and, in an exclusive interview, he explains why getting Linux to run well on it isn't worth the trouble. Steven J. Vaughan-Nichols writes via ZDNet: Recently, on the Real World Technologies forum, Linux's creator Linus Torvalds was asked what he thought of the new M1-powered Apple laptops. Torvalds replied, "I'd absolutely love to have one if it just ran Linux." You may think, "what's the problem? Doesn't Linux run on practically every processor on the planet from 80386s to IBM s390x to the ARM family of which Apple's M1 chip is a child?" Well, yes, yes it does. But it takes more than a processor to run a computer.

Torvalds would like to run Linux on these next-generation Macs. As he said, "I've been waiting for an ARM laptop that can run Linux for a long time. The new Air would be almost perfect, except for the OS. And I don't have the time to tinker with it, or the inclination to fight companies that don't want to help." Aye, there's the rub. In an exclusive interview, Torvalds expanded on why he can't see porting Linux to the M1-based Macs. "The main problem with the M1 for me is the GPU and other devices around it, because that's likely what would hold me off using it because it wouldn't have any Linux support unless Apple opens up."

Still, Torvalds knows Apple opening up its chipsets "seems unlikely, but hey, you can always hope." Even if that "wasn't an issue," Torvalds continued, "My personal hope would be more cores. Even in a laptop, I don't care about 20-hour battery life (and I wouldn't get it building kernels anyway). I'd rather plug it in a bit more often, and have 8 big cores." As for the Mac's limited RAM -- no more than 16GBs on current models -- he can live with that. "16GBs is actually ok by me because I don't tend to do things that require a lot more RAM. All I do is read email, do git and kernel compiles. And yes, I have 64GB in my desktop, but that's because I have 32 cores and 64 threads, and I do hugely parallel builds. Honestly, even then 32GB would be sufficient for my loads." That said, other developers and power users may want more from the new Macs, Torvalds thinks. "The people who really want tons of memory are the ones doing multiple VMs or huge RAW file photography and video."

Television

Who Will Be the First Guest Host of Jeopardy, Alex? (variety.com) 70

thomst writes: Variety is reporting that uber-champion Ken Jennings will be the first of a series of guest hosts to substitute for the late, great Alex Trebek on trivia-maven game show Jeopardy! Executive Producer Mike Richards revealed that Jennings will take the podium when production resumes on November 30, as the show begins its search for a permanent replacement for the much-beloved Trebek.

Odds are good that the "beauty pageant"-style guest host format will, in effect, be a series of auditions for the permanent position. Jennings, who is legendary for the number of games he won as a regular contestant, for triumphing over fellow Tournament of Champions contestants, for facing IBM's Watson question-answering system, and for defeating two other "winning-est" players to be crowned Greatest of All Time, has hosted trivia game shows in the past, and has made no secret of his desire to take Trebek's job full-time.

As the saying goes, "Stay tuned for more on this story!"

Privacy

Ask Slashdot: Why Haven't We Implemented Public Key Infrastructure Voting? 433

Long-time Slashdot reader t0qer has a question: why haven't we gone to an open source, Public Key Infrastructure-based voting system? "I'm fairly well versed in PKI technology, and quoting this site, it would take traditional computers 300 trillion years to break RSA-2048 for a single vote." SSL.com has a pretty interesting piece on using Public Key Infrastructure in voting. There's also a GitHub project that leverages PKI and IBM blockchain technology...

It just seems like paper has outlived its usefulness as a secure medium at this point. A closed-source voting system doesn't really seem like the kind of thing Slashdot would get behind.

SSL's article points out that the technology seems to exist already. Nearly half the population of Estonia already votes online, and four U.S. states (Arizona, Colorado, Missouri and North Dakota) already have web portals that allow for absentee voting. (And West Virginia has a mobile voting app that uses blockchain technology.) [L]uckily, the groundwork for securing the practice of remote, online voting is already there. We have been conducting many delicate transactions online for some time — the secure transfer of information has been a cornerstone for many industries that have successfully shifted online such as personal banking and investing, and those methods of securing and authenticating information can be employed in voting as well. For years, people have suggested that blockchain technology could be used to secure elections and increase voter turnout.
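For readers unfamiliar with how PKI would apply here, the sketch below shows the core primitive the submitter is relying on: an RSA-2048 signature over a ballot, generated with a voter's private key and checked against the public key. It uses Python's cryptography package and is only a toy illustration of signing and verification; a real voting system would also need identity proofing, ballot secrecy, coercion resistance, and auditability, which signatures alone do not provide.

```python
# Toy illustration of the PKI primitive: sign a ballot with RSA-2048 and verify it.
# Requires the 'cryptography' package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# In a real deployment the key pair would be issued and certified by an
# election authority; here we just generate one locally for the example.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

ballot = b"election=2020-11-03;contest=president;choice=candidate-A"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

signature = private_key.sign(ballot, pss, hashes.SHA256())

try:
    public_key.verify(signature, ballot, pss, hashes.SHA256())
    print("ballot signature verified")
except InvalidSignature:
    print("ballot was tampered with or signed by a different key")
```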
Share your own thoughts in the comments. Why haven't we implemented Public Key Infrastructure voting?
Government

US Congress Passes an IoT Security Bill 'That Doesn't Totally Suck' (theregister.com) 80

Shotgun (Slashdot reader #30,919) shared these thoughts from The Register: Every now and again the U.S. Congress manages to do its job, and yesterday was one of those days: the Senate passed a new piece of IoT cybersecurity legislation that the House also approved, and it will now move to the President's desk.

As we noted back in March when the IoT Cybersecurity Improvement Act was introduced, the bill is actually pretty good: it asks America's National Institute of Standards and Technology (NIST) to come up with guidelines for Internet-of-Things devices and would require any federal agency to only buy products from companies that met the new rules. It gives a minimum list of considerations to be covered: secure code, identity management, patching and configuration management. It also requires the General Services Administration — the arm of the federal government that sources products and comms for federal agencies — to come up with guidelines that would require each agency to report and publish details of security vulnerabilities, and how they resolved them, and coordinate with other agencies.

Industry has also got behind the effort — Symantec, Mozilla, BSA The Software Alliance (which includes Apple, Microsoft, IBM, Cloudflare, the CTIA and others) — and Congress has managed to keep its fingers out of things it knows nothing about by leaving the production of standards with the experts, using federal procurement to create a de facto industry standard.

Though it will still be legal to sell insecure IoT devices, "for those looking for good, secure products, there will be a baseline standard across the industry..." the article concludes.

"[T]his is an essential first step to getting secure IoT in place."
Software

Tech Organizations Back 'Inclusive Naming Initiative' (theregister.com) 264

New submitter LeeLynx shares a report from The Register: A new group called the "Inclusive Naming Initiative" has revealed its existence and mission "to help companies and projects remove all harmful and unclear language of any kind and replace it with an agreed-upon set of neutral terms." Akamai, Cisco, the Cloud Native Computing Foundation, IBM, the Linux Foundation, Red Hat, and VMware are all participants. The group has already offered a word replacement list that suggests alternatives to the terms whitelist, blacklist, slave, and master. There's also a framework for evaluating harmful language that offers guidance on how to make changes.

Red Hat's post announcing its participation in the Initiative links to a dashboard listing all instances of terms it wants changed and reports over 330,000 uses of "Master" and 105,000 uses of "Slave," plus tens of thousands of uses of "whitelist" and "blacklist." Changing them all will be a big job, wrote Red Hat's senior veep and CTO Chris Wright. "On a technical level, change has to be made in hundreds of discrete communities, representing thousands of different projects across as many code repositories," Wright wrote. "Care has to be taken to prevent application or API breakage, maintain backward compatibility, and communicate the changes to users and customers." The Initiative nonetheless hopes to move quickly, with its roadmap calling for best practices to be defined during Q1 2021, case studies to be available in Q3 2021 and a certification program delivered in Q4 2021.
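The scale Wright describes is largely a search-and-review problem. A minimal sketch of the first step might look like the Python below, which walks a source tree and reports occurrences of the flagged terms; the suggested replacements are common choices, not necessarily the Initiative's final word list, and actual renames still need the compatibility care Wright mentions.

```python
# Minimal sketch: find occurrences of flagged terms in a source tree.
# The replacement suggestions here are illustrative, not an official word list.
import os
import re

FLAGGED = {
    "whitelist": "allowlist",
    "blacklist": "denylist",
    "master": "main / primary",
    "slave": "replica / secondary",
}
PATTERN = re.compile("|".join(FLAGGED), re.IGNORECASE)

def scan(root="."):
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as handle:
                    for lineno, line in enumerate(handle, start=1):
                        for match in PATTERN.finditer(line):
                            term = match.group(0)
                            print(f"{path}:{lineno}: {term} -> consider {FLAGGED[term.lower()]}")
            except OSError:
                continue  # unreadable file, skip it

if __name__ == "__main__":
    scan()
```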

Businesses

IBM Apologizes For Firing Computer Pioneer For Being Transgender... 52 Years Later (forbes.com) 164

On August 29, 1968, IBM's CEO fired computer scientist and transgender pioneer Lynn Conway to avoid the public embarrassment of employing a transwoman. Nearly 52 years later, in an act that defines its present-day culture, IBM is apologizing and seeking forgiveness. Jeremy Alicandri writes via Forbes: On January 2, 1938, Lynn Conway's life began in Mount Vernon, NY. With a reported IQ of 155, Conway was an exceptional and inquisitive child who loved math and science during her teens. She went on to study physics at MIT and earned her bachelor's and master's degrees in electrical engineering at Columbia University's Engineering School. In 1964, Conway joined IBM Research, where she made major innovations in computer design, ensuring a promising career in the international conglomerate (IBM was the 7th largest corporation in the world at the time). Recently married and with two young daughters, she lived a seemingly perfect life. But Conway faced a profound existential challenge: she had been born as a boy.
[...]
[W]hile IBM knew of its key role in the Conway saga, the company remained silent. That all changed in August 2020. When writing an article on LGBTQ diversity in the automotive industry, I included Conway's story as an example of the costly consequences to employers that fail to promote an inclusive culture. I then reached out to IBM to learn if its stance had changed after 52 years. To my surprise, IBM admitted regrets and responsibility for Conway's firing, stating, "We deeply regret the hardship Lynn encountered." The company also explained that it was in communication with Conway for a formal resolution, which came two months later. Arvind Krishna, IBM's CEO, and other senior executives had determined that Conway should be recognized and awarded "for her lifetime body of technical achievements, both during her time at IBM and throughout her career."

Dario Gil, Director of IBM Research, who revealed the award during an online event, says, "Lynn was recently awarded the rare IBM Lifetime Achievement Award, given to individuals who have changed the world through technology inventions. Lynn's extraordinary technical achievements helped define the modern computing industry. She paved the way for how we design and make computing chips today -- and forever changed microelectronics, devices, and people's lives." The company also acknowledged that after Conway's departure in 1968, her research aided its own success. "In 1965 Lynn created the architectural level Advanced Computing System-1 simulator and invented a method that led to the development of a superscalar computer. This dynamic instruction scheduling invention was later used in computer chips, greatly improving their performance," a spokesperson stated.

Japan

Japan's ARM-Based Supercomputer Leads World In Top500 List; Exascale Expected In 2021 (techtarget.com) 25

dcblogs writes: Japan's Fugaku ARM-based supercomputer is the world's most powerful in the latest Top500 list, setting a world record of 442 petaflops. But this was otherwise an unremarkable year for supercomputers, with a "flattening performance curve," said Jack Dongarra, one of the academics behind the twice-a-year ranking and director of the Innovative Computing Laboratory at the University of Tennessee. This is a result of Moore's Law slowing down as well as a slowdown in the replacement of older systems, he said. But the U.S. is set to deliver an exascale system -- 1,000 petaflops -- next year, and China is expected to do the same. Meanwhile, the EU has a 550 petaflop system in development in Finland. "On the Top500 list, the second-ranked system was IBM Power Systems at nearly 149 petaflops using its Power9 CPUs and Nvidia Tesla GPUs. It is at the Oak Ridge National Lab in Tennessee," adds TechTarget.

"Third place went to Sierra supercomputer, which also uses Power9 and Nvidia GPUs, at about 95 petaflops. It is at Lawrence Livermore National Laboratory in Livermore, Calif."
Medicine

Ticketmaster To Require Negative COVID-19 Test Or Vaccination To Attend Concerts 152

Ticketmaster is planning to check the coronavirus vaccination status of concert-goers prior to shows once a vaccine is approved. The New York Post reports: The ticketing giant plans to have customers use their cellphones to verify their inoculation or whether they've tested negative for the virus within a 24- to 72-hour window, according to the exclusive report. The plan, which is still being ironed out, will utilize three separate components, including the California-based company's digital ticketing app, third-party health information firms like CLEAR Health Pass and testing/vaccination distributors like Labcorp or CVS Minute Clinic.

Ticketmaster will reportedly not store or access medical records under the plan. If approved, fans would need to verify that they've either already been vaccinated or have tested negative as recently as 24 hours prior to the show. Concert-goers would then instruct a lab to send over test results to companies like CLEAR Health Pass or IBM's Digital Health Pass, which would verify the fan's status to Ticketmaster. Anyone who tests positive or doesn't get screened won't be granted access to the event venue, Billboard reported. The digital ticketing app will also eliminate the need for paper tickets and allows tickets to be blocked from resale, according to the report.
Further reading: Billboard
Television

Iconic American Quiz Show Host Alex Trebek Dies at Age 80 (cnn.com) 71

The iconic host of America's TV quiz show Jeopardy! has died at age 80, after 35 years of reminding contestants to always state their answers in the form of a question. Who was Alex Trebek? CNN calls him "the genial host with all the answers and a reassuring presence in the TV game-show landscape for five decades."

Notable moments include playing host to IBM's Watson computer and a teasingly blunt appraisal of nerdcore hip hop. CNN reports: "Jeopardy! is saddened to share that Alex Trebek passed away peacefully at home early this morning, surrounded by family and friends," said a statement shared on the show's Twitter account Sunday. "Thank you, Alex." The cause of death was not immediately announced. Trebek revealed in March 2019 he had been diagnosed with stage 4 pancreatic cancer, triggering an outpouring of support and well wishes at the time.

Trebek made history in 2014 by hosting his 6,829th "Jeopardy!" episode — the most by a presenter of any single TV game show... In a 2014 interview with the Hollywood Reporter, Trebek downplayed that honor, saying, "I'm just enjoying what I'm doing, I'm happy to have a job. I like the show, I like the contestants and it pays well."

AI

Pope Francis Prays for Good AI (axios.com) 78

For his monthly intention in November, Pope Francis prayed that AI will be beneficial for humanity. From a report: It's up for debate whether the development of automation and AI will ultimately be good for humankind, and it can't hurt to have a little divine intervention on our side. What he's saying: "We pray that the progress of robotics and artificial intelligence may always serve humankind," reads Francis' intention for November, which is published each month by the Pope's Worldwide Prayer Network. Background: This isn't the first time Francis has ventured into the fraught territory of AI ethics and alignment. In February, the Vatican hosted executives from IBM and Microsoft for a summit on "human-centered" ways of designing AI. They formulated the "Rome Call for AI Ethics," which called for AI to be designed with a focus on the good of the environment and "our common and shared home and of its human inhabitants."
IBM

The Untimely Demise Of Workstations (deprogrammaticaipsum.com) 122

Graham Lee, writing at De Programmatica Ipsum: Last month's news that IBM would do a Hewlett-Packard and divide into two -- an IT consultancy and a buzzword compliance unit -- marks the end of "business as usual" for yet another of the great workstation companies. [...] In high-tech domains, an engineer could readily have a toolchest of suitable computers in the same way that a mechanic has different tools for their tasks. This one has an FPGA connected by both PCI-E and JTAG to allow for quick hardware prototyping. This one is connected to a high-throughput GPU for visualisations; that one to a high-capacity GPU for scientific simulations. The general purpose hardware vendors want us to believe that an okay-at-anything computer is the best for everything: you don't need a truck, so here's a car. But when you're hauling a ton of goods, you'll find it cheaper and more satisfying to shell out more for a truck. Okay-at-anything is good for nothing.
Technology

Honeywell Announces its H1 Quantum Computer with 10 Qubits (techcrunch.com) 14

New submitter B1948J writes: It's interesting that Honeywell is once again emerging as a computing hardware vendor. Over 50 years ago, Honeywell established itself as one of the big 3 computer mainframe manufacturers (Honeywell, Burroughs and IBM). Some say the Honeywell-200 introduced "channel architecture" before IBM "announced" it for their 360 series computers. Now, Honeywell is announcing its H1 Quantum Computer capable of a Quantum Volume of 128 through 10 fully connected qubits.
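For context on the headline number: IBM-style Quantum Volume is reported as 2^n, where n is the size of the largest "square" random circuit (n qubits wide, n layers deep) the machine runs successfully. A quick sanity check in Python, assuming that definition applies to Honeywell's figure:

```python
# Quantum Volume is reported as a power of two: QV = 2**n, where n is the
# largest square test circuit (n qubits wide, n layers deep) passed successfully.
import math

quantum_volume = 128
physical_qubits = 10

n = int(math.log2(quantum_volume))
print(f"QV {quantum_volume} corresponds to {n}-qubit, {n}-layer test circuits")
print(f"fits within {physical_qubits} fully connected qubits: {n <= physical_qubits}")
```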
Linux

Linux 5.9 Boosts CPU Performance With FSGSBASE Support (phoronix.com) 75

FSGSBASE support in Linux "has the possibility of helping Intel/AMD CPU performance especially in areas like context switching that had been hurt badly by Spectre/Meltdown and other CPU vulnerability mitigations largely on the Intel side," Phoronix wrote back in August. As it started its journey into the kernel, Phoronix provided a preview on August 10: The FSGSBASE support that was finally mainlined a few days ago for Linux 5.9 is off to providing a nice performance boost for both Intel and AMD systems... FSGSBASE support for the Linux kernel has been around a half-decade in the making and finally carried over the finish line by one of Microsoft's Linux kernel engineers...

FSGSBASE particularly helps context-switching-heavy workloads like I/O, and it allows user-space software to write to the x86_64 GSBASE without kernel interaction. That in turn has been of interest to Java and others... On Linux 5.9, where FSGSBASE is finally mainlined, it's enabled by default on supported CPUs. FSGSBASE can be disabled at kernel boot time via the "nofsgsbase" kernel option.
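Whether a given machine actually benefits depends on both the CPU and the kernel. Below is a rough user-space check in Python, assuming the fsgsbase flag in /proc/cpuinfo reflects what the kernel will use (it should disappear when booting with "nofsgsbase"):

```python
# Rough user-space check: does the CPU advertise FSGSBASE, and is the running
# kernel new enough (>= 5.9) to use it by default? Assumes the flag shown in
# /proc/cpuinfo reflects the kernel's view of the feature.
import platform
import re

def cpu_has_fsgsbase(path="/proc/cpuinfo"):
    with open(path) as cpuinfo:
        for line in cpuinfo:
            if line.startswith("flags"):
                return "fsgsbase" in line.split()
    return False

release = platform.release()                      # e.g. "5.9.1-generic"
match = re.match(r"(\d+)\.(\d+)", release)
kernel_ok = match is not None and (int(match.group(1)), int(match.group(2))) >= (5, 9)

print(f"CPU flag fsgsbase : {cpu_has_fsgsbase()}")
print(f"kernel {release} >= 5.9 : {kernel_ok}")
```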

Today on the Linux kernel mailing list, Linus Torvalds announced the release of Linux 5.9: Ok, so I'll be honest - I had hoped for quite a bit fewer changes this last week, but at the same time there doesn't really seem to be anything particularly scary in here. It's just more commits and more lines changed than I would have wished for.
And Phoronix reported: Linux 5.9 has a number of exciting improvements including initial support for upcoming Radeon RX 6000 "RDNA 2" graphics cards, initial Intel Rocket Lake graphics, NVMe zoned namespaces (ZNS) support, various storage improvements, IBM's initial work on POWER10 CPU bring-up, the FSGSBASE instruction is now used, 32-bit x86 Clang build support, and more. See our Linux 5.9 feature overview for the whole scoop on the many changes to see with this kernel.
