IBM

McDonald's Partners With IBM To Automate Drive-Thru Lanes (cnbc.com) 118

McDonald's said Wednesday it has entered a strategic partnership with IBM to develop artificial intelligence technology that will help the fast-food chain automate its drive-thru lanes. CNBC reports: As part of the deal, IBM will acquire McD Tech Labs, which was formerly known as Apprente before McDonald's bought the tech company in 2019. McDonald's didn't disclose financial terms for either transaction. "In my mind, IBM is the ideal partner for McDonald's given their expertise in building AI-powered customer care solutions and voice recognition," McDonald's CEO Chris Kempczinski said on the earnings call with analysts Wednesday.

The Apprente technology uses AI to understand drive-thru orders. This summer, McDonald's tested the tech in a handful of Chicago restaurants. Kempczinski said that the test showed "substantial benefits" to customers and employees. In June, at the same conference where he disclosed the Chicago test, Kempczinski shared McDonald's strategy for tech acquisitions. "If we do acquisitions, it will be for a short period of time, bring it in house, jumpstart it, turbo it and then spin it back out and find a partner that will work and scale it for us," he said. CFO Kevin Ozan said that fewer than 100 employees will leave McDonald's to work for IBM.

Cloud

Alliance Including Amazon, Google, Microsoft, and IBM Vows to Protect Rights and Privacy With 'Trusted Cloud Principles' (zdnet.com) 33

ZDNet reports: Some of the world's largest tech giants — Amazon, Google, Microsoft, IBM, Salesforce/Slack, Atlassian, SAP, and Cisco — have joined forces to establish the Trusted Cloud Principles in what they claim is a commitment to protecting the rights of their customers... Among the specific principles established by the signatories: governments should seek data directly from enterprise customers first, rather than from cloud providers, other than in "exceptional circumstances"; customers should have a right to notice when governments seek to access customer data directly from cloud service providers; and there should be a clear process for cloud providers to challenge government access requests for customers' data, including notifying relevant data protection authorities, to protect customers' interests.

Also outlined in the principles is the point that governments should create mechanisms to raise and resolve conflicts with each other so that cloud service providers' legal compliance in one country does not amount to a violation of law in another, and that governments should support cross-border data flows. At the same time, under the principles the cloud service providers recognise that international human rights law enshrines a right to privacy, and acknowledge the importance of customer trust and of customers' control and security of their data. The signatories also said they commit to supporting laws that allow governments to request data through a transparent process that abides by human rights standards; international legal frameworks to resolve conflicting laws related to data access, privacy, and sovereignty; and improved rules and regulations at the national and international levels that protect the safety, privacy, and security of cloud customers and their ownership of data...

The Trusted Cloud Principles come days after a separate cloud data framework was established by Amazon Web Services, Google, IBM, Microsoft and other major tech giants, together with the EDM Council, a cross-industry trade association for data management and analytics. The Cloud Data Management Capabilities (CDMC) framework comprises six components, 14 capabilities, and 37 sub-capabilities that set out cloud data management capabilities, standards, and best practices for cloud, multi-cloud, and hybrid-cloud implementations, while also incorporating automated key controls for protecting sensitive data.

IBM

After IBM Failed To Sail an Autonomous Boat Across the Atlantic, It's Trying Again (washingtonpost.com) 69

After failing its first attempt to re-create the Mayflower's voyage across the Atlantic Ocean, a crewless ocean vessel, powered by artificial intelligence, has returned to sea. From a report: Propelled by IBM's AI software, the autonomous ship set out in June for a month-long excursion through rough waters with no humans aboard. However, three days into what was supposed to be a monumental journey from Plymouth, England, to Plymouth, Mass., where pilgrim travelers settled in 1620, the robot ship suffered "a minor mechanical issue" according to ProMare, a nonprofit promoting marine research that is behind the project. Researchers pushed out a software update, signaling for the ship to reverse course. The boat abided by its orders and headed to shore. Yet according to Brett Phaneuf, co-director of the Mayflower Autonomous Ship Project, the organizers quickly began planning another voyage. "We've had a setback, but one that will put us further ahead than if we did nothing," he said. Earlier this month, researchers sent the ship back out for a shorter trip: This time it'll focus on the waters around the United Kingdom, where crews can attend to it sooner if something unforeseen happens. "At some point, you have to go for it and take the risk or never improve," Phaneuf said.
Businesses

The IT Talent Gap is Still Growing (venturebeat.com) 109

IT executives see the talent shortage as the most significant barrier to adoption for 64% of emerging technologies, according to a new Gartner survey. From a report: Across compute infrastructure and platform services, network, security, digital workplace, IT automation, and storage and database, respondents cited a lack of qualified candidates as a leading factor impeding tech deployment at their companies. "The ongoing push toward remote work and the acceleration of hiring plans in 2021 has exacerbated IT talent scarcity, especially for sourcing skills that enable cloud and edge, automation, and continuous delivery," Gartner research VP Yinuo Geng said in a press release.

"As one example, of all the IT automation technologies profiled in the survey, only 20% of them have moved ahead in the adoption cycle since 2020. The issue of talent is to blame here." The talent gaps are particularly acute for IT automation and digital workplace solutions, according to the executives surveyed -- a reflection of the demand for these technologies. According to McKinsey, nearly half of executives say their embrace of automation has accelerated, while digital and technology adoption is taking place about 25 times faster than before the pandemic. For example, Brain Corp reported that the use of robots to clean retail stores in the U.S. rose 24% in Q2 2020 year-over-year, and IBM has seen a surge in new users of its AI-driven customer service platform Watson Assistant.

Robotics

Astronauts In Space Will Soon Resurrect An AI Robot Friend Called CIMON (space.com) 17

A robot called CIMON-2 (short for Crew Interactive Mobile Companion) has received a software update that will enable it to perform more complex tasks with a new human crewmate later this year. Space.com reports: The cute floating sphere with a cartoon-like face has been stored at the space station since the departure of the European Space Agency's (ESA) astronaut Luca Parmitano in February 2020. The robot will wake up again during the upcoming mission of German astronaut Matthias Maurer, who will arrive at the orbital outpost with the SpaceX Crew-3 Dragon mission in October. In the year and a half since the end of the last mission, engineers have worked on improving CIMON's connection to Earth so that it can provide more seamless service to the astronauts, CIMON project manager Till Eisenberg told Space.com. Eisenberg works at Airbus, which developed the intelligent robot together with the German Aerospace Centre (DLR) and LMU Munich.

"The sphere is just the front end," Eisenberg said. "All the voice recognition and artificial intelligence happens on Earth at an IBM data centre in Frankfurt, Germany. The signal from CIMON has to travel through satellites and ground stations to the data centre and back. We focused on improving the robustness of this connection to prevent disruptions." CIMON relies on IBM's Watson speech recognition and synthesis software to converse with astronauts and respond to their commands. The first generation robot flew to the space station with Alexander Gerst in 2018. That robot later returned to Earth and is now touring German museums. The current robot, CIMON-2, is a second generation. Unlike its predecessor, it is more attuned to the astronauts' emotional states (thanks to the Watson Tone Analyzer). It also has a shorter reaction time.

Airbus and DLR have signed a contract with ESA for CIMON-2 to work with four humans on the orbital outpost in the upcoming years. During those four consecutive missions, engineers will first test CIMON's new software and then move on to allowing the sphere to participate in more complex experiments. During these new missions CIMON will, for the first time, guide and document complete scientific procedures, Airbus said in a statement. "Most of the activities that astronauts perform are covered by step-by-step procedures," Eisenberg said. "Normally, they have to use clipboards to follow these steps. But CIMON can free their hands by floating close by, listening to the commands and reading out the procedures, showing videos, pictures and clarifications on its screen." The robot can also look up additional information and document the experiments by taking videos and pictures. The scientists will gather feedback from the astronauts to see how helpful the sphere really was and identify improvements for CIMON's future incarnations.

IBM

IBM's New Mainframe 7nm CPU Telum: 16 Cores At 5GHz, Virtual L3 and L4 Cache (arstechnica.com) 90

Long-time Slashdot reader UnknowingFool writes: Last week IBM announced their next-generation mainframe CPU, Telum. Manufactured on Samsung's 7nm node, each Telum processor has eight cores, with each core running at a base clock of 5GHz. Two processors are combined in a package similar to AMD's chiplet design. Each drawer in a mainframe holds four packages (sockets), and a mainframe can hold four drawers, for a combined 256 cores.

Unlike previous generations, there is no dedicated L3 or L4 cache. Instead, each core has a 32MB L2 cache that can be pooled into a 256MB "virtual" L3 cache on the same processor or a 2GB "virtual" L4 cache across the same drawer. Also included to help with AI is an on-die, but not on-core, inference accelerator rated at about 6 TFLOPS, which the cores drive through dedicated instructions.
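The arithmetic behind those figures can be sanity-checked in a few lines. A minimal sketch, using only the core, package, and drawer counts quoted above (variable names are illustrative, not IBM's terminology):

```python
# Constants as quoted in the summary above.
CORES_PER_CHIP = 8        # eight cores per Telum chip
CHIPS_PER_PACKAGE = 2     # two chips combined per package (socket)
PACKAGES_PER_DRAWER = 4   # four sockets per drawer
DRAWERS_PER_SYSTEM = 4    # four drawers per mainframe
L2_MB = 32                # per-core L2 cache, in MB

# System-wide core count: 8 x 2 x 4 x 4 = 256.
total_cores = (CORES_PER_CHIP * CHIPS_PER_PACKAGE
               * PACKAGES_PER_DRAWER * DRAWERS_PER_SYSTEM)

# Pooled "virtual" caches: L2s aggregate per chip (L3) and per drawer (L4).
virtual_l3_mb = L2_MB * CORES_PER_CHIP
virtual_l4_gb = (L2_MB * CORES_PER_CHIP * CHIPS_PER_PACKAGE
                 * PACKAGES_PER_DRAWER) / 1024

print(total_cores)                   # 256
print(virtual_l3_mb, virtual_l4_gb)  # 256 2.0
```

The numbers line up with the submission: 256 cores per system, a 256MB virtual L3 per chip, and a 2GB virtual L4 per drawer.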

Education

Code.org Will Teach 'Cybersecurity Hygiene' to Millions of Students 29

Long-time Slashdot reader theodp writes: "Mr. President," Code.org founder Hadi Partovi told President Joe Biden and tech CEOs from Microsoft, Amazon, Google, Apple, and IBM at Wednesday's Presidential Summit on Cybersecurity, "America's cybersecurity problem is an education problem. I loved [Microsoft CEO] Satya Nadella's wonderful analogy to the car industry, and like Satya said, we need standards for seatbelts in every car for sure. But if none of the drivers took a course in basic safety skills, our roads could never, ever be safe. That's the current state of affairs on the roads of the internet. Without proper education, we can't address our nation's weakest link. If you look around, every CEO is nodding their head because they know we need a plan to educate every American on basic cyber security hygiene, and also a plan to staff up our cyber defense workforce. This needs to start early, in K-12, and reach everybody."

A newly-released White House Fact Sheet announcing "Ambitious Initiatives to Bolster the Nation's Cybersecurity" notes that tech-bankrolled "Code.org announced it will teach cybersecurity concepts to over 3 million students across 35,000 classrooms over 3 years, to teach a diverse population of students how to stay safe online, and to build interest in cybersecurity as a potential career."
Security

How Microsoft, Google, Apple, and IBM Will Help the US Improve Its Cybersecurity (infosecurity-magazine.com) 19

Infosecurity magazine reports: Some of the world's biggest tech companies have committed tens of billions of dollars to improving supply chain security, closing industry skills gaps and driving security awareness among the public, according to the White House.

As reported by Infosecurity, the Biden administration welcomed the CEOs of Microsoft, Apple, Google, IBM and others to a meeting yesterday to discuss the "whole-of-nation" effort needed to address cybersecurity threats. The result of that encounter has been a series of commitments from these firms, including $10bn from Google over the next five years to expand zero trust and improve supply chain and open source security. The tech giant will apparently also help 100,000 Americans earn "digital skills certificates."

IBM said it would train 150,000 people in cyber skills over the coming three years and focus on improving the diversity of the security workforce, while Microsoft has committed $20bn over five years to drive security by design, and $150m for federal, local and state governments. Apple will establish a new program to improve supply chain security, including among its 9,000 US suppliers, with multi-factor authentication (MFA), vulnerability remediation, event logging and incident response all playing a key role. Amazon is making MFA devices available to all AWS customers and rolling out the security training it offers employees to the general public.

Aside from these commitments, the White House announced the expansion of its Industrial Control Systems Cybersecurity Initiative, from the electricity sector to natural gas pipelines, and said the National Institute of Standards and Technology (NIST) would develop a new framework for supply chain security. In another potentially significant move, insurer Resilience said it would require policyholders to meet a threshold of cybersecurity best practice as a condition of receiving coverage — something experts have been demanding for some time across the industry.

NextGov.com also quotes the president's remarks about a cybersecurity executive order issued May 12th: "Because of that order, government will only buy tech products that meet certain cybersecurity standards, which will have a ripple effect across the software industry, in our view, ultimately improving security for all Americans."
IBM

After 18 Years, SCO's IBM Litigation May Be Settled for $14.5 Million (scribd.com) 151

Slashdot has confirmed with the U.S. Bankruptcy Court for the District of Delaware that after 18 years of legal maneuvering, SCO's bankruptcy case (first filed in 2007) is now "awaiting discharge."

Long-time Slashdot reader rkhalloran says they know the reason: Papers filed 26 Aug by IBM & SCOXQ in U.S. Bankruptcy Court in Delaware for a proposed settlement, Case 07-11337-BLS Doc 1501:

By the Settlement Agreement, the Trustee has reached a settlement with IBM that resolves all of the remaining claims at issue in the Utah Litigation (defined below). The Settlement Agreement is the culmination of extensive arm's length negotiation between the Trustee and IBM.

Under the Settlement Agreement, the Parties have agreed to resolve all disputes between them for a payment to the Trustee, on behalf of the Estates, of $14,250,000. For the reasons set forth more fully below, the Trustee submits the Settlement Agreement and the settlement with IBM are in the best interests of the Estates and creditors, are well within the range of reasonableness, and should be approved.

The proposed order would include "the release of the Estates' claims against IBM and vice versa" (according to this PDF attributed to SCO Group and IBM uploaded to scribd.com). And one of the reasons given for the proposed settlement? "The probability of the ultimate success of the Trustee's claims against IBM is uncertain," according to an IBM/SCO document on Scribd.com titled Trustee's motion: For example, succeeding on the unfair competition claims will require proving to a jury that events occurring many years ago constituted unfair competition and caused SCO harm. Even if SCO were to succeed in that effort, the amount of damages it would recover is uncertain and could be significantly less than provided by the Settlement Agreement. Such could be the case should a jury find that (1) the amount of damage SCO sustained as a result of IBM's conduct is less than SCO has alleged, (2) SCO's damages are limited by a $5 million damage limitation provision in the Project Monterey agreement, or (3) some or all of IBM's Counterclaims, alleging millions of dollars in damages related to IBM's Linux activities and alleged interference by SCO, are meritorious.

Although the Trustee believes the Estates would ultimately prevail on claims against IBM, a not insignificant risk remains that IBM could succeed with its defenses and/or Counterclaims...

The U.S. Bankruptcy Court for the District of Delaware told Slashdot that the first meeting of the creditors will be held on September 22nd, 2021.
Businesses

Tim Cook Gets $750 Million Bonus On 10th Anniversary As Apple CEO (cnn.com) 63

Tim Cook celebrated 10 years as Apple CEO by collecting and selling off three quarters of a billion dollars' worth of stock. CNN reports: The transactions were revealed in a regulatory filing Thursday, which showed that Cook had acquired and sold more than 5 million shares of the iPhone maker. As head of the world's most valuable company, Cook has received lofty stock awards in recent years. One of the incentives was tied to Apple's performance in the S&P 500 over the past three years. The stock award was triggered this week because the firm was one of the index's top performers, generating shareholder returns of nearly 192% from August 2018 to 2021, it said in a filing. Cook's windfall came just days after he also donated 70,000 Apple shares (worth about $10 million) to charity, according to a separate regulatory filing Tuesday. It did not disclose the name of the recipient.

Cook joined Apple in 1998 and served in a variety of senior roles before assuming his current position, including chief operating officer and executive vice president of worldwide sales and operations. He was named CEO in August 2011, after co-founder Steve Jobs famously stepped down. Jobs died weeks later from complications of pancreatic cancer. Prior to joining Apple, Cook worked at Compaq and IBM (IBM).

Intel

Intel Inks Deal with Department of Defense To Support Domestic Chip-Building Ecosystem (techcrunch.com) 28

Intel has signed a deal with the Department of Defense to support a domestic commercial chip-building ecosystem. The chipmaker will lead the first phase of a program called Rapid Assured Microelectronics Prototypes - Commercial (RAMP-C), which aims to bolster the domestic semiconductor supply chain. From a report: The chipmaker's recently launched division, Intel Foundry Services, will lead the program. As part of RAMP-C, Intel will partner with IBM, Cadence, Synopsys and others to establish a domestic commercial foundry ecosystem. Intel says the program was designed to create custom integrated circuits and commercial products required by the Department of Defense's systems. "The RAMP-C program will enable both commercial foundry customers and the Department of Defense to take advantage of Intel's significant investments in leading-edge process technologies," said Randhir Thakur, president of Intel Foundry Services, in a statement. "Along with our customers and ecosystem partners, including IBM, Cadence, Synopsys and others, we will help bolster the domestic semiconductor supply chain and ensure the United States maintains leadership in both R&D and advanced manufacturing."
Technology

Mastercard To Become First Payments Network To Phase Out Magnetic Stripe (mastercard.com) 125

Mastercard, writing in a blog post: In the early age of modern credit cards, merchants had to write down account information for each card-carrying customer by hand. Later, they used flatbed imprinting machines to record the card information on carbon paper packets, the swiping sound of the handle earning them the name "zip-zap machines." (They were also dubbed "knuckle-busters" by the unfortunate clerks who skinned their fingers on the embossing plate.) And how could clerks tell whether the customer was good for the purchase? They couldn't. Credit card companies would circulate a list of bad account numbers each month, and the merchant would have to compare the customers' cards against the list.

The arrival of the magnetic stripe changed all that. An early 1960s innovation largely credited to IBM, the magnetic stripe allowed banks to encode card information onto magnetic tape laminated to the back. It paved the way for electronic payment terminals and chip cards, offering more security and real-time authorization while making it easier for businesses of all sizes to accept cards. That thin stripe has remained a fixture on billions of payment cards for decades, even as technology has evolved. But now the magnetic stripe is reaching its expiration date with Mastercard becoming the first payments network to phase it out.
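One concrete detail behind card-number handling, then and now (an aside not from the Mastercard post): payment card numbers carry a Luhn check digit, so a terminal can reject a misread or mistyped number before any authorization is attempted. A minimal sketch:

```python
def luhn_valid(number: str) -> bool:
    """Luhn check-digit validation, as used by payment card numbers."""
    digits = [int(ch) for ch in number if ch.isdigit()]
    total = 0
    # Walk from the rightmost digit; double every second digit,
    # subtracting 9 whenever the doubled value exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4539148803436467"))  # True  -- a Luhn-valid example number
print(luhn_valid("4539148803436468"))  # False -- last digit corrupted
```

The check catches single-digit errors and most adjacent transpositions; it says nothing about whether an account is in good standing, which is what the real-time authorization described above added.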

The shift away from the magnetic stripe reflects both consumers' changing payment habits and the development of newer technologies. Today's chip cards are powered by microprocessors that are much more capable and secure, and many are also embedded with tiny antennae that enable contactless transactions. Biometric cards, which combine fingerprints with chips to verify a cardholder's identity, offer another layer of security. Based on the decline in payments powered by magnetic stripes after chip-based payments took hold, newly-issued Mastercard credit and debit cards will not be required to have a stripe starting in 2024 in most markets. By 2033, no Mastercard credit and debit cards will have magnetic stripes, which leaves a long runway for the remaining partners who still rely on the technology to phase in chip card processing.

Hardware

Nokia's Smartphone: 25 Years Since it Changed the World (dw.com) 17

The Nokia 9000 Communicator -- "the office in your back pocket" -- was a smartphone even before the word was invented. It has been 25 years since it revolutionized the market. DW: Nokia presented its 9000 Communicator at the CeBIT 1996 computer fair in Hanover, Germany, and launched it on August 15 of that year. "The office in your back pocket" followed the IBM Simon from 1994 and the HP OmniGo 700LX from March 1996. The 9000 Communicator was a smartphone even before the word had been invented. For a decade, the device was what a smartphone was supposed to look like. After the Communicator, BlackBerry perfected the idea -- until Apple's iPhone, with its multitouch screen, came along in 2007.

Opened like a minilaptop, with a keyboard and a black-and-white display with a diagonal of just 11.5 centimeters (4.5 inches), the retrofuturistic-looking device was made famous by actor Val Kilmer in the remake of the film The Saint. The 9000 Communicator was the first device to offer a combination of keyboard, quality screen, and business and internet software in one package. It had for the first time all of the features of a computer on a phone, putting email, web browsing, fax, word processing and spreadsheets into a single pocketable device.

Hardware

Samsung is Using AI to Design a Smartphone Chip. Will Others Follow? (arstechnica.com) 60

"Samsung is using artificial intelligence to automate the insanely complex and subtle process of designing cutting-edge computer chips," reports Wired: The South Korean giant is one of the first chipmakers to use AI to create its chips. Samsung is using AI features in new software from Synopsys, a leading chip design software firm used by many companies...

Others, including Google and Nvidia, have talked about designing chips with AI. But Synopsys' tool, called DSO.ai, may prove the most far-reaching because Synopsys works with dozens of companies. The tool has the potential to accelerate semiconductor development and unlock novel chip designs, according to industry watchers. Synopsys has another valuable asset for crafting AI-designed chips: years of cutting-edge semiconductor designs that can be used to train an AI algorithm. A spokesperson for Samsung confirms that the company is using Synopsys AI software to design its Exynos chips, which are used in smartphones, including its own branded handsets, as well as other gadgets...

Chipmakers including Nvidia and IBM are also dabbling in AI-driven chip design. Other makers of chip-design software, including Cadence, a competitor to Synopsys, are also developing AI tools to aid with mapping out the blueprints for a new chip.

But Synopsys's co-CEO tells Wired that Samsung's chip will be "the first of a real commercial processor design with AI."
IBM

The IBM PC Turns 40 (theregister.com) 117

The Register's Richard Speed commemorates the 40th anniversary of the introduction of the IBM Model 5150: IBM was famously late to the game when the Model 5150 (or IBM PC) put in an appearance. The likes of Commodore and Apple pretty much dominated the microcomputer world as the 1970s came to a close and the 1980s began. Big Blue, on the other hand, was better known for its sober, business-orientated products and its eyewatering price tags. However, as its customers began eyeing Apple products, IBM lumbered toward the market, creating a working group that could dispense with the traditional epic lead-times of Big Blue and take a more agile approach. One choice was to use off-the-shelf hardware and software and adopt an open architecture. A significant choice, as things turned out.

Intel's 8088 was selected over the competition (including IBM's own RISC processor) and famously, Microsoft was tapped to provide PC DOS as well as BASIC that was included in the ROM. So this marks the 40th anniversary of PC DOS, aka MS-DOS, too. You can find Microsoft's old MS-DOS source code here. The basic price for the 5150 was $1,565, with a fully loaded system rising to more than $3,000. Users could enjoy high resolution monochrome text via the MDA card or some low resolution graphics (and vaguely nauseating colors) through a CGA card (which could be installed simultaneously.) RAM landed in 16 or 64kB flavors and could be upgraded to 256kB while the Intel 8088 CPU chugged along at 4.77 MHz.

Storage came courtesy of up to two 5.25" floppy disks, and the ability to attach a cassette recorder -- an option swiftly stripped from later models. There was no hard disk, and adding one presented a problem for users with deep enough pockets: the motherboard and software didn't support it and the power supply was a bit weedy. IBM would resolve this as the PC evolved. Importantly, the motherboard also included slots for expansion, which eventually became known as the Industry Standard Architecture (ISA) bus as the IBM PC clone sector exploded. IBM's approach resulted in an immense market for expansion cards and third party software.
While the Model 5150 "sold like hotcakes," Speed notes that it was eventually discontinued in 1987.
Businesses

These People Who Work From Home Have a Secret: They Have Two Jobs (wsj.com) 168

When the pandemic freed employees from having to report to the office, some saw an opportunity to double their salary on the sly. From a report: They were bored. Or worried about layoffs. Or tired of working hard for a meager raise every year. They got another job offer. Now they have a secret. A small, dedicated group of white-collar workers, in industries from tech to banking to insurance, say they have found a way to double their pay: Work two full-time remote jobs, don't tell anyone and, for the most part, don't do too much work, either. Alone in their home offices, they toggle between two laptops. They play "Tetris" with their calendars, trying to dodge endless meetings. Sometimes they log on to two meetings at once. They use paid time off -- in some cases, unlimited -- to juggle the occasional big project or ramp up at a new gig. Many say they don't work more than 40 hours a week for both jobs combined. They don't apologize for taking advantage of a system they feel has taken advantage of them.

[...] Gig work and outsourcing have been on the rise for years. Inflation is now ticking up, chipping away at spending power. Some employees in white-collar fields wonder why they should bother spending time building a career. "The harder that you work, it seems like the less you get," one of the workers with two jobs says. "People depend on you more. My paycheck is the same." Overemployed says it has a solution. "There's no implied lifetime employment anymore, not even at IBM," writes one of the website's co-founders, a 38-year-old who works for two tech companies in the San Francisco Bay Area. The site serves up tips on setting low expectations with bosses, staying visible at meetings and keeping LinkedIn profiles free of red flags. (A "social-media cleanse" is a solid excuse for an outdated LinkedIn profile, it says.) In a chat on the messaging platform Discord, people from around the world swap advice about employment checks and downtime at various brand-name companies.

Medicine

IBM's AI Can Predict How Parkinson's Disease May Progress In Individuals (engadget.com) 7

An anonymous reader quotes a report from Engadget: [R]esearchers from IBM and Michael J. Fox Foundation (MJFF) say they've developed a program that can predict how the symptoms of a Parkinson's disease patient will progress in terms of both timing and severity. In The Lancet Digital Health journal, they claim the software could transform how doctors help patients manage their symptoms by allowing them to better predict how the disease will progress. The breakthrough wouldn't have been possible without the Parkinson's Progression Markers Initiative, a study the Michael J. Fox Foundation sponsored. IBM describes the dataset, which includes information on more than 1,400 individuals, as the "largest and most robust volume of longitudinal Parkinson's patient data to date" and says it allowed its AI model to map out complex symptom and progression patterns.
Bug

Everyone Cites That 'Bugs Are 100x More Expensive To Fix in Production' Research, But the Study Might Not Even Exist (theregister.com) 118

"Software research is a train wreck," says Hillel Wayne, a Chicago-based software consultant who specialises in formal methods, instancing the received wisdom that bugs are way more expensive to fix once software is deployed. Wayne did some research, noting that "if you Google 'cost of a software bug' you will get tons of articles that say 'bugs found in requirements are 100x cheaper than bugs found in implementations.' They all use this chart from the 'IBM Systems Sciences Institute'... There's one tiny problem with the IBM Systems Sciences Institute study: it doesn't exist." The Register: Laurent Bossavit, an Agile methodology expert and technical advisor at software consultancy CodeWorks in Paris, has dedicated some time to this matter, and has a post on GitHub called "Degrees of intellectual dishonesty". Bossavit referenced a successful 1987 book by Roger S Pressman called Software Engineering: A Practitioner's Approach, which states: "To illustrate the cost impact of early error detection, we consider a series of relative costs that are based on actual cost data collected for large software projects [IBM81]." The reference to [IBM81] notes that the information comes from "course notes" at the IBM Systems Sciences Institute. Bossavit discovered, though, that many other publications have referenced Pressman's book as the authoritative source for this research, disguising its tentative nature.

Bossavit took the time to investigate the existence of the IBM Systems Sciences Institute, concluding that it was "an internal training program for employees." No data was available to support the figures in the chart, which shows a neat 100x the cost of fixing a bug once software is in maintenance. "The original project data, if any exist, are not more recent than 1981, and probably older; and could be as old as 1967," said Bossavit, who also described "wanting to crawl into a hole when I encounter bullshit masquerading as empirical support for a claim, such as 'defects cost more to fix the later you fix them'."
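For readers who haven't seen it, the widely copied chart attributed to the "IBM Systems Sciences Institute" is typically reproduced with figures along the following lines. These numbers are shown here only to illustrate the shape of the disputed claim — the exact intermediate multipliers vary between reproductions, and as Bossavit notes, no underlying data has ever surfaced.

```python
# Relative bug-fix costs as they commonly appear in the chart attributed to
# the "IBM Systems Sciences Institute" -- reproduced to illustrate the claim
# under dispute, NOT as validated data.
claimed_cost = {
    "design": 1.0,
    "implementation": 6.5,
    "testing": 15.0,
    "maintenance": 100.0,   # the headline "100x" figure
}

for phase, cost in claimed_cost.items():
    print(f"{phase:15s} {cost:6.1f}x")
```

The tidiness of the progression — each phase a round multiple of the last — is itself one of the tells Bossavit flags when hunting for "leprechauns" in software-engineering folklore.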

IBM

What Ever Happened to IBM's Watson? (nytimes.com) 75

After Watson triumphed on the game show "Jeopardy!" in 2011, its star scientist had to convince IBM that it wasn't a magic answer box, and "explained that Watson was engineered to identify word patterns and predict correct answers for the trivia game."

The New York Times looks at what's happened in the decade since: Watson has not remade any industries. And it hasn't lifted IBM's fortunes. The company trails rivals that emerged as the leaders in cloud computing and A.I. — Amazon, Microsoft and Google. While the shares of those three have multiplied in value many times, IBM's stock price is down more than 10 percent since Watson's "Jeopardy!" triumph in 2011.... The company's missteps with Watson began with its early emphasis on big and difficult initiatives intended to generate both acclaim and sizable revenue for the company, according to many of the more than a dozen current and former IBM managers and scientists interviewed for this article... The company's top management, current and former IBM insiders noted, was dominated until recently by executives with backgrounds in services and sales rather than technology product experts. Product people, they say, might have better understood that Watson had been custom-built for a quiz show, a powerful but limited technology...

IBM insists that its revised A.I. strategy — a pared-down, less world-changing ambition — is working... But the grand visions of the past are gone. Today, instead of being a shorthand for technological prowess, Watson stands out as a sobering example of the pitfalls of technological hype and hubris around A.I. The march of artificial intelligence through the mainstream economy, it turns out, will be more step-by-step evolution than cataclysmic revolution.

One example: IBM technologists approached cancer medical centers, but "were frustrated by the complexity, messiness and gaps in the genetic data at the cancer center... At the end of last year, IBM discontinued Watson for Genomics, which grew out of the joint research with the University of North Carolina. It also shelved another cancer offering, Watson for Oncology, developed with another early collaborator, the Memorial Sloan Kettering Cancer Center..." IBM continued to invest in the health industry, including billions on Watson Health, which was created as a separate business in 2015. That includes more than $4 billion to acquire companies with medical data, billing records and diagnostic images on hundreds of millions of patients. Much of that money, it seems clear, they are never going to get back. Now IBM is paring back Watson Health and reviewing the future of the business. One option being explored, according to a report in The Wall Street Journal, is to sell off Watson Health...

Many outside researchers long dismissed Watson as mainly a branding campaign. But recently, some of them say, the technology has made major strides... The business side of Watson also shows signs of life. Now, Watson is a collection of software tools that companies use to build A.I.-based applications — ones that mainly streamline and automate basic tasks in areas like accounting, payments, technology operations, marketing and customer service. It is workhorse artificial intelligence, and that is true of most A.I. in business today. A core Watson capability is natural language processing — the same ability that helped power the "Jeopardy!" win. That technology powers IBM's popular Watson Assistant, used by businesses to automate customer service inquiries...

IBM says it has 40,000 Watson customers across 20 industries worldwide, more than double the number four years ago. Watson products and services are being used 140 million times a month, compared with a monthly rate of about 10 million two years ago, IBM says. Some of the big customers are in health, like Anthem, a large insurer, which uses Watson Assistant to automate customer inquiries.

"Adoption is accelerating," Mr. Thomas said.

Open Source

Libre-SOC's Open Hardware 180nm ASIC Submitted To IMEC for Fabrication (openpowerfoundation.org) 38

"We're building a chip. A fast chip. A safe chip. A trusted chip," explains the web page at Libre-SOC.org. "A chip with lots of peripherals. And it's VPU. And it's a 3D GPU... Oh and here, have the source code."

And now there's big news, reports long-time Slashdot reader lkcl: Libre-SOC's entirely Libre 180nm ASIC, which can be replicated down to symbolic level GDS-II with no NDAs of any kind, has been submitted to IMEC for fabrication.

It is the first wholly-independent Power ISA ASIC outside of IBM to go to silicon in 12 years. Microwatt went to Skywater 130nm in March; however, it was developed by IBM as an exceptionally well-made reference design, which Libre-SOC used for verification.

Whilst it would seem that Libre-SOC is jumping on the chip-shortage era's innovation bandwagon, Libre-SOC has actually been in development for over three and a half years so far. It even pre-dates the OpenLane initiative, and has the same objectives: fully automated HDL to GDS-II, full transparency and auditability with Libre VLSI tools Coriolis2 and Libre Cell Libraries from Chips4Makers.

With €400,000 in funding from the NLNet Foundation [a long-standing non-profit supporting privacy, security, and the "open internet"], plus an application to NGI Pointer under consideration, the next steps are to continue development of Draft Cray-style Vectors (SVP64) to the already supercomputer-level Power ISA, under the watchful eye of the upcoming OpenPOWER ISA Workgroup.
