Patents

Video How the Open Invention Network Protects Linux and Open Source (Video) 28

This is a Google Hangout interview with Keith Bergelt, Chief Executive Officer of the Open Invention Network (OIN), which was jointly founded by IBM, NEC, Novell, Philips, Red Hat, and Sony to share their relevant patents with all Linux and Open Source developers and users in order to prevent patent troll attacks on FOSS, such as the famous SCO vs. IBM lawsuits that hampered Linux adoption during the early 2000s. It costs nothing to become an OIN licensee, and over 500 companies have done so. Few people know, however, that individual developers and FOSS users are also welcome to become OIN licensees, likewise at no cost. Read their license agreement, sign it, and send it in. That's all it takes. They also buy patents and accept patent donations. And "...if your company is being victimized by any entity seeking to assert its patent portfolio against Linux, please contact us so that we can aid you in your battle with these dark forces." This OIN service is called Linux Defenders 911. We hope you never need to use it, but it's good to know it's there if you do.
Google

Google Patents Staple of '70s Mainframe Computing 333

theodp writes "'The lack of interest, the disdain for history is what makes computing not-quite-a-field,' Alan Kay once lamented. And so it should come as no surprise that the USPTO granted Google a patent Tuesday for the Automatic Deletion of Temporary Files, perhaps unaware that the search giant's claimed invention is essentially a somewhat kludgy variation on file expiration processing, a staple of circa-1970 IBM mainframe computing and subsequent disk management software. From Google's 2013 patent: 'A path name for a file system directory can be "C:temp\12-1-1999\" to indicate that files contained within the file system directory will expire on Dec. 1, 1999.' From Judith Rattenbury's 1971 Introduction to the IBM 360 computer and OS/JCL: 'EXPDT=70365 With this expiration date specified, the data set will not be scratched or overwritten without special operator action until the 365th day of 1970.' Hey, things are new if you've never seen them before!"
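The scheme described in the patent, where a directory's name encodes the expiration date of the files inside it, is simple to sketch. The following is a toy illustration (not Google's or IBM's actual implementation), assuming a hypothetical layout where each subdirectory is named with a month-day-year date:

```python
import shutil
from datetime import date, datetime
from pathlib import Path

def expired_dirs(root: str) -> list[Path]:
    """Find subdirectories whose names encode a past expiration date.

    Assumes a hypothetical layout like temp/12-1-1999/ where the
    directory name is a month-day-year expiration date, loosely
    following the scheme described in the patent.
    """
    expired = []
    for child in Path(root).iterdir():
        if not child.is_dir():
            continue
        try:
            expires = datetime.strptime(child.name, "%m-%d-%Y").date()
        except ValueError:
            continue  # not a date-named directory; leave it alone
        if expires < date.today():
            expired.append(child)
    return expired

def purge(root: str) -> None:
    # Delete every expired directory tree under root.
    for d in expired_dirs(root):
        shutil.rmtree(d)
```

The mainframe EXPDT mechanism quoted above works the same way in principle, except the expiration date lives in catalog metadata rather than in a path name.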
AI

Computers Shown To Be Better Than Docs At Diagnosing, Prescribing Treatment 198

Lucas123 writes "Applying the same technology used for voice recognition and credit card fraud detection to medical treatments could cut healthcare costs and improve patient outcomes by almost 50%, according to new research. Scientists at Indiana University found that using patient data with machine-learning algorithms can drastically improve both the cost and quality of healthcare through simulation modeling. The artificial intelligence models used for diagnosing and treating patients achieved a 30% to 35% increase in positive patient outcomes, the research found. This is not the first time AI has been used to diagnose and suggest treatments. Last year, IBM announced that its Watson supercomputer would be used in evaluating evidence-based cancer treatment options for physicians, driving the decision-making process down to a matter of seconds."
GUI

The History of Visual Development Environments 181

Esther Schindler writes "There was a time when programs were written in text editors. And when competition between C++ vendors was actually fierce. Step into the time travel machine as Andy Patrizio revisits the evolution and impact of the visual development metaphor. 'Visual development in its earliest stages was limited by what the PC could do. But for the IBM PC in the early 1980s, with its single-tasking operating system and 8- or 16-bit hardware, the previous software development process was text edit, compile, write down the errors, and debug with your eyes.' Where do you start? 'While TurboPascal launched the idea of an integrated development environment, [Jeff] Duntemann credits Microsoft's Visual Basic (VB), launched in 1991, with being the first real IDE.'... And yes, there's plenty more." A comment attached to the story lists two IDEs that preceded VB; can you name others?
Education

IBM's Watson Goes To College To Extend Abilities 94

An anonymous reader writes in with news that IBM's Jeopardy-winning supercomputer is going back to school: "A modified version of the powerful IBM Watson computer system, able to understand natural spoken language and answer complex questions, will be provided to Rensselaer Polytechnic Institute in New York, making it the first university to receive such a system. IBM announced Wednesday that the Watson system is intended to enable upstate New York-based RPI to find new uses for Watson and deepen the system's cognitive computing capabilities - for example by broadening the volume, types, and sources of data Watson can draw upon to answer questions."
IBM

Stanford Uses Million-Core Supercomputer To Model Supersonic Jet Noise 66

coondoggie writes "Stanford researchers said this week they had used a supercomputer with 1,572,864 compute cores to predict the noise generated by a supersonic jet engine. 'Computational fluid dynamics simulations test all aspects of a supercomputer. The waves propagating throughout the simulation require a carefully orchestrated balance between computation, memory and communication. Supercomputers like Sequoia divvy up the complex math into smaller parts so they can be computed simultaneously. The more cores you have, the faster and more complex the calculations can be. And yet, despite the additional computing horsepower, the difficulty of the calculations only becomes more challenging with more cores. At the one-million-core level, previously innocuous parts of the computer code can suddenly become bottlenecks.'"
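The "divvy up the complex math into smaller parts" step the quote describes is domain decomposition: the simulation grid is split into one chunk per core, and each time step the cells at chunk boundaries (the halo) must be exchanged with neighboring cores. The sketch below is a toy 1-D illustration of that partitioning, not Stanford's actual code; it shows why boundary communication grows with core count:

```python
def partition(n_cells: int, n_cores: int) -> list[range]:
    """Split a 1-D simulation grid into contiguous chunks, one per core.

    Each core advances only its own chunk; only the cells adjacent to
    a chunk boundary must be exchanged with neighbors each time step.
    """
    base, extra = divmod(n_cells, n_cores)
    chunks, start = [], 0
    for core in range(n_cores):
        size = base + (1 if core < extra else 0)  # spread the remainder
        chunks.append(range(start, start + size))
        start += size
    return chunks

def halo_exchanges(chunks: list[range]) -> int:
    # Each interior boundary needs a two-way exchange of one cell,
    # so communication scales with the number of chunks -- the kind
    # of cost that turns into a bottleneck at a million cores.
    return 2 * (len(chunks) - 1)
```

In a real 3-D fluid code the chunks are blocks and the halos are faces, but the trade-off is the same: more cores means smaller chunks of useful work per core against a growing share of communication.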
AI

IBM's Watson Gets a Swear Filter After Learning the Urban Dictionary 310

redletterdave writes "IBM's super-computer Watson briefly went from smart to smart ass with the help of the Urban Dictionary. According to Eric Brown, an IBM research assistant, he and his 35-person team wanted to get Watson to sound more like a real human. After teaching IBM's super-computer the entire Urban Dictionary, however, Watson simply couldn't distinguish polite discourse from profanity. Watson picked up all of the Urban Dictionary's bad habits, including throwing overly crass language into its responses at random; Watson reportedly even answered one researcher's question with the word 'bullshit.' In the end, Brown and his team were forced to remove the Urban Dictionary from Watson's vocabulary, and additionally developed a smart filter to keep Watson from swearing in the future."
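The article doesn't say how IBM's filter works, but the basic idea of a post-hoc output filter can be sketched in a few lines. This is a minimal, invented illustration (word list and masking policy are my own, not IBM's):

```python
import re

def build_filter(banned: set[str]):
    """Return a function that masks banned words in generated text.

    A toy output filter: compile one case-insensitive pattern over
    the banned vocabulary and replace each whole-word match with
    asterisks of the same length.
    """
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, banned)) + r")\b",
        re.IGNORECASE,
    )

    def clean(text: str) -> str:
        return pattern.sub(lambda m: "*" * len(m.group()), text)

    return clean
```

A production system would need far more than whole-word matching (inflections, spelling variants, phrases whose parts are individually innocent), which is presumably why simply deleting the Urban Dictionary from the training data came first.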
Google

2012 Patent Rankings: IBM On Top, Google Spikes 41

bednarz writes "It's official: IBM has dominated the U.S. patent race for two decades. IBM earned 6,478 utility patents last year, topping the list of patent winners for the 20th year in a row, according to data published today from IFI CLAIMS Patent Services. Samsung was the second most prolific patent winner, with 5,081 patents received in 2012, followed by Canon (3,174), Sony (3,032), Panasonic (2,769), Microsoft (2,613), Toshiba (2,447), Hon Hai Precision Industry (2,013), GE (1,652), and LG Electronics (1,624). Earning its first appearance among the top 50, Google increased its 2012 patent count by 170% to 1,151 patents and landed at 21 in IFI's rankings, up from 65 in 2011. Google narrowly beat Apple, which earned 1,136 patents (an increase of 68%) and landed at 22 in the rankings."
IBM

Change the ThinkPad and It Will Die 347

ErichTheRed writes "Here's an interesting editorial piece about the ThinkPad over at CNN. It mirrors what many ThinkPad devotees have been saying since Lenovo started tweaking the classic IBM design to make the ThinkPad more like a MacBook, Sony or other high-end consumer device. I'm a big fan of these bulletproof, decidedly unsexy business notebooks, and would be unhappy if Lenovo decided to sacrifice build quality for coolness. Quoting: 'Before doing anything drastic, Lenovo would be wise to review the spectacular rise and fall of Blackberry-maker Research in Motion. The mobile handset manufacturer tried to take on Apple by launching a number of products aimed at the retail consumer after the launch of the iPhone. It released the devastatingly bad Blackberry Storm as a response to the iPhone and later the Playbook to take on the iPad. The Storm failed because it was hastily put together in a mad dash and lacked the signature Blackberry QWERTY keyboard ... The Playbook failed because the Blackberry ecosystem had, by the time of its launch, more or less collapsed, making the Playbook just another iPad clone no one wanted. Meanwhile, the original Blackberry was left to wither away as the company focused on chasing Apple and wasn't updated in a meaningful way, making it look just old and tired.'"
Books

Book Review: Burdens of Proof 70

benrothke writes "When the IBM PC first came out 31 years ago, it supported a maximum of 256KB RAM. You can buy an equivalent computer today with substantially more CPU power at a fraction of the price. But in those 31 years, the information security environment in which the PC operates has not progressed accordingly. In Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents, author Jean-François Blanchette observes that the move to a paperless society means that paper-based evidence needs to be recreated in the digital world. It also requires an underlying security functionality to flow seamlessly across organizations, government agencies and the like. While the computing power is there, the ability to create a seamless cryptographic culture is much slower in coming." Keep reading for the rest of Ben's review.
Image

Book Review: Terrible Nerd 66

tgeller writes "It's hard to believe that today's nerdier children will one day bore their grandkids with stories of primitive mobile access, household robotics, and 3-D printers. Some will become rich and famous by latching onto tomorrow's winners; others will find themselves irrelevant as the objects of their obsessions fail in the marketplace. But all with the energy to remember will come away with stories from the dawn of creation. One such witness is Kevin Savetz, a 41-year-old technology journalist and entrepreneur whose new book Terrible Nerd recounts 'true tales of growing up geek' during the '80s computer revolution. It's a rich chronicle that deftly mixes details of his beloved technologies with the zeitgeist of a particular time and space. As such, it's an entertaining read for technologists and non-techies alike." Keep reading for the rest of tgeller's review.
Christmas Cheer

Ask Slashdot: Gifts For a 90-Year-Old, Tech-Savvy Dad? 211

First time accepted submitter Bearhouse writes "My Dad amazes me with (a) his longevity & energy, and (b) his continued ability to mess around with electronics stuff. Since he already has things ranging from valve amps made from war-surplus, via an original IBM PC kit to an Android tablet, I was going to buy him a Raspberry Pi for Christmas. Turns out he's already got one. I saw nothing that really got me excited here, so your ideas would be appreciated."
IBM

IBM Predicts the Next 5 Years of Computing 93

A reader writes "Shaun McGlaun of Slashgear writes: IBM has offered up its annual list of five innovations that will change our lives within five years. IBM calls the list the 'IBM 5 in 5.' The list covers innovations that IBM believes have the potential to change the way people work, live, and interact over the next five years. The five innovations IBM lists this year involve touch, sight, hearing, taste, and smell."
IBM

IBM Creates Commercially Viable, Electronic-Photonic Integrated Chip 71

An anonymous reader writes "After more than a decade of research, and a proof of concept in 2010, IBM Research has finally cracked silicon nanophotonics (or CMOS-integrated nanophotonics, CINP, to give its full name). IBM has become the first company to integrate electrical and optical components on the same chip, using a standard 90nm semiconductor process. These integrated, monolithic chips will allow for cheap chip-to-chip and computer-to-computer interconnects that are thousands of times faster than current state-of-the-art copper and optical networks. Where current interconnects are generally measured in gigabits per second, IBM's new chip is already capable of shuttling data around at terabits per second, and should scale to peta- and exabit speeds."
It's funny.  Laugh.

Boring Conference Still Vows: We Will Not Rock You 42

An anonymous reader writes "The Boring 2012 Conference, the celebration of unexciting things served with dry British humor, now belongs to the wastepaper basket of history. Correspondents at the third annual London conference report that speakers covered a range of such dull topics as supermarket self-service checkouts; a photographic survey of results produced by breakfast toasters; a web site tracking the physical heights of celebrities; and the use of Google Maps to chart the locations of IBM cash registers around London." Funny thing is, the talks described actually sound fascinating.
Cloud

Video How Can Linux Gain (Even) More Enterprise Acceptance? (Video) 177

This is what we asked Jason Perlow. He wrote a Linux Magazine column for many years and now writes for ZDNet. The ZDNet blurb describes him as "a technologist with over two decades of experience integrating large heterogeneous multi-vendor computing environments in Fortune 500 companies." Most recently, he worked for IBM, and for Unisys before that. So Jason knows plenty about Linux and its role in big-time enterprise computing. In this video, he talks about how Linux needs to take another step forward to gain even more enterprise traction in coming years.
IBM

In a Symbolic Shift, IBM's India Workforce Likely Exceeds That In US 491

dcblogs writes "IBM has 112,000 employees in India, up from 6,000 in 2002, with an average wage of about $17,000, according to an internal company document. That wage level may seem shockingly low to U.S. IT workers, but it is in alignment with IT wages in India. The Everest Group said the annual wages generally in India for a software engineer range from $8,000 to $10,000; for a senior software engineer, $12,000 to $15,000; and between $18,000 and $20,000 for a team lead. A project manager may make as much as $31,000. IBM employs about 430,000 globally. According to the Alliance at IBM, the U.S. staff is at about 92,000. It was at 121,000 at the end of 2007, and more in previous years. It has been widely expected over the past year or two that IBM's India workforce was on track to exceed its U.S. workforce, if it hadn't exceeded it already."
Handhelds

Nintendo Wii U Teardown Reveals Simple Design 276

Vigile writes "Nintendo has never been known to be very aggressive with its gaming console hardware, and with today's release (in the U.S.) of the Wii U we are seeing a continuation of that business model. PC Perspective spent several hours last night taking apart a brand new console to reveal a very simple board and platform design topped off with the single multi-chip module that holds the IBM PowerPC CPU and the AMD GPU. The system includes 2GB of GDDR3 memory from Samsung and Foxconn/Hon-Hai-built wireless controllers for WiFi and streaming video to the gamepad. Even though this system is five years newer, many analysts estimate the processing power of Nintendo's Wii U to be just ahead of what you have in the Xbox 360 today."
Patents

Patent System Not Broken, Argues IBM's Chief Patent Counsel 152

New submitter TurinX writes "Unsurprisingly, IBM's Chief Patent Counsel, Manny Schecter, thinks the patent system isn't broken. He says, 'Patent disputes like [the Apple-Samsung case] are a natural characteristic of a vigorously competitive industry. And they're nothing new: Similar skirmishes have historically occurred in areas as diverse as sewing machines, winged flight, agriculture, and telegraph technology. Each marked the emergence of incredible technological advances, and each generated similar outcries about the patent system. We are actually witnessing fewer patent suits per patent issued today than the historical average.'" Regarding software patents, he argues, "If patent litigation caused by the U.S. patent system stifled innovation, U.S. software companies would not be the most successful in the world." His recommendation is that we should be patient and "let the system work." Schecter's editorial at Wired is one of a series of expert opinions on the patent system; we've already discussed Richard Stallman's contribution.
Intel

Cray Unveils XC30 Supercomputer 67

Nerval's Lobster writes "Cray has unveiled a XC30 supercomputer capable of high-performance computing workloads of more than 100 petaflops. Originally code-named 'Cascade,' the system relies on Intel Xeon processors and Aries interconnect chipset technology, paired with Cray's integrated software environment. Cray touts the XC30's ability to utilize a wide variety of processor types; future versions of the platform will apparently feature Intel Xeon Phi and Nvidia Tesla GPUs based on the Kepler GPU computing architecture. Cray leveraged its work with DARPA's High Productivity Computing Systems program in order to design and build the XC30. Cray's XC30 isn't the only supercomputer aiming for that 100-petaflop crown. China's Guangzhou Supercomputing Center recently announced the development of a Tianhe-2 supercomputer theoretically capable of 100 petaflops, but that system isn't due to launch until 2015. Cray also faces significant competition in the realm of super-computer makers: it only built 5.4 percent of the systems on the Top500 list, compared to IBM with 42.6 percent and Hewlett-Packard with 27.6 percent."
