An anonymous reader writes: MIT Technology Review reports, "Like other federal agencies, the NSA is compelled by law to try to commercialize its R&D. It employs patent attorneys and has a marketing department that is now trying to license inventions
... The agency claims more than 170 patents ... But the NSA has faced severe challenges trying to keep up with rapidly changing technology. ... Most recently, the NSA’s revamp included a sweeping effort to dismantle ... “stovepipes,” and switch to flexible cloud computing ...

In 2008, NSA brass ordered the agency’s computer and information sciences research organization to create a version of the system Google uses to store its index of the Web and the raw images of Google Earth. That team was led by Adam Fuchs, now Sqrrl’s chief technology officer. Its twist on big data was to add “cell-level security,” a way of requiring a passcode for each data point ... that’s how software (like the infamous PRISM application) knows what can be shown only to people with top-secret clearance. Similar features could control access to data about U.S. citizens. “A lot of the technology we put [in] is to protect rights,” says Fuchs.

Like other big-data projects, the NSA team’s system, called Accumulo, was built on top of open-source code because “you don’t want to have to replicate everything yourself.” ... In 2011, the NSA released 200,000 lines of code to the Apache Foundation. When Atlas Venture’s Lynch read about that, he jumped: here was a technology already developed, proven to work on tens of terabytes of data, and with security features sorely needed by heavily regulated health-care and banking customers. ... Eventually, Fuchs and several others left the NSA, and now their company is part of a land grab in big data ..."
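To make the “cell-level security” idea concrete: in Accumulo, every key/value cell carries a visibility label, and a scan only returns cells whose labels are satisfied by the reader’s authorizations. Below is a minimal, hypothetical Python sketch of that filtering model, not Accumulo’s actual Java API; it also simplifies the labels to a set where every label is required, whereas real Accumulo visibility expressions support `&`, `|`, and parentheses (e.g. `secret&(US|EU)`).

```python
# Toy model of cell-level security: each cell stores a visibility label set,
# and a scan filters cells against the reader's authorizations.
# Simplification of Apache Accumulo's model: here a label set means
# "ALL of these labels are required to view the cell".

from dataclasses import dataclass


@dataclass(frozen=True)
class Cell:
    row: str
    column: str
    visibility: frozenset  # labels all required to view this cell
    value: str


def scan(cells, authorizations):
    """Return only the cells the reader's authorizations cover."""
    auths = frozenset(authorizations)
    return [c for c in cells if c.visibility <= auths]


cells = [
    Cell("alice", "phone", frozenset({"secret"}), "555-0100"),
    Cell("alice", "notes", frozenset({"topsecret"}), "..."),
    Cell("bob", "phone", frozenset(), "555-0101"),  # unlabeled: public cell
]

# A reader with only 'secret' clearance never sees the 'topsecret' cell.
visible = scan(cells, {"secret"})
print([(c.row, c.column) for c in visible])
# → [('alice', 'phone'), ('bob', 'phone')]
```

The point of pushing the check down to the cell, rather than the table or row, is that one table can safely mix data at many classification levels, which is also why the same mechanism could gate access to data about U.S. citizens, as Fuchs suggests.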