Wall also touched on the long delay for the release of Perl 6. "In the year 2000, we said 'Maybe it's time to break backward compatibility, just once. Maybe we can afford to do that, get off the worse-is-worse cycle, crank the thing once for a worse-is-better cycle.'" The development team received a whopping 361 suggestions -- and was also influenced by Paul Graham's essay on the 100-year language. "We put a lot of these ideas together and thought really hard, and came up with a whole bunch of principles in the last 15 years." Among the pithy principles: "Give the user enough rope to shoot themselves in the foot, but hide the rope in the corner," and "Encapsulate cleverness, then reuse the heck out of it."
But Wall emphasized the flexibility and multi-paradigm nature that they finally implemented in Perl 6. "The thing we really came up with was... There really is no one true language. Not even Perl 6, because Perl 6 itself is a braid of sublanguages -- slangs for short -- and they interact with each other, and you can modify each part of the braid..."
Wall even demoed a sigil-less style, and argued that Perl 6 was everything from "expressive" and "optimizable" to "gradually-typed" and "concurrency aware," while supporting multiple virtual machines. He also noted that Perl 6 borrows powerful features from other languages, including Haskell (lazy evaluation), Smalltalk (traits), Go (promises and channels), and C# (functional reactive programming).
And towards the end of the interview Wall remembered how the original release of Perl was considered by some to be a violation of the Unix philosophy of doing one thing and doing it well. "I was already on my rebellious slide into changing the world at that point."
The researchers concentrated on posts relevant to Java security, from both software engineering and security perspectives, and on posts addressing questions tied to Spring Security, a third-party Java framework that provides authentication, authorization and other security features for enterprise applications... Developers are frustrated when they have to spend too much time figuring out the correct usage of APIs, and often end up choosing completely insecure-but-easy fixes such as using obsolete cryptographic hash functions, disabling cross-site request forgery protection, trusting all certificates in HTTPS verification, or using obsolete communication protocols. "These poor coding practices, if used in production code, will seriously compromise the security of software products," the researchers pointed out.
The researchers blame "the rapidly increasing need for enterprise security applications, the lack of security training in the software development workforce, and poorly designed security libraries." Among their suggested solutions: new developer tools that can recognize security errors and suggest patches.
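One of the insecure shortcuts the researchers call out -- trusting all certificates in HTTPS verification -- looks roughly like this in Java. This is an illustrative sketch of the anti-pattern, not code from the study:

```java
import javax.net.ssl.X509TrustManager;
import java.security.cert.X509Certificate;

public class TrustAllExample {
    // The "trust all certificates" anti-pattern: a TrustManager whose
    // check methods validate nothing, silently disabling HTTPS server
    // verification and enabling man-in-the-middle attacks.
    static final X509TrustManager TRUST_ALL = new X509TrustManager() {
        @Override
        public void checkClientTrusted(X509Certificate[] chain, String authType) {
            // accepts any client certificate -- no validation at all
        }

        @Override
        public void checkServerTrusted(X509Certificate[] chain, String authType) {
            // accepts any server certificate -- never throws
        }

        @Override
        public X509Certificate[] getAcceptedIssuers() {
            return new X509Certificate[0];
        }
    };

    public static void main(String[] args) {
        // The manager "trusts" even an empty, unsigned certificate chain:
        TRUST_ALL.checkServerTrusted(new X509Certificate[0], "RSA");
        System.out.println("accepted");
    }
}
```

Installed into an SSLContext, a manager like this makes every HTTPS connection accept forged certificates -- which is why the researchers single the practice out as one that "will seriously compromise the security of software products."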
Read on for a trip down memory lane.
Update: Slashdot founder CmdrTaco has taken to Medium with some of his own Slashdot nostalgia.
Now that I'm infected with the idea of Open Source hardware, I'm wondering if the Slashdot community could suggest a few more things to get for a beginner in electronics with experience in programming and a basic understanding of machine learning methods. I was looking at the OpenBCI project [Open Brain Computer Interface], which seems like an interesting piece of hardware, but because of the steep price tag and the lack of reviews or blog posts on the internet, I decided to look for something else.
Leave your best answers in the comments. What's the best open source hardware to tinker with?
Chandan Sen, a physiologist at Ohio State, and his colleagues developed a genetic cocktail that rapidly converts skin cells into endothelial cells -- the main component of blood vessels. They then used their technique on mice whose legs had been damaged by a severed artery that cut off blood supply. New blood vessels formed, blood flow increased, and after three weeks the legs had completely healed.
An anonymous Slashdot reader reports that "This challenge sat around, gathering upvotes but no answer, for four years. Then, it was answered." Citing the work of seven contributors, a massive six-part response says their solution took one and a half years to create, and "began as a quest but ended as an odyssey." The team created their own assembly language, known as QFTASM (Quest for Tetris Assembly) for use within Conway's mathematical universe, and then also designed their own processor architecture, and eventually even a higher-level language that they named COGOL. Their StackExchange response includes a link to all of their code on GitHub, as well as to a page where you can run the code online.
One StackExchange reader hailed the achievement as "the single greatest thing I've ever scrolled through while understanding very little."
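For context, the "mathematical universe" the team built inside is governed by just a few neighbor-count rules. A minimal sketch of one Life update step (not the team's code, which works the other way around, implementing circuits *inside* Life):

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life (B3/S23) on a set of live cells."""
    # Count how many live neighbors each candidate cell has.
    neighbors = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives next generation if it has exactly 3 live neighbors,
    # or exactly 2 and is already alive.
    return {cell for cell, n in neighbors.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates with period 2:
blinker = {(0, 0), (1, 0), (2, 0)}
assert life_step(life_step(blinker)) == blinker
```

Everything in the QFTASM stack -- wires, gates, registers -- is ultimately patterns of live cells evolving under exactly this rule.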
Developers who mistyped the package name loaded the malicious libraries in their software's setup scripts. "These packages contain the exact same code as their upstream package thus their functionality is the same, but the installation script, setup.py, is modified to include a malicious (but relatively benign) code," NBU explained. Experts say the malicious code only collected information on infected hosts, such as name and version of the fake package, the username of the user who installed the package, and the user's computer hostname. Collected data, which looked like "Y:urllib-1.21.1 admin testmachine", was uploaded to a Chinese IP address. NBU officials contacted PyPI administrators last week, who removed the packages before a security advisory was published on Saturday.
The advisory lays some of the blame on Python's 'pip' tool, which executes arbitrary code during installations without requiring a cryptographic signature.
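The mechanism is easy to see: setup.py is ordinary Python that pip executes at install time, so any top-level statement runs with the installing user's privileges. A harmless simulation of that execution (the payload below is invented for illustration; only the package name echoes the report):

```python
import os
import platform

# Stand-in for a typosquatted package's setup.py. pip effectively
# exec()s this file when installing from source, so the top-level
# "payload" runs before any package code is ever imported.
MALICIOUS_SETUP_PY = """
import os, platform

collected = {
    "package": "urllib-1.21.1",                  # fake package name/version
    "user": os.environ.get("USER", "unknown"),   # who ran the install
    "host": platform.node(),                     # on which machine
}
print("install-time code ran for", collected["package"])
"""

# Simulate what happens during `pip install`:
namespace = {}
exec(compile(MALICIOUS_SETUP_PY, "setup.py", "exec"), namespace)
```

The real packages uploaded the equivalent of `collected` to a remote server; nothing in the install flow requires a signature before that code runs.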
Ars Technica also reports that another team of researchers "was able to seed PyPI with more than 20 libraries that are part of the Python standard library," and that group now reports they've already received more than 7,400 pingbacks.
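A defensive tool of the kind these incidents invite would flag near-miss names before installation. A toy sketch using the standard library's difflib -- the popular-package list and similarity cutoff are invented for illustration:

```python
import difflib

# Toy typosquat check: flag requested names that closely resemble, but
# do not exactly match, well-known packages.
POPULAR = ["urllib3", "requests", "numpy", "setuptools"]

def likely_typosquat(name, known=POPULAR, cutoff=0.8):
    """Return the popular package `name` most resembles, or None if it
    is an exact match or resembles nothing on the list."""
    if name in known:
        return None  # exact match: the real package
    matches = difflib.get_close_matches(name, known, n=1, cutoff=cutoff)
    return matches[0] if matches else None

assert likely_typosquat("urlib3") == "urllib3"   # near miss: suspicious
assert likely_typosquat("requests") is None      # exact match: fine
```

Real-world checks would need a much larger corpus and tuning to avoid false positives, but the core idea -- string similarity against known-good names -- is this simple.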
- Mashable: "Physically, it's expected to be about the same size as an iPhone 7, but with an edge-to-edge OLED display that's bigger than what is currently on the iPhone 7 Plus. It won't have a home button or Touch ID, and will likely use some kind of facial recognition tech to unlock."
- MacRumors cites a report from KGI Securities analyst Ming-Chi Kuo suggesting facial recognition may just be one feature of a complex front camera with 3D sensing hardware, including a proximity sensor, ambient light sensor, and a structured light transmitter (using a surface-emitting laser) and receiver.
- CNET: "Irish iPhone programming guru Steve Troughton-Smith now feels sure he has the names of the three phones to be launched by Apple on Tuesday.... they'll (probably) be called the iPhone 8, iPhone 8 Plus and -- ta-da -- the iPhone X."
- Troughton-Smith also predicts a 3x screen at 1125x2436 resolution
- Fortune: "Apple's iPhone line is expected to catch up with Android phones in the area of wireless charging this year... just lay the phone down on a compatible charger mat or base or dock, and watch the battery fill up."
- 9to5Mac: "We've found a brand new feature called 'Animoji', which uses the 3D face sensors to create custom 3D animated emoji based on the expressions you make into the camera. Users will be able to make Animoji of unicorns, robots, pigs, pile of poo and many more."
Does Python show a similar growth in the rest of the world, in countries like India, Brazil, Russia and China? Indeed it does. Outside of high-income countries Python is still the fastest growing major programming language; it simply started at a lower level and the growth began two years later (in 2014 rather than 2012). In fact, the year-over-year growth rate of Python in non-high-income countries is slightly higher than it is in high-income countries... We're not looking to contribute to any "language war." The number of users of a language doesn't imply anything about its quality, and certainly can't tell you which language is more appropriate for a particular situation. With that perspective in mind, however, we believe it's worth understanding what languages make up the developer ecosystem, and how that ecosystem might be changing. This post demonstrated that Python has shown a surprising growth in the last five years, especially within high-income countries.
The post was written by Stack Overflow data scientist David Robinson, who notes that "I used to program primarily in Python, though I have since switched entirely to R."
Apple's upcoming iOS 11 once again demonstrates how far ahead of its time WebOS really was. The yet-to-be-released Apple mobile system has essentially copied the WebOS model for switching apps by having the user swipe upward from the bottom to reveal several "cards" that represent background applications. While Apple's decision to remove its massively overworked Home button is an improvement, it is still an inferior way of switching apps, compared to what you could do on WebOS eight years ago.
On mobile, where the majority of the world's content is now consumed, Google and Facebook own eight of the top 10 apps, with apps devouring 87% of our time spent on smartphones and tablets, according to new comScore data. For that remaining 13% of time spent on the mobile web, Google and Apple offer the two dominant browsers... the majority of our time online is now mediated by just a few megacorporations, and for the most part their top incentive is to borrow our privacy just long enough to target an ad at us. Then there's Mozilla, an organization whose mantra is "Internet for people, not profit." That feels like a necessary voice to add to today's internet oligopoly, but it's not one we're hearing... We clearly need an organization standing up for web freedom, as expecting Google to do that is like asking the fox to guard the henhouse. Google does many great things, but its clear incentive is to sell ads. We are Google's product, as the saying goes.
The article applauds the Mozilla-sponsored Rust programming language as promising, "but not to save the web from the all-consuming embrace of Facebook and Google, especially as they wall off the experience in apps..." "If I sound like I don't know what to propose Mozilla should do, it's because I don't. I simply feel strongly that the role Mozilla played in the early browser wars needs to be resurrected to save the web today."
In one sense, the failure of coding bootcamps reflects the near-universal failure of for-profit universities, colleges, and charter schools to provide a usable education. In another sense, though, coding bootcamps represent a profound misunderstanding of what computer programming is all about... Coding at the professional level is highly specialized and requires years of practice to master... the idea of a bootcamp for coding is just as practical as the idea of a bootcamp for surgery.
So what comes next? The future of application development depends on using artificial intelligence within the continuous delivery model... We're at the precipice of a new world of AI-aided development that will kick software deployment speeds -- and therefore a company's ability to compete -- into high gear. "AI can improve the way we build current software," writes Diego Lo Giudice of Forrester Research in a recent report. "It will change the way we think about applications -- not programming step by step, but letting the system learn to do what it needs to do -- a new paradigm shift." The possibilities are limited only by our creativity and the investment organizations are willing to make.
The article was written by the head of R&D at Rainforest QA, which is already using AI to manage their crowdsourced quality assurance testing. But he ultimately predicts bigger roles for AI in continuous delivery development -- even choosing which modifications to use in A/B testing, and more systematic stress-testing.