"The encouraging news is that 59% recognize that they are still maturing, indicating that they do not intend to plateau where they are." And agile adoption does appear to be growing. "25% of the respondents say that all or almost all of their teams are agile, whereas only 8% reported that in 2016."
The researchers also note "the recognized necessity of accelerating the speed of delivery of high-quality software, and the emphasis on customer satisfaction," with 71% of the survey respondents reporting that a DevOps initiative is underway or planned for the next 12 months.
Linus Torvalds was able to leverage the enthusiasm of the Internet to make Linux exist, but 1991 was a more innocent time. How does it work today? Any thoughts?
Or, to put it another way, how can you build a community to bring your ideas to light? How can you make your own vaporware real? Leave your best thoughts and suggestions in the comments.
Some venerable names that have stood the test of time: The Love Bug, for the worm that attacked millions of Windows personal computers in 2000, and Y2K, a turn-of-the-century programming scare that didn't live up to its hype. Many names tend more toward geekspeak. The title of hacker magazine 2600 is a tip of the hat to 2600 hertz, the frequency old-school hackers reproduced to trick AT&T phone lines into giving them free calls. Computer worm Conficker is an amalgam of "configure" and a German expletive. Code Red is named after the Mountain Dew drink researchers guzzled while investigating the worm.
Forty years later the situation does not appear to have changed: the Target and Equifax breaches, the ransomware epidemic, and similar failures show pathetically bad IT design and operation. Why does this pattern of underinvestment in and under-appreciation of IT continue?
Long-time Slashdot reader dheltzel argues that the problem is actually bad hiring practices, which over time lead to lower-quality employees. But it seems like Slashdot's readership should have their own perspective on the current state of the modern workplace.
So share your own thoughts and experiences in the comments. Are companies under-investing in IT?
To improve performance further, Ruby is introducing JIT (just-in-time) compilation, a technique already used by the JVM and other language runtimes. "We've created a prototype of this JIT compiler so that this year, probably on Christmas Day, Ruby 2.6 will be released," Matz confirmed. You can try the initial implementation of the MJIT compiler in the 2.6 preview1 by running Ruby with the --jit flag... Probably the clearest overview explanation of how MJIT works is supplied by Shannon Skipper: "With MJIT, certain Ruby YARV instructions are converted to C code and put into a .c file, which is compiled by GCC or Clang into a .so dynamic library file. The RubyVM can then use that cached, precompiled native code from the dynamic library the next time the RubyVM sees that same YARV instruction."
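Since MJIT brings Ruby a technique the JVM has used for years, one way to get an intuition for what JIT compilation means in practice is to watch HotSpot compile hot methods to native code. A minimal sketch (the class name is mine; -XX:+PrintCompilation is a standard HotSpot flag that logs each method as it is JIT-compiled, analogous to what MJIT does for Ruby's YARV instructions):

    // Run with: java -XX:+PrintCompilation JitDemo
    // HotSpot logs each method as it is compiled from bytecode to native code.
    public class JitDemo {
        static long sum(int n) {
            long s = 0;
            for (int i = 0; i < n; i++) s += i;
            return s;
        }

        public static void main(String[] args) {
            long total = 0;
            // Call sum() repeatedly so the JIT considers it hot and compiles it.
            for (int i = 0; i < 20_000; i++) total += sum(1_000);
            System.out.println(total);
        }
    }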
Ruby creator Yukihiro Matsumoto says Ruby 3.0 "has a goal of being three times faster than Ruby 2.0," and TechRadar reports that it's obvious that Matsumoto "will do anything he can to enable Ruby to survive and thrive..."
And "he's thoroughly enjoying himself doing what he does... and his outlook is quite simple: Programming is fun, he's had fun for the last 25 years making Ruby, and at the age of 52 now, he hopes that he'll get to spend the next 25 years having as much fun working on the language he dreamt up and wrote down in a (now lost) notebook at the age of 17."
"We want Ruby to be the language that is around for a long time and people still use," Matsumoto tells another interviewer, "not the one people used to use."
What do you think, and what are your recent experiences with exams at universities? Is this still standard? What's the point, besides annoying students? Did I miss something?
A similar question was asked on Slashdot 16 years ago -- but apparently nothing has changed since 2002.
Leave your best answers in the comments. Should coding exams be given on paper?
When was the last time you had an outage because the UI didn't work right? I can't count the number of outages resulting from inexperienced developers introducing a bug in the business logic or middle tier. Am I correct in assuming that the shops that are always looking for full-stack developers just aren't grown up yet?
sjames (Slashdot reader #1,099) responded that "They are a thing, but in order to have comprehensive experience in everything involved, the developer will almost certainly be older than HR departments in 'the valley' like to hire."
And Dave Ostrander argues that "In the last 10 years front-end software development has gotten really complex. Gulp, Grunt, Sass, 35+ different mobile device screen sizes, and 15 major browsers to code for have made the front-end skillset very valuable." The original submitter counters that front-end development "is a much simpler domain," leading to its own discussion.
Share your own thoughts in the comments. Are "full-stack" developers a thing?
The dispute is over pre-written directions known as application programming interfaces, or APIs, which can work across different types of devices and provide the instructions for things like connecting to the internet or accessing certain types of files. By using the APIs, programmers don't have to write new code from scratch to implement every function in their software or change it for every type of device. The case has divided Silicon Valley for years, testing the boundaries between the rights of those who develop interface code and those who rely on it to develop software programs.
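For a concrete sense of what relying on an API looks like, here is a minimal Java example (the class name is illustrative): the programmer calls the sorting method that the java.util API declares rather than writing sorting code from scratch, and the same call works wherever Java runs.

    import java.util.Arrays;

    public class ApiExample {
        public static void main(String[] args) {
            int[] scores = { 42, 7, 19 };
            // Arrays.sort is part of Java's core API: the caller depends only on
            // the declared interface, not on the library's internal implementation.
            Arrays.sort(scores);
            System.out.println(Arrays.toString(scores)); // prints [7, 19, 42]
        }
    }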
etcd is a database used by computing clusters to store and exchange passwords and configuration settings between servers and applications over the network. With the default settings, its programming interface can return administrative login credentials without any authentication upfront... All of the data security researcher Giovanni Collazo harvested from around 1,500 servers totals about 750MB... Collazo advises that anyone maintaining etcd servers should enable authentication, set up a firewall, and take other security measures.
Another security researcher independently verified the results and reported that one MySQL database had the root password "1234".
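For anyone auditing their own deployment, here is a rough sketch of the single unauthenticated request at issue. The host name is a placeholder; 2379 is etcd's default client port, and /v2/keys/?recursive=true is the v2 API's recursive key listing. Only run this against servers you operate.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class EtcdExposureCheck {
        public static void main(String[] args) throws Exception {
            // One GET request: no token, no password.
            URL url = new URL("http://YOUR-ETCD-HOST:2379/v2/keys/?recursive=true");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                // If this prints your keys, the server is exposed: enable
                // authentication and firewall the client port.
                String line;
                while ((line = in.readLine()) != null) System.out.println(line);
            }
        }
    }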
The six-month feature release cadence is meant to reduce the latency between major releases, Sharat Chander, director of Oracle's Java SE Product Management group, explained in a blog post. "This release model takes inspiration from the release models used by other platforms and by various operating-system distributions addressing the modern application development landscape," Chander wrote. "The pace of innovation is happening at an ever-increasing rate and this new release model will allow developers to leverage new features in production as soon as possible. Modern application development expects simple open licensing and a predictable time-based cadence, and the new release model delivers on both."
This release finally adds var to the Java language (though its use is limited to local variables with initializers and to variables declared in for-loops). It's being added "to improve the developer experience by reducing the ceremony associated with writing Java code, while maintaining Java's commitment to static type safety, by allowing developers to elide the often-unnecessary manifest declaration of local variable type."
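A short sketch of what the feature allows and disallows (the class name is mine):

    import java.util.ArrayList;

    public class VarDemo {
        public static void main(String[] args) {
            // The type is inferred from the initializer; the variable is still
            // statically typed. Nothing dynamic is going on.
            var names = new ArrayList<String>();   // inferred as ArrayList<String>
            names.add("Duke");

            // 'var' also works in for-loops:
            for (var i = 0; i < names.size(); i++) {   // i inferred as int
                var name = names.get(i);               // inferred as String
                System.out.println(name);
            }

            // Disallowed: no initializer to infer from, e.g.
            // var x;   // does not compile
        }
    }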
Among the part-time bot-hunters is French security researcher and freelance Android developer Baptiste Robert, who in February of this year noticed that Twitter accounts with profile photos of scantily clad women were liking his tweets or following him on Twitter. Aside from the sexually suggestive images, the bots had similarities. Not only did these Twitter accounts typically include profile photos of adult actresses, but they also had similar bios, followed similar accounts, liked more tweets than they retweeted, had fewer than 1,000 followers, and directed readers to click the link in their bios.
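Those traits read like a checklist, and a crude screening heuristic falls out of them almost directly. This toy Java sketch is hypothetical (the names and threshold are mine, not Robert's actual method):

    public class BotHeuristic {
        // Count how many of the reported bot traits an account matches.
        static int botScore(int followers, int likes, int retweets, boolean linkInBio) {
            int score = 0;
            if (followers < 1_000) score++;   // fewer than 1,000 followers
            if (likes > retweets) score++;    // likes more tweets than it retweets
            if (linkInBio) score++;           // directs readers to a link in its bio
            return score;
        }

        public static void main(String[] args) {
            // An account matching all three reported traits scores 3.
            int score = botScore(412, 5_800, 230, true);
            System.out.println(score >= 2 ? "flag for review" : "looks ordinary");
        }
    }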
"It is probably the most competitive market in the last 20 years that I have been doing this," said Desikan Madhavanur, chief development officer at Scottsdale, Arizona-based JDA Software, whose products help companies manage supply chains. "We have to compete better to get our fair share." What's happening in the market for software engineers may help illustrate why one of the tightest American labor markets in decades isn't leading to broader wage gains. While technology firms are looking at compensation, they are also finding ways to create the supply of workers themselves, which helps hold costs down.
La Forge fed Commit Assistant roughly ten years' worth of code from across Ubisoft's software library, allowing it to learn where mistakes have historically been made, reference any corrections that were applied, and predict when a coder may be about to write a similar bug. "It's all about comparing the lines of code we've created in the past, the bugs that were created in them, and the bugs that were corrected, and finding a way to make links [between them] to provide us with a super-AI for programmers," explains La Forge's Yves Jacquier.
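Ubisoft hasn't published Commit Assistant's internals, so the following is only a toy illustration of the idea Jacquier describes, with an exact-match lookup standing in for the learned model: index past buggy lines alongside their fixes, then flag a new line that matches a known-buggy pattern.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Optional;

    public class CommitAssistantToy {
        // Maps a normalized buggy line to the fix that was applied to it.
        private final Map<String, String> knownFixes = new HashMap<>();

        // Normalize whitespace so superficially different lines can match.
        private static String normalize(String line) {
            return line.trim().replaceAll("\\s+", " ");
        }

        void learn(String buggyLine, String fixedLine) {
            knownFixes.put(normalize(buggyLine), fixedLine);
        }

        Optional<String> suggestFix(String newLine) {
            return Optional.ofNullable(knownFixes.get(normalize(newLine)));
        }

        public static void main(String[] args) {
            CommitAssistantToy assistant = new CommitAssistantToy();
            // A classic bug: a stray semicolon turns the 'if' into a no-op.
            assistant.learn("if (items.size() > 0);", "if (items.size() > 0)");
            assistant.suggestFix("if  (items.size() > 0);")
                     .ifPresent(fix -> System.out.println("Possible bug; past fix: " + fix));
        }
    }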