
Government Cybersecurity Agencies Unite to Urge Secure Software Design Practices (cisa.gov)

Several government cybersecurity agencies have united to urge secure-by-design and secure-by-default software. Releasing "joint guidance" for software manufacturers were two U.S. security agencies — the FBI and the NSA — joined by the U.S. Cybersecurity and Infrastructure Security Agency and the cybersecurity authorities of Australia, Canada, the United Kingdom, Germany, the Netherlands, and New Zealand. "To create a future where technology and associated products are safe for customers," they wrote in a joint statement, "the authoring agencies urge manufacturers to revamp their design and development programs to permit only secure-by-design and -default products to be shipped to customers."

The Washington Post reports: Software manufacturers should put an end to default passwords, write in safer programming languages and establish vulnerability disclosure programs for reporting flaws, a collection of U.S. and international government agencies said in new guidelines Thursday. [The guidelines also urge rigorous code reviews.]
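The "end to default passwords" recommendation can be illustrated with a short sketch. This is not from the guidance itself; the function name and parameters are hypothetical, but the secure-by-default idea is that each device ships with a unique random credential instead of a shared hardcoded one:

```python
# Illustrative sketch only (not from the joint guidance): instead of shipping
# every unit with the same "admin"/"admin" credential, a manufacturer could
# generate a unique random password per device at provisioning time.
import secrets
import string

def provision_device_password(length: int = 16) -> str:
    """Return a cryptographically random per-device password."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

first = provision_device_password()
second = provision_device_password()
```

The `secrets` module (rather than `random`) is the standard-library choice for security-sensitive randomness.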

The "principles and approaches" document, which isn't mandatory but lays out the agencies' views on securing software, is the first major step in the Biden administration's push to make software products secure as part of the design process, and to make their default settings secure as well. It's part of a potentially contentious multiyear effort that aims to shift the way software makers secure their products. It was a key feature of the administration's national cybersecurity strategy, released last month, which emphasized shifting the burden of security from consumers — who have to manage frequent software updates — to the companies that make often insecure products... The administration has also raised the prospect of legislation on secure-by-design and secure-by-default, but officials have said it could be years away....

The [international affairs think tank] Atlantic Council's Cyber Statecraft Initiative has praised the Biden administration's desire to address economic incentives for insecurity. Right now, the costs of cyberattacks fall on users more than they do tech providers, according to many policymakers. "They're on a righteous mission," Trey Herr, director of the Atlantic Council initiative, told me. If today's guidelines are the beginning of the discussion on secure-by-design and secure-by-default, Herr said, "this is a really strong start, and an important one."

"It really takes aim at security features as a profit center," which for some companies has led to a lot of financial growth, Herr said. "I do think that's going to rub people the wrong way and quick, but that's good. That's a good fight."

In the statement, CISA's director says consumers also have a role to play in this transition: "As software now powers the critical systems and services we collectively rely upon every day, consumers must demand that manufacturers prioritize product safety above all else."

Among other things, the new guidelines say that manufacturers "are encouraged to make hard tradeoffs and investments, including those that will be 'invisible' to the customers, such as migrating to programming languages that eliminate widespread vulnerabilities."
  • Does not sound like a pipe dream but apparently it is. Luckily, we are the government and are here to help.
    • by Anonymous Coward

      write in safer programming languages

      Don't blame the tool, blame the person using it. No programming language is "safe" if the person using it is incompetent.

    • 1. Improve the network design. Each application shall have its own compartment.
      2. Improve the overall software architecture. Never expose the database engine; always use a business layer fronting the database that offers only the business operations you permit, with authentication and authorization based on user and client app.
      3. Use programming languages with strong static typing.
      4. Use compiled languages, and never deploy from testing to production until you can do a full build with no warnings on strictest
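The business-layer pattern in point 2 above can be sketched in a few lines. This is a minimal, hypothetical example (all class and table names are illustrative): callers never see the database engine, only the operations the service chooses to offer, and user input reaches SQL exclusively through parameterized placeholders.

```python
# Minimal sketch of a business layer fronting a database. The database
# connection is private; only the permitted operations are exposed, and
# parameterized queries keep user input from ever becoming SQL text.
import sqlite3

class AccountService:
    """Hypothetical business layer: these are the only operations offered."""

    def __init__(self, conn: sqlite3.Connection):
        self._conn = conn
        self._conn.execute(
            "CREATE TABLE IF NOT EXISTS accounts (name TEXT PRIMARY KEY, balance INTEGER)"
        )

    def open_account(self, name: str) -> None:
        # The "?" placeholder means the name is treated as data, not SQL.
        self._conn.execute(
            "INSERT INTO accounts (name, balance) VALUES (?, 0)", (name,)
        )

    def balance(self, name: str) -> int:
        row = self._conn.execute(
            "SELECT balance FROM accounts WHERE name = ?", (name,)
        ).fetchone()
        if row is None:
            raise KeyError(name)
        return row[0]

svc = AccountService(sqlite3.connect(":memory:"))
# A classic injection payload is stored harmlessly as a literal account name:
svc.open_account("alice'; DROP TABLE accounts;--")
```

In a real deployment the service would also enforce authentication and authorization per user and client app, as the comment suggests; that is omitted here for brevity.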

  • They've written so many guidelines, and they've yet to be included in the list of memory safe languages; it's just not fair.

    Time to write a few more I guess.

    • by The Evil Atheist ( 2484676 ) on Saturday April 15, 2023 @09:44PM (#63452800)
      Use Rust. That would surely solve the problem of manufacturers using default passwords, or storing passwords in plaintext, or unsanitized inputs from the web.

      Rust is MAGIC! You don't have to understand security anymore, if you know Rust. You only need three years of self-taught Rust experience to be an expert security researcher.
    • For bulk coding I'm using Java or C# to avoid the shortcomings of C/C++.

      Then I like to use C for precision programming since that code is easier to read than C++. Some C++ code seems like it's a contender for an obfuscated C contest.

  • by Ol Olsoc ( 1175323 ) on Saturday April 15, 2023 @05:48PM (#63452406)
    "It really takes aim at security features as a profit center," which for some companies has led to a lot of financial growth, Herr said. "I do think that's going to rub people the wrong way and quick, but that's good. That's a good fight."

    The cost center idiocy is one thing that really needs to change. So there is that.

    And then, there is this "cloud" thingy - remember when the cloud was going to be the do all of computer security?

    Today, companies are giving every bit of data away to the bad guys, while whining about Grandma and her "Password1"

    If you are online, you are pwnd. It's an embarrassment of riches for the bad guys. So many credit card numbers and passwords that it has become a sort of weird back-assward security by obscurity. With a few billion cards and PWs to choose from, it's just become random.

    "Goddamned Grandma anyway!" - businesses probably.

  • The "Atlantic Council's Cyber Statecraft Initiative" is full of pathetic tokens who would not know a buffer overflow if it pissed on them.

    If they want to be taken seriously they need to stop their discriminatory diversity-balanced shit and start again with real programmers, who have done real security work of note, and are not just greasy-pole climbers who have the right genitalia, skintone and (claimed) ethnic background.
  • > consumers must demand that manufacturers prioritize product safety above all else.

    Oh, is that all?

    Did somebody get paid to write this?

  • Somebody needs to send that report to everybody on the C standards committee. The latest draft is allegedly an absolute shitshow that does nothing to address memory safety and actually breaks the standard library in a way that will cause more issues. They're going backwards in terms of security. C desperately needs a way to securely encapsulate NULL and unbounded pointer dereferencing, or Rust is going to slowly but surely drink its milkshake.

    Their recommended list of memory safe languages? C-sharp, Rust

    • by gweihir ( 88907 )

      Memory safety is a side issue. Software written in memory safe languages does not have fewer security issues. They are just harder to find.

      • Google [grammatech.com] and Microsoft [zdnet.com] disagree. It's interesting that two companies with presumably very different code converged on similar numbers--about 70%. Yes, the articles are a few years old but I doubt much has changed.

        • Google and Microsoft disagree. It's interesting that two companies with presumably very different code converged on similar numbers--about 70%. Yes, the articles are a few years old but I doubt much has changed.

          From all the CISA email blasts, the vast majority of vulnerabilities take the form of various injection attacks (SQLi, XSS, etc.)

          If you check out the CVE databases by type..
          https://www.cvedetails.com/vul... [cvedetails.com]

          Nothing even remotely approaching 70% are memory related.

          Over 90% of exploits compromise people, not systems. Of the at-best 10% of security incidents that are related to software defects, even if the 70% figure were representative of the industry (which it most certainly is not), the effect of doing away with
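The injection classes this comment mentions (SQLi, XSS) share one fix: treat untrusted input as data, not as code. As an illustrative example, the standard-library `html.escape` neutralizes the classic XSS payload before it reaches HTML output:

```python
# Illustrative only: escaping untrusted input so it cannot execute as markup.
import html

untrusted = '<script>alert("xss")</script>'
safe = html.escape(untrusted)  # escapes <, >, & and, by default, quotes
```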

          • Over 90% of exploits compromise people not systems

            On that point, we're in agreement; but you can't fix people as easily as you fix software, so you tend to go for what actually seems possible.

            We *could* try to educate people about secure practices, based on our highly successful endeavor of wiping out HIV which can only be contracted through the exchange of....oh... wait. Nevermind.

          • by gweihir ( 88907 )

            Indeed. It should be added that a lot of the 90% compromising people are still software design problems, e.g. making it far too easy to execute malware that came with an email. And all these have absolutely nothing to do with memory safety. The whole idea that "fixing" memory safety will make a big difference is bogus. What will make a big difference is better coders, regardless of language used.

            The only reason why memory safety is to often wrongly classified as a big problem is because a large number of pr

        • by gweihir ( 88907 )

          Google and Microsoft are lying. By misdirection, but it's still lies. Counting metrics are meaningless in this space.

      • Those harder to find issues are there regardless of language and don't just appear because you use a memory safe language.

        Many of those issues are architectural issues.

        • by gweihir ( 88907 )

          I would argue that memory unsafe designs have fewer of them, because developer quality is higher.

    • You're kind of asking for the impossible. Other recent issues aside, you can't just "fix" C without completely breaking backwards compatibility. C++ has made a lot of forward progress, but even then, it's what I tend to call "opt-in security", because they can't remove the older unsafe methods, like manual memory management, or other legacy stuff inherited from C (whose backward compatibility is still an advantage sometimes), etc.

      Also, not every application requires top-tier performance. Rust is known to

    • Yeah, we've all read that FUD piece.

      This is what someone who actually understands what they're talking about says about that FUD piece: https://davmac.wordpress.com/2... [wordpress.com]
  • by sudonim2 ( 2073156 ) on Saturday April 15, 2023 @06:21PM (#63452466)

    If you want good, secure code, you need to either extend the time between product cycles to give developers time to properly code software instead of this constant crunch bullshit, or pay a lot more for both more coders and coders with significantly more training and expertise that you pay for them to develop in-house. There's no other way around this. Excuses like AI or Agile are just overly complex kludges that will always fail to produce results. Until the powers that be acknowledge, or are forced to adhere to, the fact that quality software requires both time and money, we will be burdened with fundamentally unfit code.

    • Excuses like AI or Agile....

      While LLMs are vastly overhyped, they could be useful tools for drudge work like writing comprehensive unit tests. But only once their yawning chasm of unreliability is fixed.

      Agile is also a tool that, when used properly, can aid in making secure software. As you mentioned, though, it still requires competent developers, a lot of time, and a lot of money. LLMs can complement developers in quickly doing the drudgework, much like IDEs and other code generators helped create boilerplate code that saved deve

    • by gweihir ( 88907 )

      Indeed. There are no shortcuts. Those that forever keep looking for them are part of the problem.

  • This is so ridiculous: The very same agencies that (a) pay tax-payer money to shady businesses to buy "zero-day-exploits" and (b) prevent fixing security holes by keeping the exploits to themselves are asking for "secure software design principles". As credible as if the Mafia asked for fair business practices.
  • The US intelligence organizations couldn't keep a document detailing Ukraine/Russia plans away from a junior-grade, non-active-military, nazi-sympathizer twit in the Midwest US, and it took them a week to find out who (on a public Discord server) posted it. They spend a lot of time pontificating on "you should do this" or "you should do that," and then talk of "going dark" and the removal of so-called "warrant-free" encryption.

    It's hard to take someone seriously who says "Lock your valuables in the safe before you leave

  • Does it mean that software will finally have end-to-end encryption and no mandatory backdoor?

    Does it mean that software will be as secure as the management of documents performed by the DoD?

    Did they actually perform field research on the subject? I mean, they talk of security by design as if they had discovered America, when in fact many serious companies perform security by design in their daily tasks.

    There's shitloads of know-how, software patterns, and methodologies already available to do soft
  • by HoleShot ( 1884318 ) on Saturday April 15, 2023 @07:47PM (#63452634)

    It's always about the money. Cyber security is weighed against profits. Profit always has priority. Make boards of directors and senior management responsible for failing to make security a priority, including clawing back all the perks, bonuses, and golden parachutes to pay for the losses caused by the breach. Then you will see more security, because it increases the likelihood of their keeping all the money they are paid.

  • This does nothing to force companies to change. Furthermore, it does nothing to ensure that government agencies will only pursue using more secure software in the future.

    THIS DOES NOTHING.

  • With all due respect to CISA, this document isn't going to fix the ongoing cybersecurity crisis. Urging companies to adopt more secure practices without any financial (dis)incentives is unlikely to make more than a dent. I wish your species well, as you struggle with basic privacy in public and how that data is brokered and monetized without informed consent from the civilian targets. Cybersecurity is only half the problem anyway.
  • Better late than never, I guess.
  • There is little incentive for most companies to write secure code: it costs more and takes longer to develop, and they have no ROI on it, nor a cost associated when things go bad.

    There are already tons of guidelines and best practices that are ignored. It still feels like the early 90's with hardcoded creds being discovered in software, and lack of buffer protections. The industry has a memory of 2 seconds.

    If it wouldn't turn into a pay to play sort of BS, I'd almost support that there be a certificatio

    • by Wyzard ( 110714 )

      If it wouldn't turn into a pay to play sort of BS, I'd almost support that there be a certification process for software where before it's released, an outside 3rd party, with no financial gain (not sure how to sort out the funding for it to avoid conflicts and still provide trust worthy results) would test the software and it would be included in the cert that software is signed with so it's easy to tell that it's been tested- to what level, and what the certification would be.

      We already have this for oth

  • If agencies like the FBI, NSA, GCHQ, ASIO, AFP, etc. are serious about wanting "more secure computing," they could stop pushing for (or making use of) laws that intentionally make things less secure, and they could stop making use of security flaws or weaknesses in software (either directly, by hoarding flaws themselves, or indirectly, by buying products from companies who themselves hoard flaws and use them to build products).
