
US Sues Georgia Tech Over Alleged Cybersecurity Failings As a Pentagon Contractor (theregister.com)

The Register's Connor Jones reports: The U.S. is suing one of its leading research universities over a litany of alleged failures to meet cybersecurity standards set by the Department of Defense (DoD) for contract awardees. Georgia Institute of Technology (GIT), commonly referred to as Georgia Tech, and its contracting entity, Georgia Tech Research Corporation (GTRC), are being investigated following whistleblower reports from insiders Christopher Craig and Kyle Koza about alleged (PDF) failures to protect controlled unclassified information (CUI). The series of allegations dates back to 2019 and continued for years afterward, although Koza was said to have identified the issues as early as 2018.

Among the allegations is the suggestion that between May 2019 and February 2020, Georgia Tech's Astrolavos Lab -- ironically a group that focuses on cybersecurity issues affecting national security -- failed to develop and implement a cybersecurity plan that complied with DoD standards (NIST 800-171). When the plan was implemented in February 2020, the lawsuit alleges that it wasn't properly scoped -- not all the necessary endpoints were included -- and that for years afterward, Georgia Tech failed to maintain that plan in line with regulations. Additionally, the Astrolavos Lab was accused of failing to implement anti-malware solutions across devices and the lab's network. The lawsuit alleges that the university approved the lab's refusal to deploy the anti-malware software "to satisfy the demands of the professor that headed the lab," the DoJ said. This is claimed to have occurred between May 2019 and December 2021. Such a refusal violates both federal requirements and Georgia Tech's own policies, but it allegedly happened anyway.

The university and the GTRC also, it is claimed, submitted a false cybersecurity assessment score in December 2020 -- a requirement for all DoD contractors to demonstrate they're meeting compliance standards. The two organizations are accused of issuing themselves a score of 98, which was later deemed fraudulent on several grounds. In short, the US alleges the assessment was carried out on a "fictitious" environment, so the score did not apply to any system actually covered by the DoD contract. The claims are being made under the False Claims Act (FCA), invoked here by the Civil Cyber-Fraud Initiative (CCFI), which was introduced in 2021 to punish entities that knowingly risk the safety of United States IT systems. It's the first CCFI case to reach litigation; all previous cases brought under the initiative were settled before that stage.
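
For context on that score of 98: under the DoD's assessment methodology for NIST SP 800-171, a contractor starts from a maximum of 110 (one point per control) and subtracts a weighted deduction of 1, 3, or 5 points for each unimplemented control, so even a high score implies concrete gaps. A minimal sketch of that arithmetic in Python -- the findings and weights below are illustrative, not details from the case:

    # Sketch of the DoD NIST SP 800-171 self-assessment arithmetic.
    # Start at 110 (one point per control) and subtract a weight for
    # each unimplemented control; the findings here are hypothetical.
    MAX_SCORE = 110

    unimplemented = {
        "3.14.2 (malicious code protection)": 5,  # hypothetical finding
        "3.12.4 (system security plan)": 1,       # hypothetical finding
    }

    score = MAX_SCORE - sum(unimplemented.values())
    print(f"Self-assessment score: {score}")  # 104 in this made-up example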

  • by Chris Mattern ( 191822 ) on Friday August 23, 2024 @07:07PM (#64730562)

    ...a DoD spokesman said, "It was a rambling wreck."

  • I've been involved with many things over the years, including oil platforms and DoD-sponsored nuclear fusion projects. I have never ever seen any large institution completely implement NIST 800-171. The guidelines themselves use tons of weasel words and outs like "adequate", "appropriate", "periodic", and "as needed" but then don't define what those mean; they say things about encryption but don't state any minimum requirements. They also allow for some modifications to the extent of the controls.

    • Sounds like what they really cared about in this case was detailed security auditing and not just antivirus software

    • That goes for all NIST guidelines: few of them have definitive specifications on what is required when it comes to cybersecurity, and there is a valid reason for that. It's also why you simply go with industry best practices, which covers your ass in 99% of cases.

      But in this case, the simple, fundamental requirement of having basic malware detection/protection on endpoints wasn't even met... this is the simplest and easiest one to have. And they failed to do it... then they performed an audit and lied about it.

      • The problem with EDR is that no product is universal, and NIST allows you to tailor things so that an exception can be had with mitigating processes. That's why the article said the university had approved the exception; typically that process is sufficient to still be compliant. It is hard to find a single EDR product that works on everything from Windows XP to BSD and Solaris, for example, all of which are still common in academia, especially in this type of physics research.
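
        To make the coverage problem concrete, a compliance sweep over a mixed inventory might look like the following sketch (hostnames, OS names, and approved-agent lists are all hypothetical; a real check would query an actual asset inventory):

          # Flag inventory hosts lacking an approved anti-malware/EDR agent.
          APPROVED_AGENTS = {
              "windows": {"Microsoft Defender", "CrowdStrike Falcon"},
              "linux": {"CrowdStrike Falcon", "ClamAV"},
              "solaris": set(),  # nothing approved: needs a documented mitigation
          }

          inventory = [
              {"host": "lab-win-01", "os": "windows", "agent": "Microsoft Defender"},
              {"host": "lab-nix-02", "os": "linux", "agent": None},
              {"host": "lab-sol-03", "os": "solaris", "agent": None},
          ]

          for rec in inventory:
              approved = APPROVED_AGENTS.get(rec["os"], set())
              if rec["agent"] in approved:
                  continue
              if approved:
                  print(f"{rec['host']}: NON-COMPLIANT, no approved agent installed")
              else:
                  print(f"{rec['host']}: no approved agent exists for {rec['os']}; "
                        f"document a mitigating process instead")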

        • This wasn't a technical issue (be it AV/EDR/XDR/XXXXXX) where they weren't able to put one in and instead put in mitigating solutions/processes; the professor didn't want it installed. And the university said "sure, you're special and can dictate counter to the contractual obligations." The university decided to override the contractual obligations they signed with the DoD. (This was hubris.)

          Even if it was a technical issue, where a singular solution doesn't work on ALL environments, you get one that works where it can and apply mitigating processes to the rest.

          • On the topic of the culture where a professor didn't want antivirus in a cybersecurity lab, calling it a "no-go": my initial reaction is that the antivirus might remove or corrupt some samples being studied. You could run a sample through detection on a different machine, or use VirusTotal to determine whether it is a known strain of virus.

            I likened the concept of installing antivirus on machines in a lab environment to sanitizing samples in a petri dish before putting them under a microscope.
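
            The "different machine" route can even be scripted: hash the sample and ask VirusTotal whether it is already known, without ever running AV on the lab box. A sketch against VirusTotal's v3 files endpoint (the API key is a placeholder):

              import hashlib
              import requests

              VT_API_KEY = "YOUR-API-KEY"  # placeholder

              def lookup_sample(path):
                  # Hash locally; only the digest leaves the lab machine.
                  with open(path, "rb") as f:
                      digest = hashlib.sha256(f.read()).hexdigest()
                  resp = requests.get(
                      f"https://www.virustotal.com/api/v3/files/{digest}",
                      headers={"x-apikey": VT_API_KEY},
                      timeout=30,
                  )
                  if resp.status_code == 404:
                      return None  # unknown strain: worth studying further
                  resp.raise_for_status()
                  return resp.json()["data"]["attributes"]["last_analysis_stats"]
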
            • The technical challenge is the easiest to overcome in this story.

              You listed off a crap load and that's barely scratching the surface.

              The funny one is that a system with AV on it can still be used for security research - it just takes a bunch of effort to whitelist or exempt files/processes/locations accordingly (a sketch of that follows this comment) - so that's a terrible reason for the professor to say no.

              These guys were either incompetent, lazy, or cheap. Whatever the reason, glad they're being taken to court and held to account for their actions.
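
              As a sketch of that whitelisting effort on Windows: Microsoft Defender exposes the Add-MpPreference cmdlet for exclusions, so a sample directory can be carved out without disabling protection entirely. The paths here are hypothetical, and this requires admin rights:

                import subprocess

                # Exclude hypothetical research paths from Defender scanning
                # instead of turning the AV off. Add-MpPreference -ExclusionPath
                # is a real Defender cmdlet.
                for path in (r"C:\MalwareSamples", r"D:\ResearchVMs"):
                    subprocess.run(
                        ["powershell", "-NoProfile", "-Command",
                         f"Add-MpPreference -ExclusionPath '{path}'"],
                        check=True,
                    )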

              • How do you know what to whitelist if you've never encountered the malware you are researching before? If you know what to whitelist, then why do you need to study it further, doesn't that mean you've already documented the malware and can already remediate it properly?

                This is of course assuming that the cybersecurity research being conducted is on how malware embeds itself into the system, and thus the malware itself needs to be whitelisted. Efforts would need to be taken to avoid a "lab leak" situation.
    • by chill ( 34294 )

      They aren't weasel words; they're there to allow you to be flexible for your specific circumstances. And if you honestly think "it says stuff about encryption but then doesn't state any minimum requirements," you should read the discussion sections of each control. Quoting 800-171r2: "[SP 800-111] provides guidance on storage encryption technologies for end user devices." And SP 800-111 goes deep into the details. And as a federal contractor, the short answer [nist.gov] is "Thou shalt use FIPS 140-3 validated cryptographic modules."
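
      As a concrete example of what an approved algorithm looks like in code, here is a minimal sketch of storage encryption with AES-256-GCM via the Python cryptography package. Note that FIPS 140 validation attaches to the crypto module a deployment links against, not to application code like this:

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=256)  # AES-256
        aesgcm = AESGCM(key)

        nonce = os.urandom(12)  # 96-bit nonce; never reuse with the same key
        ct = aesgcm.encrypt(nonce, b"controlled unclassified information", None)
        assert aesgcm.decrypt(nonce, ct, None) == b"controlled unclassified information"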

    • by ejr ( 2998 )

      This, to an extreme. Having "served" at GT during the period in question, I agree. It's not /too/ feasible in an academic research environment. You can establish some perimeter, but it's extremely leaky even in corporate or lab life.

      One reason why DoD, etc. come to academia is their increased flexibility. Why something that isn't "fundamental research" (DARPA terminology, widely used) went to GTRC and not GTRI, with subcontracts for narrow, "safe" areas, is confusing. CIPHER at GTRI often was the point of contact for this kind of work.

      • Hmmm... it has just occurred to me that GTRC sounds like a functional test case for CMMC. If I understand you correctly, GTRI may already be familiar with the best practices, and would not be an adequate "field test" of CMMC.

        The U.S. Government's goal at this time may be to merge GTRC and GTRI. GTRI labs could potentially operate under CMMC Level 3 controls if not just CMMC Level 2, while GTRC labs could operate under CMMC Level 1.

        However, the question is how well you can secure the perimeter between the two.
        • by ejr ( 2998 )

          "Merging" isn't sensible. GTRI is the contracting arm. This never should have gone through GTRC (main academic campus) unless pieces are missing from this story. Something else is at play here.

          • One possibility is that GTRI reports to administration, and the administration is on the same unsecured network as the GTRC.

            Second possibility is that the government's new cybersecurity models are supposed to allow contracts to go to GTRC, IF they can achieve CMMC Level 1. Presumably, CMMC Level 1 would be managed by GTRI, for GTRC to meet the contract requirements. This might create a division in internal politics if GTRI is concerned about losing funding but still having to provide the same infrastructure.
            • by ejr ( 2998 )

              Oh, you're not wrong, but the same can be said for SEI, Lincoln Labs, and other academically related FFRDCs.

              Something still smells funny here.

    • by jhallum ( 31304 )

      This is why the new Cybersecurity Maturity Model Certification (CMMC) seems to have been developed and is slowly starting to roll out: to put some teeth behind this stuff and prevent the self-certification issues that have plagued a couple of schools now (Penn State is the other, I think).

      Of course, the new CMMC appears to be a boondoggle by the people who set it up, but why let a good crisis go to waste...I guess?

      • Starting to not be so new. I guess it is still rolling out, though.
        I was briefly brought on board a non-profit DoD contractor, in Georgia, no less. They had anxiety that their organization wouldn't fall under CMMC Level 1 and would need CMMC Level 2 certification. I was instructed to update internal I.T. Department policy to bring it up to CMMC 2.0 standards. All I had was the then-existing CMMC 1.x standard, which was a bunch of line items that said to refer to NIST 800-171. It may as well have been a copy and paste.
      • CMMC is just an arrangement of the NIST control requirements into 5 buckets to make it more obvious how an organization should proceed, with most organizations only really needing to reach Level 3. The requirements do not really change under CMMC; it is just a clearer framework for prioritizing and measuring your progress from CMMC-1 to CMMC-5.
        • That's right: Level 3 was the middle of the road at one point, when there were Levels 1-5. I just looked it up in this article, and the CMMC 2.0 model reduced it to Levels 1-3. It might be very alarming for newcomers to think they would need Level 3, interpreting that as Level 5 on the old CMMC.
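
          For reference, the commonly cited shape of CMMC 2.0 fits in a few lines; the practice counts are as commonly cited and worth verifying against current DoD guidance:

            CMMC_2_0 = {
                1: ("Foundational", "17 practices, FAR 52.204-21 basic safeguarding"),
                2: ("Advanced", "110 practices, aligned with NIST SP 800-171"),
                3: ("Expert", "110+ practices, adds a subset of NIST SP 800-172"),
            }
            for level, (name, basis) in CMMC_2_0.items():
                print(f"Level {level} ({name}): {basis}")
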
    • Yes, most orgs are compliant because, as you note, it is up to each org to define how it will comply with the 110 NIST 800-171 security controls. It is the job of the security team to define the System Security Plan and, most importantly, the artifacts that will be generated in the process of complying, which can then be audited. There does wind up being a whole lot of stupid in many of these plans, but the pain is largely self-inflicted. A good security org knows how to write a plan that will make the government happy.
  • I looked at the NIST SP 800-171 guidelines and they don't list specific antivirus software, but it looks like just having Windows Defender "On" would be enough?

    Did they have that off? Wondering if GIT was really that nonstandard or if someone just has it out for them.
    • by chill ( 34294 )

      Yes, actually. They just want you to use SOMETHING and keep it up to date. 800-171 is really easy to comply with. You have to really put some effort into fucking up that bad to actually get sued over non-compliance.

    • Windows Defender is only free for 5 computers, or 25 at the very most. After that you have to license antivirus software from a vendor. If they had Defender "On", that would draw attention and lead to an audit by Microsoft. Georgia Tech is one of the largest universities in the state of Georgia, and most assuredly exceeds the number of free Defender clients.

      A "cybersecurity lab" may be a low priority for limited free Defender licenses, if the lab is researching viruses.

    • I don't think Windows Defender runs on Linux, BSD and Windows XP. Moreover, I don't think Windows Defender counts as EDR; one of the key things we need in these spaces is a central reporting structure. It isn't sufficient just to have something like Avast installed; at time of inspection you have to have a central report of all current solutions on each machine, their current status, and any detections.
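
      The central-reporting piece is simple in principle: every endpoint periodically posts its agent status to a collector, so an inspection can pull coverage from one place. A sketch of the idea; the collector URL, agent name, and payload shape are made up:

        import datetime
        import json
        import platform
        import urllib.request

        REPORT_URL = "https://edr-collector.example.edu/api/status"  # hypothetical

        payload = {
            "host": platform.node(),
            "os": platform.system(),
            "agent": "ExampleEDR",  # hypothetical agent name
            "signatures_updated": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "detections_24h": 0,
        }

        req = urllib.request.Request(
            REPORT_URL,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(resp.status)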

      • Windows Defender for Endpoint, the EDR solution, does have Linux and MacOS clients. Also, Avast stipulates that the free version cannot be used for commercial or educational purposes; Georgia Tech would be required to license and use the Avast EDR solution instead.

        Admittedly, there is a lot of reading to be done. You have to at least read the article first. ArsTechnica's reporting on the matter may be somewhat better than The Register's; I haven't decided yet. Then there are the other prerequisites.
        • by guruevi ( 827432 )

          OP was talking about Windows Defender being "on," which is the baked-in Windows tool for machines left unmanaged. There is no Windows Defender for Endpoint; it is Microsoft Defender for Endpoint, and a completely different solution than Windows Defender.

          My point about Avast is that I frequently see this with people who don't want to be "managed" in academia: "but I have Avast and MalwareBytes installed." That is probably the case here; the professor refused to install a centrally managed tool because he's supposedly the expert.

          • Well, I haven't ruled out the possibility that the "cybersecurity research" was on the propagation of malware into a system. Studying MalwareBytes' response mechanisms can be a first step towards developing a research mindset. There are times when MalwareBytes is inadequate and you have to go beyond it.

            I'm not a specialist in viruses and worms, which tend to be less visible than the type of malware that MalwareBytes typically remediates. I don't know what tools would need to be used. In the past, a Virtual Machine would typically be used for isolation.
  • by Midnight_Falcon ( 2432802 ) on Friday August 23, 2024 @07:47PM (#64730668)
    When it comes to security compliance, the system is generally contract-based and enforced on the honor system. Third parties don't get full access to systems, so they rely either on self-assessment questionnaires or on third-party auditors... who are usually paid by the entity being audited. Those auditors are motivated to find only minor or easily fixable issues (they have to say they found something!). If they're too tough, they lose the business to another firm. In the case of self-assessment here, a comp-sci professor who ardently believes he knows better can simply falsify responses and do as he pleases in reality.
  • I'm confused by the summary. Who was against the deployment of anti-malware software? Was it the university or the professor, and why? If the professor was in charge of the lab, is he being held responsible?
