Congress Bans Staff Use of Microsoft's AI Copilot (axios.com) 32
The U.S. House has set a strict ban on congressional staffers' use of Microsoft Copilot, the company's AI-based chatbot, Axios reported Friday. From the report: The House last June restricted staffers' use of ChatGPT, allowing limited use of the paid subscription version while banning the free version. The House's Chief Administrative Officer Catherine Szpindor, in guidance to congressional offices obtained by Axios, said Microsoft Copilot is "unauthorized for House use."
"The Microsoft Copilot application has been deemed by the Office of Cybersecurity to be a risk to users due to the threat of leaking House data to non-House approved cloud services," it said. The guidance added that Copilot "will be removed from and blocked on all House Windows devices."
Data accuracy issue (Score:3, Funny)
Re: (Score:2)
As does Congress...
Re: (Score:1)
That's correct. You can tell when they are lying, by watching their lips. If the lips are moving, they are lying.
Re: (Score:2)
Alternatively, if their fingers are moving across keyboards.
Vale the 'Bixby button' (Score:2)
That didn't work for Samsung's Android value-add.
Microsoft are now mandating that everyone go out and buy a new keyboard for an intrusive feature their government has already outlawed.
Re: (Score:2)
They'll pry my IBM Model M from my cold, dead hands...
Re: Vale the 'Bixby button' (Score:2)
If you're using an original IBM model M today then it's likely not going to be long until your hands are cold and dead.
Re: (Score:2)
Re: (Score:2)
As long as that key can be remapped, I'm actually in favor. I could use a new key for some games I play.
If it's just yet another key that I have to buy a more expensive keyboard for so I can at least switch it off in the keyboard firmware, I pass.
Of All Entities To Ban It (Score:2)
The ones that need it the most.
Re: (Score:2)
Pretty sure politicians of a certain party have been using it for a while, based on the ridiculous things their legislators tend to say these days. They even react like ChatGPT when the absurdity of some of their statements is pointed out to them, doubling down on their ridiculousness.
Re: (Score:3)
Well, of course. Do as I say, not as I do. You know, the Christian way. Or the Jewish way. Or the (insert religion here) way.
Re: (Score:2)
As I shit upon them? Can we call it a chili flyover?
You need to get some better glasses. You seem to miss the forest for the pile of feces.
Re: (Score:2)
Well, say what you want, but irreligious people are at least one delusion away from their religious counterparts.
Re: (Score:2)
Pretty sure politicians of a certain party have been using it for a while, based on the ridiculous things their legislators tend to say these days. They even react like ChatGPT when the absurdity of some of their statements is pointed out to them, doubling down on their ridiculousness.
You mean the politicians who think that putting on a dress magically turns you into a girl? I know, right??
Windows is a threat (Score:5, Interesting)
"The Microsoft Copilot application has been deemed by the Office of Cybersecurity to be a risk to users due to the threat of leaking House data to non-House approved cloud services," - Hell, that describes Windows perfectly as well. Just replace "Microsoft Copilot" with "Microsoft Windows".
The free version... (Score:1)
The free version leaks from cloud to cloud, but they allow limited use of the "paid" version, and we all know what a bang-up job Microsoft does with security, especially with cloud computing...
https://arstechnica.com/securi... [arstechnica.com]
Great idea for Congressional policy, trusting the paid version... paying for the same shitty security. Now the enshittification extends to Congress, following Microsoft's example.
JoshK.
Re: (Score:2)
The free version leaks from cloud to cloud, but they allow limited use of the "paid" version, and we all know what a bang-up job Microsoft does with security, especially with cloud computing...
The problem with leaking information has nothing to do with traditional security issues. The problem is that any query can become data that modifies or trains the model.
All sane companies already have policies about what can be included in an external gen AI query. For example, generic code questions are fine, but copying any portion of any proprietary code is forbidden.
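For illustration only, here is a minimal sketch of the kind of client-side check such a policy might be backed by before a prompt ever leaves the network; the marker patterns and hostname are made-up placeholders, not anyone's real policy:

    import re

    # Patterns that suggest proprietary material; purely illustrative placeholders.
    FORBIDDEN_MARKERS = [
        r"copyright \(c\) \d{4} examplecorp",   # hypothetical internal copyright header
        r"internal use only",
        r"examplecorp\.internal",               # hypothetical internal hostname
    ]

    def is_query_allowed(prompt: str) -> bool:
        """Return False if the prompt looks like it contains proprietary content."""
        text = prompt.lower()
        return not any(re.search(pattern, text) for pattern in FORBIDDEN_MARKERS)

    if __name__ == "__main__":
        print(is_query_allowed("How do I reverse a linked list in C?"))           # True: generic question
        print(is_query_allowed("Fix this: /* INTERNAL USE ONLY */ int secret;"))  # False: blocked by policy

A regex filter obviously can't judge intent; real policies lean on training and auditing as much as tooling, but even a crude gate like this catches the careless paste.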
Re: (Score:2)
And you think AI can tell the difference? This is a landmine waiting to explode as soon as an AI uses OSS code to answer a question for someone who then wants to use that code in his closed-source project.
Re: (Score:2)
The problem with leaking information has nothing to do with traditional security issues. The problem is that any query can become data that modifies or trains the model.
All sane companies already have policies about what can be included in an external gen AI query. For example, generic code questions are fine, but copying any portion of any proprietary code is forbidden.
Right. And to be clear, the danger scenario is that someone enters sensitive data in a prompt, uses the resulting output, the sensitive data becomes part of the model's fine-tuning, and a different user then gets that same sensitive data back in another result. "Sensitive" doesn't have to mean classified data. It could be something embarrassing, like a strategy email dissing an opponent.
Re: (Score:2)
The paid version of ChatGPT leaks like crazy. "Fine tuning" jobs straight up steal your training data.
I was recently trying it for a chatbot for a call center company's client. I was training a model, but only trained it with the prompt:
"You're a support operator. The client will ask you general customer support questions and you'll respond according to your training". I gave it the minimum number of examples it requires (10, I think), with questions such as "what's my account balance" and answers like "I'm u
I was thinking about something to come up with (Score:2)
I was trying to come up with something, but in the end I couldn't find anything. At last, something that makes sense, I guess. /s
local LLMs (Score:2)
For similar reasons (I work on projects with serious security demands) I've gone down the rabbit hole to get local LLMs working and I'm pretty happy now, but it was quite a journey.
We now have stuff like Ollama and LM Studio that can run models locally, open models with sufficiently large context windows, and things like privateGPT as glue to feed in your own documents. Or AnythingLLM if you want an all-in-one solution (though in my tests it didn't quite work as well).
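To give a flavour of that kind of local setup, a minimal sketch against Ollama's default local REST endpoint; it assumes "ollama serve" is already running with a model such as llama3 pulled, and it's illustrative rather than the exact stack described above:

    import json
    import urllib.request

    # Assumes `ollama serve` is running locally and the model has been pulled
    # (e.g. `ollama pull llama3`); nothing in this request leaves the machine.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def ask_local_llm(prompt: str, model: str = "llama3") -> str:
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
        req = urllib.request.Request(OLLAMA_URL, data=payload,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask_local_llm("Summarize our internal security policy in three bullet points."))
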
We're getting there. In a few y
Re: (Score:2)
Local LLMs only cover some use cases. Many of them, such as crafting laws or doing research on current events or data, wouldn't work well with a local LLM. The problem is that in these cases the source data is not local, even if the LLM is.
Re: (Score:2)
Yes, there are some use cases where you want the LLM to essentially be a search engine on steroids. In that case, you need one that's online and vacuums up the Internet every so often. Essentially Google 2.
But for a lot of use cases a model that is occasionally updated will do just fine.
Re: (Score:2)
With or without ChatGPT, Windows is a very leaky platform, as are Mac OS and every other OS that can run a web browser.
LLMs can be good (Score:1)
The major hurdle, at least as far as I can see, is that institutions have to make blanket statements to prevent people (those few? hope so) from getting lazy, especially under time co