Microsoft May Store Your Conversations With Bing If You're Not an Enterprise User (theregister.com)
An anonymous reader quotes a report from The Register: Microsoft prohibits users from reverse engineering or harvesting data from its AI software to train or improve other models, and will store inputs passed into its products as well as any output generated. The details emerged as companies face fresh challenges with the rise of generative AI. People want to know what corporations are doing with information provided by users. And users are likewise curious about what they can do with the content generated by AI. Microsoft addresses these issues in a new clause titled 'AI Services' in its terms of service.
The five new policies, which were introduced on July 30 and will come into effect on September 30, state that:
Reverse Engineering. You may not use the AI services to discover any underlying components of the models, algorithms, and systems. For example, you may not try to determine and remove the weights of models.
Extracting Data. Unless explicitly permitted, you may not use web scraping, web harvesting, or web data extraction methods to extract data from the AI services.
Limits on use of data from the AI Services. You may not use the AI services, or data from the AI services, to create, train, or improve (directly or indirectly) any other AI service.
Use of Your Content. As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service.
Third party claims. You are solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services). A spokesperson from Microsoft declined to comment on how long the company plans to store user inputs into its software. "We regularly update our terms of service to better reflect our products and services. Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers," the representative told us in a statement.
Microsoft has previously said, however, that it doesn't save conversations or use that data to train its AI models for its Bing Enterprise Chat mode. The policies are a little murkier for its Microsoft 365 Copilot: although it doesn't appear to use customer data or prompts for training, it does store information. "[Copilot] can generate responses anchored in the customer's business content, such as user documents, emails, calendar, chats, meetings, contacts, and other business data. Copilot combines this content with the user's working context, such as the meeting a user is in now, the email exchanges the user has had on a topic, or the chat conversations the user had last week. Copilot uses this combination of content and context to help deliver accurate, relevant, contextual responses," it said.
Re: Before you post… (Score:5, Insightful)
Indeed. This is a free service. Of course they are harvesting your data. Duh. That's how it works.
Don't like it? Then pay up, or don't use it.
Re: Before you post… (Score:4, Insightful)
I'd personally be suspicious that even if I were paying Microsoft, they'd still keep all my data. They just wouldn't tell me about it, and would otherwise hide the ways they use it from me.
Re: (Score:2)
The word "suspicion" is correctly applied only to cases where the evidence is absent or flimsy.
Well deserved (Score:4, Insightful)
At this point, anybody who actively chooses Microsoft products for anything at all, when any alternatives exist, deserves all they get.
Re: (Score:2)
Well, two-and-a-half decades ago they made "Microsoft 2.0a" mice, which happened to be among the best ball mice I've ever seen.
When it comes to software products, or any hardware from this millennium, though, you're 100% right.
That's OK (Score:3)
They are seriously silly about reverse engineering. (Score:2)
One of my first and last conversations with Bing was about how it worked, and the underlying *connectionist* philosophy of mind. It was actually pretty good, but when I returned to the conversation, they had freaked out, and all of a sudden Bing could not talk about how Bing worked. Microsoft's war with human awareness continues.
What should small businesses do? (Score:4, Informative)
Many individual and small-business consultants work with sensitive company data. If MS is training AIs on MS Office documents, that sensitive information could be revealed through queries from outside users.
It's easy to say "don't use Microsoft," but Office formats are used in a lot of businesses, and those documents could be securely transmitted to the consultant, only to be scanned by Microsoft's AI when they are opened.
It's not clear from the article whether existing documents are scanned, and very few users will have the time and expertise to read through the EULA in sufficient detail to understand under what conditions that happens.
This isn't the news, (Score:2)
It would be news if Microsoft wasn't storing your conversations.