
Microsoft 365 Copilot with Data Protection FAQ

These are frequently asked questions regarding Microsoft 365 Copilot with data protection as implemented on the Urbana campus of the University of Illinois.

How do I get started with Copilot?

Refer to this article to get started with Microsoft 365 Copilot with data protection. 

Who has access to Copilot?

Those with a Microsoft 365 A5 license have access to Microsoft 365 Copilot with data protection.

If you are unsure what type of Microsoft 365 license you currently have, refer to KB article 102999 for steps to find out whether you have an A5 license.

What privacy protections are in place when using Copilot?

We have privacy agreements in place with Microsoft to ensure that Microsoft does not use our data or prompts to train its models. Microsoft has also put safeguards in place to help protect against unintentional sharing of data with others in the organization (see https://learn.microsoft.com/en-us/microsoftsearch/semantic-index-for-copilot). These include controls that keep Microsoft 365 Copilot from indexing files shared with an organization-wide sharing link until you have opened those files.

Even with this safeguard in place, it is advised that you only choose “share with organization” for non-sensitive data that does not fall under FERPA, HIPAA, or other data regulations.

Be aware that the meeting transcriptions that Microsoft 365 Copilot and other AI tools use to create meeting minutes are documents and are therefore subject to FOIA. Consider this when deciding which meetings to transcribe, and ask all meeting participants for consent before proceeding.

Is it necessary to review the output of Copilot before publishing it?

In any use of generative AI, it is important to check and edit the outputs you receive.

These tools can write much like humans but are prone to producing overly generic or incorrect information. Human review is important for making sure the content created meets your standards for quality, style, and substance.

From Microsoft's Copilot website:

Copilot aims to base all its responses on reliable sources - but AI can make mistakes, and third-party content on the internet may not always be accurate or reliable. Copilot will sometimes misrepresent the information it finds, and you may see responses that sound convincing but are incomplete, inaccurate, or inappropriate. Use your own judgment and double check the facts before making decisions or taking action based on Copilot’s responses.



Keywords: Microsoft A5, Copilot, Bing Chat Enterprise, AI, ChatGPT
Doc ID: 137828
Owner: Thomas N.
Group: University of Illinois Technology Services
Created: 2024-06-11 13:48:35
Updated: 2024-07-22 15:09:11
Sites: University of Illinois Technology Services