
Q&A: Microsoft Copilot Data Security


For all of their purported benefits, there’s a lot we can still learn about generative AI solutions like Microsoft Copilot, particularly when it comes to data security and privacy.

Below, we’ve answered some of the questions we hear most often about Copilot, a tool our team continues to study to identify both opportunities and challenges.

Frequently Asked Questions about Microsoft Copilot

Q: Can Copilot for 365 put my sensitive data at risk?

A: Copilot interacts with sensitive data – so it can absolutely put that data at risk if proper guardrails aren’t in place first. At its core, Copilot operates on and accesses data based on the permissions established within your Microsoft 365/Azure environment. In many cases – and we’ve seen this firsthand – users within your organization can hold permissions they aren’t even aware of. In a circumstance like this, an employee could prompt Copilot to surface data they really shouldn’t have access to – a clear data privacy risk. Shared mailbox data and documents shared to group SharePoint sites are two good examples of risk areas that can inadvertently expose sensitive information if permissions and access aren’t deliberately limited.
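
To make that “invisible permissions” problem concrete, here’s a minimal Python sketch – our own illustration, not Copilot’s actual machinery – in which a user’s effective access is the union of everything inherited through group membership. All groups, users and file names below are hypothetical.

```python
# Conceptual model only: effective access = group-inherited grants,
# which is how "invisible" permissions creep in.

# Hypothetical group memberships and resource ACLs.
GROUP_MEMBERS = {
    "Finance-Team": {"alice", "bob"},
    "All-Staff": {"alice", "bob", "carol"},
}
RESOURCE_ACLS = {
    "Q4-payroll.xlsx": {"Finance-Team"},    # properly scoped group site
    "layoff-plan.docx": {"All-Staff"},      # overshared by mistake
    "employee-handbook.pdf": {"All-Staff"},
}

def effective_access(user: str) -> set:
    """Every resource the user can reach through any group membership."""
    groups = {g for g, members in GROUP_MEMBERS.items() if user in members}
    return {res for res, acl in RESOURCE_ACLS.items() if acl & groups}

# Carol never asked for the layoff plan, but an overshared ACL exposes it
# to her account, and therefore to anything she prompts Copilot to find.
print(sorted(effective_access("carol")))
# ['employee-handbook.pdf', 'layoff-plan.docx']
```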

Q: Does it matter which 365 license I use with Copilot?

A: Absolutely. While Copilot can be purchased with Microsoft 365 Business Standard licensing or higher, the security controls needed to effectively protect your data are only included with Microsoft 365 Business Premium, E3 and E5 licenses. Outside of those tiers, a concerning lack of permission controls can ultimately lead to increased security risk for your organization. At Omega Systems, we strongly recommend that companies looking to leverage Copilot do so with E3 or E5 licenses, which allow for the most stringent and flexible data governance capabilities.
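
If you want to see where your tenant stands, the short Python sketch below calls the Microsoft Graph /users and /users/{id}/licenseDetails endpoints to flag accounts without one of those license tiers. Treat it as a starting point: token acquisition is omitted, and the SKU part numbers we match on (SPB for Business Premium, SPE_E3, SPE_E5) are assumptions you should verify against your own tenant’s license details.

```python
# Sketch: flag users whose license tier lacks the security controls above.
# Requires an access token with User.Read.All; paging is omitted for brevity.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<bearer token acquired via MSAL or similar>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Assumed SKU part numbers: confirm these against your tenant.
PREMIUM_SKUS = {"SPB", "SPE_E3", "SPE_E5"}  # Business Premium / E3 / E5

def users_below_premium():
    users = requests.get(f"{GRAPH}/users", headers=HEADERS).json()["value"]
    for user in users:
        details = requests.get(
            f"{GRAPH}/users/{user['id']}/licenseDetails", headers=HEADERS
        ).json()["value"]
        skus = {d["skuPartNumber"] for d in details}
        if not skus & PREMIUM_SKUS:
            yield user["userPrincipalName"], sorted(skus)

for upn, skus in users_below_premium():
    print(f"REVIEW: {upn} holds {skus or 'no licenses'}")
```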

Q: How does Copilot work with sensitivity labels?

A: Copilot respects the sensitivity labels and data loss prevention (DLP) controls configured for your Microsoft 365 environment in Microsoft Purview. If access to the data is restricted for the user, it is restricted for Copilot as well. The key is to ensure sensitivity labels are rolled out and applied consistently across your organization – another important effort that needs to be addressed prior to Copilot adoption.
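
Conceptually, this behaves like permission-trimmed retrieval: content a user cannot open is excluded from what the assistant can draw on. The Python sketch below is our own simplified model of that idea – not Microsoft’s implementation – using hypothetical users, labels and documents.

```python
# Simplified model of permission-trimmed retrieval: if the user can't open
# a document, the assistant can't use it to answer their prompt.
from dataclasses import dataclass

@dataclass
class Document:
    name: str
    sensitivity: str      # e.g. "Public", "Highly Confidential"
    allowed_users: set

CORPUS = [
    Document("press-release.docx", "Public", {"alice", "bob"}),
    Document("merger-terms.docx", "Highly Confidential", {"alice"}),
]

def retrieve_for(user: str, query: str) -> list:
    """Return only matching documents this user could open themselves."""
    return [d for d in CORPUS
            if user in d.allowed_users and query.lower() in d.name.lower()]

# Bob's prompt can't surface the merger terms, because Bob can't open them.
print([d.name for d in retrieve_for("bob", "merger")])    # []
print([d.name for d in retrieve_for("alice", "merger")])  # ['merger-terms.docx']
```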

Q: When it comes to Copilot, what else should my organization be concerned about?

A: We continue to stress to customers that there are quite a few areas of concern posed by Copilot, and they extend well beyond just technical challenges. A well-aligned effort between business and IT leaders should center on some of these areas as part of Copilot readiness conversations:

  • Training: Effective use of Copilot requires users to be trained on its capabilities and proper prompt usage.
  • Business strategy: There should be clearly defined goals for how Copilot – and AI more broadly – will be used across your environment and what your business and employees hope to accomplish with the tool.
  • Policy & Procedure: AI-centered Acceptable Use Policies will be a necessity for companies rolling out Copilot across their organizations. A written policy should be developed and reviewed among leadership to define what employees can and cannot do with company-provided AI tools.
  • Ethics: Public discussions continue around the ethical implications of AI assistant tools and their potential to impact or displace human workers.
  • Cost: Depending on the size of your organization and the extent to which your employee base requires Copilot licenses, cost could be a barrier, especially for mid-sized and enterprise firms.
  • Growing pains: Given the constant evolution of Copilot features and benefits, it’s not wrong to ask yourself ‘how ready is it for primetime?’ Continued learning and slower adoption may be prudent for companies that don’t want to get too far ahead of the curve and find themselves dealing with growing pains 6, 12 or 24 months from now.

Q: How can companies start readying their data prior to Copilot adoption?

A: Getting your Microsoft 365 tenant ready for Copilot is something organizations need to prepare for strategically – ‘set it and forget it’ is not a viable approach to AI adoption. We might sound like a broken record, but a comprehensive data governance strategy remains the most critical measure of an organization’s readiness for Copilot. Your organization should perform a full data and security audit to ensure that data is secure and permissions are restricted to only the users who need access.
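
As one small, concrete piece of that audit, the Python sketch below walks a single SharePoint document library through the Microsoft Graph endpoints /drives/{drive-id}/root/children and /drives/{drive-id}/items/{item-id}/permissions and flags items carrying organization-wide or anonymous sharing links. The drive ID and token are placeholders, and paging, nested folders and error handling are left out for brevity.

```python
# Sketch: surface overshared files in one document library.
# Requires an access token with Sites.Read.All (or broader).
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<bearer token acquired via MSAL or similar>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
DRIVE_ID = "<id of the document library's drive>"

items = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/root/children", headers=HEADERS
).json()["value"]

for item in items:
    perms = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions",
        headers=HEADERS,
    ).json()["value"]
    for perm in perms:
        # Sharing-link permissions carry a 'link' facet whose scope is
        # 'anonymous', 'organization' or 'users'.
        scope = perm.get("link", {}).get("scope")
        if scope in {"anonymous", "organization"}:
            print(f"REVIEW: '{item['name']}' has a {scope}-scoped link")
```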

Omega Systems continues to keep a close eye on developments related to Copilot and is eager to advise and assist customers with their questions, concerns and readiness strategy. Contact your account manager or our Sales team to start a conversation about the future of Copilot for your company.
