For all of its purported benefits, there’s a lot we can still learn about generative AI solutions like Microsoft Copilot, particularly when it comes to data security and privacy.
Below, we’ve answered some of the most frequently asked questions about Copilot, a tool our team continues to study to identify both opportunities and challenges.
Q: Can Copilot put my organization’s sensitive data at risk?

A: Copilot interacts with sensitive data, so it can absolutely put that data at risk if proper guardrails aren’t in place first. At its core, Copilot accesses data based on the permissions established within your Microsoft 365/Azure environment. In many cases – and we’ve seen this firsthand – users in your organization may hold permissions they aren’t even aware of. In that scenario, an employee could prompt Copilot to surface data they should not have access to, creating a data privacy risk. Shared mailboxes and documents shared to group SharePoint sites are two common risk areas that can inadvertently expose sensitive information if permissions and access aren’t properly scoped.
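To make the permission-inheritance idea concrete, here is a deliberately simplified sketch – not Microsoft’s actual implementation, and all names are hypothetical – showing how a Copilot-style assistant only draws on content the prompting user can already read. If a user has been over-granted group membership, everything that group can see becomes fair game for their prompts:

```python
# Simplified model: Copilot's retrieval is scoped by the prompting
# user's existing permissions; it never grants new access.
# Document names, groups, and users are hypothetical examples.

DOCUMENTS = {
    "Q3-payroll.xlsx": {"allowed": {"hr-group"}},
    "press-release.docx": {"allowed": {"everyone"}},
}

USER_GROUPS = {
    "alice": {"hr-group", "everyone"},  # over-granted membership
    "bob": {"everyone"},
}

def copilot_visible_docs(user: str) -> list[str]:
    """Return the documents a Copilot prompt from `user` could draw on."""
    groups = USER_GROUPS.get(user, set())
    return [name for name, meta in DOCUMENTS.items()
            if meta["allowed"] & groups]

print(copilot_visible_docs("alice"))  # ['Q3-payroll.xlsx', 'press-release.docx']
print(copilot_visible_docs("bob"))    # ['press-release.docx']
```

The point of the sketch: the assistant itself isn’t the leak – the stale or overly broad permission grant is.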
Q: Does my Microsoft licensing tier affect how securely I can deploy Copilot?

A: Absolutely. While Copilot can be purchased with Microsoft 365 Business Standard licensing or greater, the security controls needed to effectively protect your data are only included in M365 Business Premium, E3, and E5 licenses. Outside of these tiers, there’s a concerning lack of permission controls that could ultimately lead to increased security risk for your organization. At Omega Systems, we strongly recommend that companies looking to leverage Copilot do so with E3 or E5 licenses, which allow for the most stringent and flexible data governance capabilities.
Q: Does Copilot respect our sensitivity labels and data loss prevention policies?

A: Copilot respects the sensitivity labels and data loss prevention controls configured in your Microsoft Azure environment. If access to the data is restricted for the user, it is restricted for Copilot as well. The key is to ensure sensitivity labels are rolled out and leveraged effectively across your organization – another important effort that needs to be addressed prior to Copilot adoption.
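The principle that “if access is restricted for the user, it is restricted for Copilot” can be sketched as a label-aware gate that withholds any content classified above the user’s clearance before it ever reaches the assistant. This is a conceptual illustration only – the label names and data shapes below are assumptions, not Microsoft’s implementation:

```python
# Toy sketch of label-aware gating: content tagged above the user's
# clearance is withheld from the assistant's context entirely.
# Label names follow a common Microsoft-style hierarchy but are
# hypothetical here.

SENSITIVITY_ORDER = ["Public", "General", "Confidential", "Highly Confidential"]

def rank(label: str) -> int:
    """Position of a label in the sensitivity hierarchy (higher = stricter)."""
    return SENSITIVITY_ORDER.index(label)

def filter_context(snippets: list[tuple[str, str]], user_clearance: str) -> list[str]:
    """Keep only (text, label) snippets the user is cleared to read."""
    return [text for text, label in snippets
            if rank(label) <= rank(user_clearance)]

context = [
    ("Quarterly revenue summary", "Confidential"),
    ("Public product FAQ", "Public"),
]
print(filter_context(context, "General"))  # ['Public product FAQ']
```

Notice that the gate is only as good as the labels themselves – unlabeled or mislabeled content sails straight through, which is why label rollout has to precede Copilot adoption.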
Q: What should organizations consider before rolling out Copilot?

A: We continue to stress to customers that Copilot raises quite a few areas of concern, and they extend well beyond technical challenges alone. A well-aligned effort between business and IT leaders should address these concerns – technical and organizational alike – as part of Copilot readiness conversations.
Q: How should we prepare our Microsoft 365 tenant for Copilot?

A: Getting your Microsoft 365 tenant ready for Copilot requires strategic preparation; ‘set it and forget it’ is not a viable approach to AI adoption. We may sound like a broken record, but a comprehensive data governance strategy remains the single most important measure of an organization’s readiness for Copilot. Your organization should perform a full data and security audit to ensure that data is secure and permissions are restricted to only the users who need access.
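One concrete step in such an audit is hunting for oversharing. The sketch below – a hypothetical illustration, not a real tenant-audit tool – flags items shared with an “everyone” audience or with more groups than a chosen threshold, the kind of grant that most often leads to Copilot over-exposure:

```python
# Hedged sketch of one audit step: flag items whose sharing scope is
# broader than a chosen threshold. Item names, group names, and the
# data shape are hypothetical examples for illustration.

def flag_overshared(items: dict[str, set[str]], max_groups: int = 1) -> list[str]:
    """Return item names shared with 'everyone' or with more than
    `max_groups` groups."""
    flagged = []
    for name, audience in items.items():
        if "everyone" in audience or len(audience) > max_groups:
            flagged.append(name)
    return flagged

shares = {
    "board-minutes.docx": {"executives"},
    "wiki-home.aspx": {"everyone"},
    "budget.xlsx": {"finance", "hr", "it"},
}
print(flag_overshared(shares))  # ['wiki-home.aspx', 'budget.xlsx']
```

In practice this review would run against your actual tenant’s sharing reports; the value of even a crude pass like this is that anything flagged before rollout is one less surprise a Copilot prompt can surface afterward.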
Omega Systems continues to keep a close eye on developments related to Copilot and is eager to advise and assist customers with their questions, concerns and readiness strategy. Contact your account manager or our Sales team to start a conversation about the future of Copilot for your company.