Microsoft 365 Copilot exposes an organisation’s existing security failures rather than introducing new ones. By surfacing everything a user can technically access through natural language queries, Copilot transforms years of accumulated SharePoint permission sprawl from a dormant housekeeping problem into an active data exposure risk across the organisation’s entire data estate.

The Permission Inheritance Problem

Copilot operates at the infrastructure layer of Microsoft 365, not on top of it. It retrieves content through Microsoft Graph, automatically indexing email, SharePoint, OneDrive, Teams, and calendar data. Its scope is the user’s full data estate, filtered only by what that user’s permissions technically allow.

Microsoft’s own documentation confirms it: Copilot surfaces only data to which individual users have at least view permissions. View permissions in most tenants, however, are far broader than anyone intended.

SharePoint’s hierarchical permission model cascades from site to library to folder to file by default. When that inheritance is broken through shared links, folder-level overrides, or legacy configurations, it creates what security professionals call permission islands: items with unique, often invisible permission sets that are nearly impossible to audit manually. A critical finding from Microsoft’s own Q&A forums in 2025 confirmed that parent-level sharing links retain access to subfolders even after inheritance is explicitly broken, and administrators would need to manually check every parent folder to trace the source of access.
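The mechanics above can be sketched in a few lines. This is an illustrative model, not Graph API code: the field names (`inherits`, `own_roles`, `own_links`) are hypothetical, and the key behaviour it encodes is the finding described above, that sharing links created on a parent folder keep granting access after a subfolder breaks inheritance, while role assignments do not.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """Minimal, hypothetical model of a SharePoint item; not Graph API fields."""
    path: str
    inherits: bool = True               # False = unique permissions (a "permission island")
    own_roles: frozenset = frozenset()  # role assignments made directly on this item
    own_links: frozenset = frozenset()  # sharing links created directly on this item
    children: list = field(default_factory=list)

def audit(item, parent_roles=frozenset(), parent_links=frozenset()):
    """Return {path: (roles, links)} for the whole tree.

    Role assignments respect broken inheritance; parent-level sharing links do
    not, which is why access to an island cannot be traced from the item itself.
    """
    roles = item.own_roles if not item.inherits else parent_roles | item.own_roles
    links = parent_links | item.own_links  # parent links still apply
    out = {item.path: (roles, links)}
    for child in item.children:
        out.update(audit(child, roles, links))
    return out

# An HR folder that broke inheritance still carries the site's org-wide link:
site = Item("/site",
            own_roles=frozenset({"Everyone except external users"}),
            own_links=frozenset({"org-wide-link"}),
            children=[Item("/site/hr", inherits=False,
                           own_roles=frozenset({"HR team"}))])
report = audit(site)
```

In this toy tenant, `/site/hr` no longer inherits the “Everyone except external users” role assignment, yet the org-wide sharing link from `/site` still reaches it.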

Varonis, analysing 717 organisations, found the average company carries over 27,000 active sharing links in Microsoft 365, with roughly half open to every employee. Ten per cent of cloud data is exposed to every employee in the organisation, and the average SaaS data-breach risk exposure sits at over $28 million. Before Copilot, those links were low-risk because an employee would have needed to know exactly where to look. Copilot removes that friction entirely.

The “Everyone except external users” group, a legacy Microsoft default that includes every employee automatically, is frequently assigned to SharePoint sites containing sensitive content. Default sharing link types in most tenants are set to “People in your organisation,” meaning a single click on Share grants company-wide access. Sharing links from old collaborations remain active indefinitely unless manually revoked.
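Triaging a link inventory along these lines is straightforward once the entries are in hand. The sketch below is modelled loosely on the shape of Microsoft Graph permission resources; the `link.scope` vocabulary (`"anonymous"`, `"organization"`, `"users"`) is real Graph terminology, but treat the overall structure as an assumption rather than a complete audit.

```python
def risky_links(permissions):
    """Partition sharing-link permission entries by audience.

    `permissions` is a list of dicts loosely shaped like Microsoft Graph
    permission resources. Entries without a `link` facet are direct role
    assignments and are skipped here.
    """
    org_wide, anonymous, scoped = [], [], []
    for perm in permissions:
        link = perm.get("link")
        if not link:
            continue  # direct role assignment, not a sharing link
        scope = link.get("scope")
        if scope == "organization":
            org_wide.append(perm)   # "People in your organisation" links
        elif scope == "anonymous":
            anonymous.append(perm)  # anyone-with-the-link
        else:
            scoped.append(perm)     # specific named people
    return org_wide, anonymous, scoped
```

The `org_wide` bucket is the one Copilot turns from dormant to dangerous: every entry in it is content any employee can now surface with a prompt.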

A Categorically Different Risk Than Standalone AI Tools

The contrast with standalone AI tools like ChatGPT or Claude is architectural, not incremental.

Standalone tools operate at the application layer as isolated platforms that only process data a user explicitly provides, with no access to the organisation’s email, documents, or internal systems. The shadow AI risk they create is real, but its scope is limited to what someone chooses to share in a given session.

Copilot’s blast radius extends across the user’s entire M365 data estate. A single prompt can correlate and synthesise information from emails, documents, meeting transcripts, calendar entries, and Teams chats simultaneously, because Copilot’s grounding process automatically retrieves relevant context from Microsoft Graph before generating a response.

Concentric AI, analysing over 550 million records, found that 16 per cent of business-critical data is overshared in the average M365 tenant. That represents roughly 802,000 files per organisation at elevated risk. As Microsoft’s own product marketing director acknowledged publicly at Microsoft Ignite 2024, AI is very good at finding information, and it can surface more information than expected.

This does not mean Copilot is the more dangerous deployment choice by default. Standalone tools create a different exposure: employees sending proprietary data to third-party infrastructure outside organisational visibility. That is the shadow AI data leak problem that organisations are still grappling with. The point is that Copilot and standalone tools demand different governance models. Most organisations have built neither.

The Incidents That Made the Problem Concrete

In November 2024, Business Insider reported that multiple Microsoft customers discovered Copilot enabled employees to access executive inboxes and sensitive HR documents. A Microsoft employee familiar with the complaints described the issue: when an employee logs into an account and starts Copilot, they can see everything, including the CEO’s emails. The root cause was a permissions misconfiguration, not a Copilot defect. IT departments had configured lax permissions, selecting broad access options for HR systems rather than specifying users, a configuration that caused no visible problems until Copilot gave every employee a natural language query interface.

In February 2026, Microsoft confirmed a more serious incident. A bug tracked as CW1226324, affecting systems since at least January 2026, caused Copilot to read and summarise emails labelled “Confidential” in users’ sent and draft folders, bypassing DLP policies and sensitivity labels entirely. Microsoft acknowledged the code issue and rolled out a fix in February 2026, but did not disclose how many organisations were affected or whether sensitive content had been further distributed.

Security researchers have demonstrated additional attack paths. Michael Bargury at Black Hat USA 2024 released LOLCopilot, a red-teaming tool that works in any default Copilot-enabled tenant, demonstrating manipulation of Copilot to exfiltrate pre-earnings data and retrieve passwords shared through Teams. His Black Hat 2025 research, AgentFlayer, showed working zero-click exploits against M365 Copilot requiring no action from the target beyond receiving an email, because Copilot automatically consumes email content via Graph.

In January 2026, Varonis Threat Labs published details of a single-click attack called Reprompt that bypassed Copilot’s safety controls through a double-request technique, silently exfiltrating data when a user clicked a legitimate Microsoft URL.

The EchoLeak vulnerability (CVE-2025-32711), discovered in Copilot and patched in June 2025, demonstrated the most severe attack vector yet. Rated CVSS 9.3, it was the first known zero-click prompt injection exploit in a production AI system. A researcher showed that a crafted email, when processed by Copilot via its retrieval-augmented generation pipeline, could exfiltrate the target’s chat logs, OneDrive files, SharePoint content, and Teams messages to an external server. The target did not need to open the email, click a link, or interact with Copilot at all. The exploit worked because Copilot automatically consumes email content through Microsoft Graph as part of its grounding process, the same architectural feature that makes it useful for productivity. No exploitation in the wild was confirmed before the patch, but the vulnerability illustrated that Copilot’s deep integration with the M365 data estate creates attack surfaces that traditional email security does not cover (Checkmarx/arXiv, CVE-2025-32711).

A June 2025 Gartner survey of 132 IT leaders found 40 per cent of organisations had delayed Copilot rollout by three or more months specifically over oversharing concerns, and 64 per cent said information governance required significant time and resources to address before they could proceed.

Microsoft’s Response, and Where the Burden Falls

Microsoft has acknowledged oversharing as one of the most common risks organisations encounter when deploying Copilot, and has invested substantially in mitigation tooling.

The response has proceeded in stages. Restricted SharePoint Search, released in March 2024, offered a temporary stopgap limiting Copilot’s search scope to 100 allowlisted sites. Microsoft described it explicitly as not intended for long-term use. SharePoint Advanced Management, previously a separate paid add-on, was bundled free with every Copilot licence in early 2025, including permission state reports, site access reviews, and Restricted Content Discovery. The Copilot Blueprint for Oversharing, published in December 2024, provides a three-phase deployment approach with specific E3 and E5 guidance. At Ignite 2025, Microsoft released the Copilot Control System, adding DLP policies for Copilot prompts, enhanced Purview data security posture management, and an AI-powered SharePoint Admin Agent.

The toolkit is now comprehensive, but the operational challenge persists: auditing permissions across a complex tenant with millions of files, thousands of sharing links, and years of accumulated configuration debt is an enormous undertaking that tooling can assist but not replace. Gartner noted in 2025 that securing M365 Copilot at scale is complex, with controls split across multiple admin centres often managed by different IT teams.

Gartner’s prediction for the broader pattern is striking: through 2026, at least 80 per cent of unauthorised AI transactions will be caused by internal violations of enterprise policies around information oversharing, unacceptable use, or AI misuse, rather than malicious external attacks. Misconfigured permissions meeting a powerful retrieval system is the threat, not external hackers.

The Channel Gap: How Copilot Reaches Most Businesses

For large enterprises with internal IT teams and dedicated security functions, the oversharing problem is at least visible. The governance documentation exists, the tooling is available, and someone in the organisation has the remit to ask whether permissions are ready before deployment begins. For small and mid-sized businesses, the path to Copilot runs almost entirely through managed service providers, and that changes the dynamic considerably.

MSPs operate under Microsoft’s Cloud Solution Provider program, reselling Microsoft 365 licences and managing tenants on behalf of their customers. Under Microsoft’s FY26 Microsoft Commerce Incentives structure, Copilot is classified as a Strategic Product Accelerator Tier 2, the highest incentive tier. Partners can stack a 3.75 per cent core rate, a 7 per cent strategic accelerator, and a 7.5 per cent growth accelerator, reaching a combined potential of 18.25 per cent on Copilot revenue. Standard M365 products like Business Premium and E3 cap out around 14.25 per cent. Microsoft also runs fixed-fee engagement payments through its Copilot+Power Accelerate program, paying partners up to $75,000 for large deployment accelerators.

Channel Futures reported that Microsoft’s Chief Partner Officer described FY26 as a record year of incentives, with Copilot funding up 50 per cent year-over-year. The financial structure rewards seat deployment. What it does not require is governance readiness. Any CSP partner can sell Copilot licences without completing a certification or specialisation. Microsoft’s voluntary Copilot Specialisation, launched July 2025, does include security credential requirements, but it is a differentiation badge for partners who pursue it, not a gate that customers can use to verify their MSP has assessed deployment risk.

Microsoft’s partner-facing sales materials lead with demos, ROI calculators, and pipeline tools. Governance guidance appears as linked reference material on learn.microsoft.com rather than as a mandatory phase in the commercial enablement layer where most partners operate.

The result for SMB customers is a confidence gap that is structural rather than intentional. A business whose MSP has managed its Microsoft 365 environment for years reasonably assumes that the same provider rolling out Copilot has assessed whether the environment is ready for it. Gartner’s June 2025 survey found 64 per cent of organisations said information governance required significant time and resources before they could proceed with Copilot deployment, but that figure reflects organisations with enough internal capability to identify the concern in the first place. Businesses that rely entirely on an MSP for IT governance may not have anyone asking that question. Varonis has reported a consistent pattern during pilot programmes: customers enable Copilot for a small group and quickly discover significant privacy and security issues they had not anticipated. For businesses without a pilot phase, that discovery comes after full deployment.

The MSP channel is not the source of the problem. SharePoint permission sprawl predates Copilot by years, and MSPs inherit whatever governance debt their customers have accumulated. The structural issue is that the incentive architecture currently rewards the partner who deploys fastest over the partner who deploys most carefully, and most SMB customers have no straightforward way to tell the difference.

What Organisations Need to Do Before Deployment

Copilot deployment readiness is inseparable from data governance maturity. Organisations that have not addressed permission debt are not ready to deploy at scale, regardless of what Microsoft’s tooling can do after the fact.

Permissions audit before deployment. A full inventory of SharePoint sharing links, EEEU group assignments, and broken inheritance chains should precede any broad Copilot enablement. The SharePoint Advanced Management permission state reports provide a starting point. The goal is to understand actual exposure, not assume it is acceptable.

Sensitivity labelling at scale. Copilot respects sensitivity labels, but only for content that has them. Most organisations have labelled a fraction of their data estate. The February 2026 bug demonstrated that even labelled content carries no guarantee, but labels remain the strongest signal Copilot has about how to treat content.
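A first-pass coverage check can make the labelling gap concrete. This is a conceptual sketch with hypothetical field names (`label`, `org_wide_link`); a real audit would pull label state from Purview or Graph rather than a list of dicts.

```python
def labelling_report(files):
    """Summarise label coverage and flag confidential files with broad links.

    `files` is a list of dicts with hypothetical keys: "label" (None or a
    label name) and "org_wide_link" (True when an organisation-wide sharing
    link exists on the file).
    """
    total = len(files)
    labelled = [f for f in files if f.get("label")]
    flagged = [f for f in labelled
               if f["label"] == "Confidential" and f.get("org_wide_link")]
    coverage = len(labelled) / total if total else 0.0
    return {"coverage": coverage, "confidential_overshared": flagged}
```

The `confidential_overshared` list is the priority queue: files whose label says restricted but whose sharing state says company-wide.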

Restrict before expanding. Restricted SharePoint Search or Restricted Access Control can limit Copilot’s scope during initial rollout. Access should be extended as permission remediation progresses, not before it begins.
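The scoping idea can be expressed as a simple allowlist filter. To be clear, this is not how Restricted SharePoint Search is actually configured (that is done through admin tooling); it is a conceptual sketch of the rollout posture, with the 100-site cap taken from the text above.

```python
def restricted_scope(candidate_sites, allowlist, limit=100):
    """Filter retrieval candidates down to an explicit allowlist.

    Mimics the idea behind Restricted SharePoint Search: during early
    rollout, anything outside the allowlist is simply out of scope,
    regardless of what the user's permissions would otherwise reach.
    """
    if len(allowlist) > limit:
        raise ValueError(f"allowlist exceeds the {limit}-site cap")
    allowed = set(allowlist)
    return [s for s in candidate_sites if s in allowed]
```

The useful property is the default-deny direction: sites earn their way into scope as remediation completes, rather than being removed after an exposure is found.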

Classify AI access risk separately from human access risk. A document that was low-risk when only a handful of people knew it existed carries different risk when every employee can query it in natural language. Risk classifications built for human browsing behaviour do not map directly to AI-assisted retrieval.
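One way to operationalise that distinction is a two-column risk score, one assuming human browsing and one assuming AI retrieval. The thresholds and field names below are entirely illustrative; the point is the asymmetry, not the numbers.

```python
def reassess_risk(doc):
    """Return (human_risk, ai_risk) for a document.

    Illustrative scoring only: "audience" is the number of accounts that can
    read the file, "views_last_year" is a proxy for obscurity. A file whose
    only real protection was obscurity stays "low" for human browsing but
    jumps to "high" once natural language retrieval removes the need to
    know where it lives.
    """
    broad_access = doc.get("audience", 0) > 100
    obscure = doc.get("views_last_year", 0) < 10
    human_risk = "high" if (broad_access and not obscure) else "low"
    ai_risk = "high" if broad_access else human_risk  # obscurity no longer helps
    return human_risk, ai_risk
```

A buried payroll export with 5,000 technically permitted readers and two views a year scores low on the human column and high on the AI column, which is exactly the gap most existing classifications miss.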

Treat Copilot governance as data governance, not IT configuration. The underlying problem is years of permission debt, not a product setting. Addressing it requires a cross-functional programme involving legal, compliance, HR, and IT, with ownership at a level that can fund and sustain remediation work.

Ask your MSP the right questions. Businesses that rely on a managed service provider for Microsoft 365 should ask specifically whether a SharePoint permissions audit was conducted before Copilot was enabled, and what the findings were. If the MSP cannot answer that question, the audit did not happen.

Related reading: What is shadow AI? | What does shadow AI cost a business in 2026? | What is an AI governance framework? | AI Usage Policy Template (free download)

Sources