Microsoft Copilot Security: 10 Critical Controls Every Enterprise Needs
Microsoft 365 Copilot is the most powerful data access tool your employees have ever had. These 10 security controls---from DLP policies and sensitivity labels to insider risk management and information barriers---are the minimum baseline for any enterprise deployment.
Copilot Consulting
February 23, 2025
24 min read
Microsoft 365 Copilot is the most powerful data access tool your employees have ever had. It can search across emails, documents, chats, meeting transcripts, and SharePoint sites in seconds. It synthesizes information from dozens of sources into coherent responses. It operates at a speed and scale that no human can match.
This is exactly why it is a security challenge. Every permission gap, every overshared document, every sensitivity label that was never applied---Copilot finds and exposes them all. The AI does not create security vulnerabilities. It reveals them at machine speed.
These 10 controls are not optional hardening measures. They are the minimum security baseline for any enterprise Copilot deployment. Organizations that skip these controls will discover their security gaps through incidents rather than assessments.
Control 1: Data Loss Prevention (DLP) Policies
DLP is the first line of defense against sensitive data exposure through Copilot. Without DLP policies, Copilot can freely surface PII, financial data, health records, and intellectual property in responses to any user with access.
What to Configure
Sensitive Information Types: Create DLP policies for every sensitive information type present in your Microsoft 365 environment:
- Social Security Numbers, driver's license numbers, passport numbers
- Credit card numbers, bank account numbers, routing numbers
- Medical record numbers, diagnosis codes, prescription information
- Intellectual property identifiers (patent numbers, trade secret markers)
- Employee identification numbers, compensation data
Policy Actions:
- Block: Prevent Copilot from including detected sensitive data in responses
- Warn: Allow the response but notify the user and log the event
- Encrypt: Require sensitivity label encryption on Copilot outputs containing sensitive data
- Notify: Alert compliance team when Copilot surfaces sensitive data types in responses
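The detect-then-act flow behind these policy actions can be sketched in a few lines. The regex patterns and the type-to-action mapping below are illustrative assumptions; Purview's built-in sensitive information types use confidence levels and corroborating evidence, not bare regexes:

```python
import re

# Illustrative patterns only -- real Purview classifiers are more robust.
SENSITIVE_TYPES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

# Assumed policy action per type: block, warn, encrypt, or notify.
POLICY_ACTIONS = {"ssn": "block", "credit_card": "warn"}

def evaluate_response(text: str) -> list[tuple[str, str]]:
    """Return (sensitive_type, action) pairs triggered by a draft response."""
    triggered = []
    for name, pattern in SENSITIVE_TYPES.items():
        if pattern.search(text):
            triggered.append((name, POLICY_ACTIONS[name]))
    return triggered
```

In simulation mode, the equivalent of this function runs but only logs the would-be action, which is why the two-week tuning period below matters.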
Policy Scope: Apply DLP policies to all locations Copilot accesses:
- Exchange Online (email content)
- SharePoint Online (documents and lists)
- OneDrive for Business (personal files)
- Microsoft Teams (chat and channel messages)
- Copilot interactions (new policy location available in Purview)
Implementation Priority
Start with the highest-risk data types for your industry:
- Healthcare: PHI (diagnosis codes, patient identifiers, prescription data)
- Financial Services: PII + financial data (account numbers, transaction details, credit scores)
- Government: CUI (Controlled Unclassified Information categories)
- General Enterprise: PII + intellectual property
Test DLP policies in simulation mode for 2 weeks before enabling enforcement. Review false positives and refine sensitive information type definitions before blocking content.
Control 2: Sensitivity Labels
Sensitivity labels are the primary mechanism for controlling what content Copilot can access and how AI-generated output is classified. Without sensitivity labels, all documents are treated equally by Copilot regardless of their actual sensitivity.
Label Taxonomy for Copilot
Implement a minimum four-tier label structure:
Public: Content that can be freely shared. Copilot can access and include in any response.
Internal: Content for internal use only. Copilot can surface it to internal users, but it must be excluded from responses that will be shared externally.
Confidential: Content restricted to specific groups. Copilot can access only if the requesting user has explicit permission. Encryption enforced.
Highly Confidential: Content with the most restrictive access. Copilot access requires the user to have decryption rights. All Copilot interactions with this content are logged and audited.
Copilot-Specific Label Behaviors
Label Inheritance: When Copilot generates content derived from labeled sources, the output inherits the highest sensitivity label from the source documents. If Copilot creates a summary from one "Internal" document and one "Confidential" document, the summary receives the "Confidential" label.
Encryption Enforcement: For labels with encryption, Copilot checks whether the requesting user has decryption rights before accessing the content. Users without rights will not see content from encrypted documents in Copilot responses.
Auto-Labeling: Configure auto-labeling policies that apply sensitivity labels to Copilot-generated content based on detected sensitive information types. A Copilot-generated document containing credit card numbers should automatically receive at least "Confidential" classification.
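The inheritance rule above reduces to "highest label wins." A minimal sketch, assuming the four-tier taxonomy from this control and an assumed default of Internal for unlabeled sources:

```python
# Ordered from least to most restrictive, mirroring the four-tier taxonomy.
LABEL_ORDER = ["Public", "Internal", "Confidential", "Highly Confidential"]

def inherited_label(source_labels: list[str]) -> str:
    """Copilot output inherits the most restrictive label among its sources."""
    if not source_labels:
        return "Internal"  # assumed default for unlabeled sources
    return max(source_labels, key=LABEL_ORDER.index)
```

So a summary drawn from one "Internal" and one "Confidential" document lands at "Confidential", exactly as described above.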
Implementation Steps
- Publish sensitivity labels to all users via Microsoft Purview
- Configure default labels for each Microsoft 365 application (Word: Internal, Email: Internal, SharePoint: Internal)
- Enable mandatory labeling---require users to apply a label before saving or sending
- Configure auto-labeling policies for high-risk sensitive information types
- Test label inheritance with Copilot by generating content from labeled source documents
- Validate that encryption enforcement prevents unauthorized Copilot access
Control 3: Conditional Access Policies
Conditional access controls when, where, and how users can access Copilot. This is the enforcement mechanism that ensures Copilot is only available under conditions your organization deems acceptable.
Essential Conditional Access Policies for Copilot
Policy 1: Require Compliant Devices
- Target: All users with Copilot licenses
- Condition: Device must be enrolled in Intune and compliant with device policies
- Grant: Allow access only from compliant devices
- Effect: Blocks Copilot access from personal, unmanaged devices
Policy 2: Block High-Risk Sign-Ins
- Target: All users with Copilot licenses
- Condition: Sign-in risk level = High (Entra ID Protection)
- Grant: Block access
- Effect: Prevents compromised accounts from using Copilot to exfiltrate data
Policy 3: Require MFA for Copilot
- Target: All users with Copilot licenses
- Condition: Any access to Microsoft 365 Copilot service
- Grant: Require multifactor authentication
- Effect: Adds authentication assurance before AI-powered data access
Policy 4: Location-Based Restrictions
- Target: Users handling classified or export-controlled data
- Condition: Access from outside approved geographic regions
- Grant: Block access
- Effect: Enforces data sovereignty and export control requirements
Policy 5: Session Controls for Unmanaged Devices
- Target: Users accessing Copilot from unmanaged devices (if allowed)
- Condition: Device not enrolled in Intune
- Session: App-enforced restrictions, block download
- Effect: Allows limited Copilot access but prevents data extraction
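The precedence among these five policies can be made concrete with a small evaluation function. This is a sketch of the decision logic only; the real evaluation happens inside Microsoft Entra ID, and the field names here are assumptions:

```python
from dataclasses import dataclass

@dataclass
class SignInContext:
    device_compliant: bool   # Intune-enrolled and policy-compliant
    risk_level: str          # "low", "medium", or "high" (Entra ID Protection)
    mfa_satisfied: bool
    region_approved: bool

def evaluate_access(ctx: SignInContext) -> str:
    """Apply the five policies in order; block beats session limits beats MFA."""
    if ctx.risk_level == "high":     # Policy 2: block high-risk sign-ins
        return "block"
    if not ctx.region_approved:      # Policy 4: location-based restriction
        return "block"
    if not ctx.device_compliant:     # Policies 1/5: unmanaged device
        return "limited-session"     # app-enforced restrictions, no download
    if not ctx.mfa_satisfied:        # Policy 3: require MFA
        return "require-mfa"
    return "allow"
```

Running the same contexts through report-only mode first, as described below, lets you confirm the precedence matches your intent before anyone is actually blocked.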
Testing and Validation
Before enforcing conditional access policies:
- Run policies in report-only mode for 2 weeks
- Analyze the impact report---identify users and scenarios that would be blocked
- Create exclusion groups for legitimate exceptions (break-glass accounts, service accounts)
- Enable enforcement in phases---start with high-risk users, then expand to all users
- Monitor sign-in logs for unexpected blocks and adjust policies as needed
Control 4: Audit Logging and Monitoring
If you cannot see what Copilot is doing, you cannot govern it. Audit logging captures every Copilot interaction for security monitoring, compliance reporting, and incident investigation.
Audit Log Configuration
Enable Microsoft Purview Audit (Standard):
- Captures Copilot invocations across all Microsoft 365 applications
- Default retention: 180 days
- Available to all E3 and above licenses
Upgrade to Microsoft Purview Audit (Premium):
- Extended retention: up to 10 years
- High-bandwidth access to audit log API
- Intelligent insights on user activity
- Required for organizations with regulatory retention mandates
Critical Audit Events for Copilot:
- CopilotInteraction: User invoked Copilot (captures application, query type, timestamp)
- CopilotDataAccess: Copilot accessed specific content (document ID, mailbox item, channel message)
- CopilotOutputGenerated: Copilot generated a response (output type, sensitivity level)
- CopilotDLPTrigger: Copilot interaction triggered a DLP policy
- CopilotSensitivityInheritance: Copilot-generated content inherited a sensitivity label
SIEM Integration
Export Copilot audit events to your SIEM platform for correlation:
- Microsoft Sentinel: Native integration via diagnostic settings
- Splunk: Use Microsoft 365 Management Activity API connector
- IBM QRadar: Configure Office 365 DSM for Copilot events
- Other SIEM: Use Management Activity API with custom integration
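A custom SIEM integration typically starts by filtering Copilot operations out of Management Activity API record batches. A minimal sketch; the event names follow this article's list and should be verified against the `Operation` values your tenant actually emits:

```python
# Event names taken from this article's list -- confirm the exact
# Operation values in your tenant before building detections on them.
COPILOT_EVENTS = {
    "CopilotInteraction",
    "CopilotDataAccess",
    "CopilotOutputGenerated",
    "CopilotDLPTrigger",
    "CopilotSensitivityInheritance",
}

def filter_copilot_events(records: list[dict]) -> list[dict]:
    """Keep only Copilot-related records from an audit export batch."""
    return [r for r in records if r.get("Operation") in COPILOT_EVENTS]
```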
Monitoring Use Cases
Anomaly Detection:
- User generates 10x more Copilot queries than their historical average (potential data exfiltration)
- Copilot accesses documents in a library the user has never visited (permission exploitation)
- Multiple DLP triggers from a single user's Copilot interactions in one session
- Copilot interactions from unusual locations or devices
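The volume rule in the first bullet is simple to express. A sketch, assuming a per-user historical daily average has already been computed from the audit log:

```python
def is_anomalous(daily_queries: int, historical_avg: float,
                 factor: float = 10.0) -> bool:
    """Flag users whose query volume exceeds factor x their own baseline."""
    # Users with no baseline yet (avg == 0) are not flagged by this rule.
    return historical_avg > 0 and daily_queries > factor * historical_avg
```

The 10x factor is the article's example threshold; tune it against your own false-positive rate before wiring it into alerting.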
Compliance Monitoring:
- Weekly report of all Copilot interactions with "Highly Confidential" content
- Monthly report of DLP policy triggers related to Copilot
- Quarterly summary of Copilot usage across regulated data classifications
- Annual audit of Copilot access patterns for regulatory review
Control 5: Insider Risk Management
Copilot can amplify insider threats by enabling faster, broader data access than traditional tools. A malicious insider can use Copilot to systematically query for sensitive data across the entire Microsoft 365 environment in minutes.
Copilot-Specific Insider Risk Indicators
Configure Microsoft Purview Insider Risk Management to detect:
Data Exfiltration Indicators:
- High-volume Copilot queries requesting specific data types (salary data, customer lists, financial projections)
- Copilot interactions followed by large file downloads or external sharing
- Copilot queries from users who have submitted resignation (correlate with HR data)
- Pattern of Copilot queries that progressively access more sensitive content
Policy Violation Indicators:
- Copilot queries that repeatedly trigger DLP policies
- Attempts to access content above the user's need-to-know level
- Copilot interactions outside normal business hours from unusual locations
- Queries that attempt to circumvent sensitivity label restrictions
Integration with HR and Legal
Insider risk management for Copilot must integrate with:
- HR Systems: Correlate Copilot activity with employment events (resignation notice, performance improvement plans, termination proceedings)
- Legal Hold: When employees are placed on legal hold, monitor their Copilot interactions for evidence spoliation
- Investigation Workflows: Provide security operations with tools to review a suspect user's complete Copilot interaction history
Control 6: SharePoint Permission Remediation
Copilot exposes every permission gap in your SharePoint environment. Before deploying Copilot, you must fix the permissions that were never designed for AI-powered data retrieval.
Priority Remediation Actions
Remove "Everyone" and "Everyone Except External Users": Search all SharePoint sites for items shared with these groups. These broad permissions were often applied for convenience and are the primary vector for Copilot data exposure.
Audit Site Collection Permissions: Review site collection administrator assignments. Users with site collection admin rights can access all content in that site through Copilot.
Review Sharing Links: Identify and revoke organization-wide sharing links on sensitive documents. These links grant Copilot access to anyone in the tenant.
Implement Least-Privilege Access: Replace broad access groups with targeted security groups based on business need. Apply the principle of least privilege to every SharePoint site, library, and document.
Automated Permission Assessment
Use SharePoint Advanced Management or third-party tools to:
- Scan all sites for overshared content
- Generate permission reports by sensitivity level
- Identify stale permissions (access granted to former employees, moved employees)
- Create remediation task lists prioritized by risk level
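The "Everyone" scan described above amounts to flagging any site whose permission set intersects the tenant-wide groups. A sketch over an assumed site-to-groups mapping; a real scan would pull this data from SharePoint Advanced Management or the SharePoint API:

```python
# The two broad groups called out above as the primary exposure vector.
BROAD_GROUPS = {"Everyone", "Everyone except external users"}

def flag_overshared(site_permissions: dict[str, set[str]]) -> list[str]:
    """Return sites whose permissions include a tenant-wide group."""
    return [site for site, groups in site_permissions.items()
            if groups & BROAD_GROUPS]
```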
Control 7: Information Barriers
Information barriers prevent Copilot from surfacing content across organizational boundaries that must remain separated.
When Information Barriers Are Required
- Financial Services: Chinese Wall compliance between investment banking and research
- Legal: Conflict walls between client teams
- Mergers & Acquisitions: Isolation between deal teams and general population
- Government: Separation between classification levels or program areas
Configuration for Copilot
- Define barrier segments based on Microsoft Entra ID attributes (department, custom attribute)
- Create barrier policies specifying which segments cannot communicate
- Validate that Copilot respects barriers by testing cross-segment queries
- Monitor for barrier violations in Copilot audit logs
- Review and update barriers quarterly as organizational structure changes
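A barrier policy is symmetric: if segment A is walled off from segment B, neither side's content should surface to the other through Copilot. A minimal sketch using the investment banking/research example above, with segment names as assumptions:

```python
# Each frozenset is a pair of segments that must not see each other.
BARRIER_POLICIES = {frozenset({"InvestmentBanking", "Research"})}

def barrier_allows(user_segment: str, content_segment: str) -> bool:
    """Copilot should skip content whose segment is walled off from the user."""
    return frozenset({user_segment, content_segment}) not in BARRIER_POLICIES
```

A cross-segment test query (step 3 above) is simply checking that results behave as this function predicts.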
Control 8: Microsoft Purview Communication Compliance
Communication compliance monitors Copilot-generated content for policy violations, inappropriate language, and regulatory risks.
Copilot-Relevant Communication Compliance Policies
Regulatory Compliance:
- Monitor Copilot-generated external communications for regulatory violations
- Detect financial advice in Copilot outputs from non-licensed individuals
- Flag medical advice in Copilot responses in healthcare environments
- Identify legal advice in Copilot outputs from non-attorney users
Content Standards:
- Detect offensive or inappropriate language in Copilot-generated content
- Flag content that violates organizational communication standards
- Identify potential harassment or discrimination in AI-generated text
- Monitor for unauthorized use of confidential information in Copilot outputs
Investigation Workflows
When communication compliance flags a Copilot interaction:
- Alert is generated and assigned to compliance reviewer
- Reviewer examines the full Copilot interaction (query, retrieved content, generated output)
- Reviewer determines if a policy violation occurred
- If violation confirmed: escalate to appropriate stakeholder (HR, Legal, Management)
- Document findings and resolution in compliance case management system
Control 9: Retention and eDiscovery
Copilot-generated content creates new data that must be governed by retention policies and discoverable through eDiscovery.
Retention Policy Configuration
Copilot Interaction Data:
- Apply retention policies to Copilot interaction logs
- Align retention period with regulatory requirements (7 years for financial services, 6 years for healthcare)
- Include Copilot data in retention reviews
Copilot-Generated Content:
- Content saved to SharePoint, OneDrive, or Teams inherits the location's retention policy
- Content in Copilot Pages requires explicit retention policy assignment
- Meeting summaries and notes follow the meeting content retention policy
eDiscovery Considerations
Copilot Content in Legal Holds:
- When placing users on legal hold, verify that Copilot interaction data is preserved
- Include all content locations (Exchange, SharePoint, OneDrive, Teams) in legal holds
- Copilot-generated content may constitute discoverable evidence---consult with legal counsel
Search and Collection:
- Use Purview eDiscovery to search Copilot-generated content
- Search by date range, user, application, and content keywords
- Export Copilot interaction logs for external review if required
- Supplement eDiscovery with Purview audit log exports for complete interaction context
Control 10: Copilot Access Governance
Not every user needs Copilot. Implement access governance that ensures Copilot licenses are assigned based on business need, risk profile, and readiness.
License Assignment Strategy
Criteria for Copilot License Assignment:
- User's role benefits from AI-assisted content generation and retrieval
- User's department has completed Copilot governance training
- User's SharePoint permissions have been audited and remediated
- User's devices meet conditional access compliance requirements
- User has acknowledged the acceptable use policy for AI
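The assignment decision is a conjunction of the five criteria above. A sketch, with the criterion keys as assumed field names in an HR or IT asset record:

```python
# Assumed field names mapping to the five assignment criteria above.
ASSIGNMENT_CRITERIA = (
    "role_benefits_from_ai",
    "governance_training_complete",
    "permissions_audited",
    "devices_compliant",
    "aup_acknowledged",
)

def eligible_for_copilot(user: dict) -> bool:
    """A license is assigned only when every criterion is satisfied."""
    return all(user.get(c, False) for c in ASSIGNMENT_CRITERIA)
```

Running this check during the quarterly request review below keeps assignments tied to criteria rather than seniority.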
License Review Cadence:
- Monthly: Review active usage vs. licensed users (identify unused licenses)
- Quarterly: Assess new license requests against assignment criteria
- Annually: Full audit of license assignments against current organizational needs
Role-Based Access Tiers
Consider implementing tiered Copilot access based on role:
Tier 1: Full Access (executives, senior managers, knowledge workers)
- All Copilot features across all Microsoft 365 applications
- Access to meeting intelligence, channel summarization, document generation
- Subject to all 10 security controls
Tier 2: Restricted Access (operational staff, temporary employees)
- Copilot limited to specific applications (e.g., Word and Excel only)
- No access to meeting intelligence or channel summarization
- Additional DLP restrictions on Copilot-generated output
Tier 3: No Access (contractors, interns, users in high-risk roles)
- Copilot license not assigned
- Access blocked via conditional access policy as backup
- Review for upgrade quarterly based on role changes
Implementation Priority Matrix
If you cannot implement all 10 controls simultaneously, prioritize in this order:
| Priority | Control | Reason |
|---|---|---|
| 1 | Sensitivity Labels | Foundation for all other controls |
| 2 | SharePoint Permission Remediation | Largest attack surface reduction |
| 3 | DLP Policies | Prevents sensitive data exposure |
| 4 | Conditional Access | Controls who can use Copilot and from where |
| 5 | Audit Logging | Required for all monitoring and compliance |
| 6 | Copilot Access Governance | Limits exposure to ready populations |
| 7 | Communication Compliance | Monitors AI-generated content quality |
| 8 | Insider Risk Management | Detects malicious Copilot usage |
| 9 | Retention and eDiscovery | Required for regulatory compliance |
| 10 | Information Barriers | Required for specific industries |
Next Steps
These 10 controls are the security baseline, not the ceiling. Your organization's specific industry, regulatory environment, and risk profile may require additional controls. But without these 10, you are deploying an AI system that has broad access to your organizational data without adequate protection.
Start with sensitivity labels. Fix SharePoint permissions. Configure DLP. Then layer additional controls based on priority. Do not deploy Copilot to production users until controls 1-6 are in place.
If your organization needs help implementing these security controls, EPC Group has secured Copilot deployments across Fortune 500 organizations in healthcare, financial services, and government. Contact us for a security assessment.
About the Author: Errin O'Connor is the founder and Chief AI Architect at EPC Group, a Microsoft Gold Partner with 25+ years of enterprise consulting experience. He has authored four Microsoft Press bestselling books and specializes in helping Fortune 500 organizations implement Microsoft Copilot securely and at scale.

