<task-notification>
<task-id>a7954c4769f2c1751</task-id>
<tool-use-id>toolu_01WSjWbNANmkmUSeohrHWNLv</tool-use-id>
<output-file>/private/tmp/claude-501/-private-var-folders-t6--sx-03q50-v9w2jpnq683jmw0000gn-T-claude-run-fhamknw0/9e0c4d0f-2cce-4525-b60e-fee3c2e389e0/tasks/a7954c4769f2c1751.output</output-file>
<status>completed</status>
<summary>Agent "Microsoft Copilot retention policy" completed</summary>
<result>I now have comprehensive data from all official sources. Here is the complete research report on Microsoft Copilot's data retention policies.
---
## Microsoft Copilot (Edge Browser AI Assistant) -- Data Retention Policy Research
**Scope:** Consumer/personal Microsoft Copilot (the AI assistant in Edge browser), not Microsoft 365 Copilot for enterprise (which has materially different policies). Enterprise-specific distinctions are noted where relevant.
---
### 1. CONVERSATION/PROMPT DATA RETENTION PERIOD
**Default retention: 18 months**
Exact quotes:
- **"By default, we store conversation activity for 18 months."**
Source: [Privacy FAQ for Microsoft Copilot](https://support.microsoft.com/en-us/topic/privacy-faq-for-microsoft-copilot-27b3a435-8dc9-4b55-9a4b-58eeb9647a7f)
- **"Copilot retains the last 18 months of interactions in your conversation history."**
Source: [Conversation history in Microsoft Copilot](https://support.microsoft.com/en-us/topic/conversation-history-in-microsoft-copilot-9a07325a-0366-4c2d-82cb-dab61be8287c)
**Uploaded file retention -- DISCREPANCY FOUND between official pages:**
- The Privacy FAQ and File Upload support page state: **"Files you share with Copilot will be stored securely for up to 18 months and then automatically deleted."**
Source: [File upload in Microsoft Copilot](https://support.microsoft.com/en-us/topic/file-upload-in-microsoft-copilot-8b7bf432-9576-4b16-9dee-6c19a4169e62)
- However, the Copilot marketing privacy page states: **"the file will be stored securely for up to 30 days and then automatically deleted"**
Source: [Protecting Your AI Security, Privacy, and Data](https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/privacy)
**Images:** **"it's processed to complete your request and may be retained for up to 30 days, or until you delete it -- whichever comes first"**
Source: [Protecting Your AI Security, Privacy, and Data](https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/privacy)
**Shared conversation links:** **"shared links automatically expire after 28 days for anonymous users, and 18 months for signed in users"**
Source: [Protecting Your AI Security, Privacy, and Data](https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/privacy)
---
### 2. MODEL TRAINING -- WHETHER DATA IS USED AND RETENTION
**Consumer Copilot: YES, data is used for model training by default (with opt-out available)**
Exact quotes:
- **"Microsoft uses data from Bing, MSN, Copilot, and interactions with ads on Microsoft for AI training. This includes anonymous search and news data, interactions with ads, and your voice and text conversations with Copilot."**
Source: [Take Charge of Your Copilot Experience](https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/do-more-with-ai/general-ai/take-charge-of-your-copilot-experience)
- **"Before training these AI models, we will remove information that may identify you, such as names, phone numbers, device or account identifiers, sensitive personal data, physical addresses, and email addresses."**
Source: [Transparency and Control in Consumer Data Use (Aug 2024 blog)](https://www.microsoft.com/en-us/microsoft-copilot/blog/2024/08/16/transparency-and-control-in-consumer-data-use/)
**Users EXCLUDED from model training (no opt-out needed):**
- **"We do not train Copilot on data from the following types of users: Users signed into Copilot with an organizational Entra ID account... Users under the age of 18... Users who have opted out of model training."**
Source: [Privacy FAQ for Microsoft Copilot](https://support.microsoft.com/en-us/topic/privacy-faq-for-microsoft-copilot-27b3a435-8dc9-4b55-9a4b-58eeb9647a7f)
- Also excluded: **"If you are not logged into Copilot with a Microsoft Account or other third-party authentication, we do not train Copilot on your conversations"**
Source: [Take Charge of Your Copilot Experience](https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/do-more-with-ai/general-ai/take-charge-of-your-copilot-experience)
- Also excluded: Users in Brazil, China (excl. Hong Kong), Israel, Nigeria, South Korea, Vietnam.
**Uploaded files are NOT used for training:** **"Microsoft does not use the content of the files you upload to Copilot for model training."**
Source: [File upload in Microsoft Copilot](https://support.microsoft.com/en-us/topic/file-upload-in-microsoft-copilot-8b7bf432-9576-4b16-9dee-6c19a4169e62)
**Enterprise/M365 Copilot: NO training at all:**
- **"Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Microsoft 365 Copilot."**
Source: [Data, Privacy, and Security for Microsoft 365 Copilot](https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy)
- **"Copilot Chat does not use your data to train foundation models."**
Source: [Microsoft 365 Copilot Chat Privacy and Protections](https://learn.microsoft.com/en-us/copilot/privacy-and-protections)
**No separate "model training retention period" is disclosed.** Microsoft does not publish a distinct retention timeline for how long data is retained specifically for model training purposes. The only stated general retention period is the 18-month conversation activity window.
---
### 3. "ACTIVITY DATA" vs. "MODEL TRAINING DATA" DISTINCTION
Microsoft draws a clear line between these two categories:
**Activity data** = conversation history stored for 18 months, used for:
- **"to troubleshoot problems, diagnose bugs, prevent abuse, and to monitor, analyze, and improve performance"**
Source: [Privacy FAQ for Microsoft Copilot](https://support.microsoft.com/en-us/topic/privacy-faq-for-microsoft-copilot-27b3a435-8dc9-4b55-9a4b-58eeb9647a7f)
**Model training data** = a specific use of that activity data that can be toggled independently:
- **"You can opt-out of model training and still have personalization turned on."**
Source: [Privacy FAQ for Microsoft Copilot](https://support.microsoft.com/en-us/topic/privacy-faq-for-microsoft-copilot-27b3a435-8dc9-4b55-9a4b-58eeb9647a7f)
**Critical distinction -- opting out of model training does NOT affect other data uses:**
- **"This setting will not exclude your conversations from being used for other general product or system improvements nor from use for advertising, digital safety, security, and compliance purposes as outlined in the Microsoft Privacy Statement."**
Source: [Microsoft Copilot privacy controls](https://support.microsoft.com/en-us/topic/microsoft-copilot-privacy-controls-8e479f27-6eb6-48c5-8d6a-c134062e2be6)
In other words: even after opting out of model training, Microsoft retains conversation activity for 18 months and continues to use it for product improvement, advertising, safety, and compliance. The opt-out only removes the "train generative AI models" use case.
---
### 4. OPT-OUT / DELETION MECHANISMS AND WHAT THEY ACTUALLY DO
Microsoft provides four independent privacy controls:
**(a) Model Training Opt-Out**
- **"Opting out will exclude your past, present, and future conversations from being used for training these AI models"**
Source: [Take Charge of Your Copilot Experience](https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/do-more-with-ai/general-ai/take-charge-of-your-copilot-experience)
- **"If you opt out, that change will be reflected throughout our systems within 30 days"**
Source: [Take Charge of Your Copilot Experience](https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/do-more-with-ai/general-ai/take-charge-of-your-copilot-experience)
- However, the Support article uses narrower language: **"Opting out will exclude your future conversation activities from being used for training these AI models."**
Source: [Microsoft Copilot privacy controls](https://support.microsoft.com/en-us/topic/microsoft-copilot-privacy-controls-8e479f27-6eb6-48c5-8d6a-c134062e2be6)
**NOTE:** There is a discrepancy between these two pages on whether past conversations are also excluded (the marketing page says "past, present, and future"; the support page says only "future").
**(b) Personalization / Memory Toggle**
- **"If you turn off personalization and memory, Copilot will forget its memories of your conversations."**
Source: [Microsoft Copilot privacy controls](https://support.microsoft.com/en-us/topic/microsoft-copilot-privacy-controls-8e479f27-6eb6-48c5-8d6a-c134062e2be6)
- Memory can be deleted independently: **"To delete everything from memory in Copilot: Select your profile icon, then select Memory > Delete all Memory. Your conversation history will not be deleted."**
Source: [Microsoft Copilot privacy controls](https://support.microsoft.com/en-us/topic/microsoft-copilot-privacy-controls-8e479f27-6eb6-48c5-8d6a-c134062e2be6)
**(c) Conversation History Deletion**
- **"You can delete individual conversations within your conversation history or delete your entire conversation history at any time."**
Source: [Privacy FAQ for Microsoft Copilot](https://support.microsoft.com/en-us/topic/privacy-faq-for-microsoft-copilot-27b3a435-8dc9-4b55-9a4b-58eeb9647a7f)
- Full deletion path: **"go to Manage your Copilot activity history... select Delete all activity history"**
Source: [Conversation history in Microsoft Copilot](https://support.microsoft.com/en-us/topic/conversation-history-in-microsoft-copilot-9a07325a-0366-4c2d-82cb-dab61be8287c)
**NOTE:** Microsoft does not clarify whether deletion is immediate at the backend or merely removes the user-facing record, and makes no statement about whether models already trained on the data are affected by deletion.
**(d) Personalized Ads Toggle**
- **"You can control whether you see personalized ads in Copilot and other Microsoft services...by switching the toggle off under See ads that interest you."**
Source: [Microsoft Copilot privacy controls](https://support.microsoft.com/en-us/topic/microsoft-copilot-privacy-controls-8e479f27-6eb6-48c5-8d6a-c134062e2be6)
---
### 5. CHANGES FROM 2025-2026
Based on the [Change history for Microsoft Privacy Statement](https://www.microsoft.com/en-us/privacy/updates):
**(a) October 2025:** Microsoft **"substantially redesigned the 'Artificial Intelligence and Copilot capabilities' section"** of the Privacy Statement to clarify multiple Copilot service offerings, data usage, and user control options. Removed references to "Copilot Pro" (now folded into M365 Copilot). Renamed "Microsoft Copilot in Bing" to "Copilot Search."
**(b) January 7, 2026 -- Anthropic added as subprocessor:** **"Starting January 7, 2026, Anthropic is a subprocessor for Microsoft 365 Copilot."** This means Anthropic models now process some M365 Copilot queries under the Microsoft Product Terms and Data Protection Addendum. Per Microsoft: **"Anthropic models are out of scope for the EU Data Boundary and when available, in-country LLM processing commitments."**
Source: [Data, Privacy, and Security for Microsoft 365 Copilot](https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy)
**(c) March 2026:** **"Modified language in 'Our Retention of Personal Data' regarding criteria for determining retention periods"** and **"Removed retention period descriptions from Bing and Advertising sections, redirecting to product documentation."**
**(d) August 2024 (foundational change):** Microsoft announced it would begin training on consumer Copilot data with opt-out controls. **"We will start providing these opt-out controls in October, and we won't begin training our AI models on this data until at least 15 days after we notify consumers that the opt-out controls are available."**
Source: [Transparency and Control in Consumer Data Use](https://www.microsoft.com/en-us/microsoft-copilot/blog/2024/08/16/transparency-and-control-in-consumer-data-use/)
---
### FLAGGED AMBIGUITIES AND COMPLIANCE RISKS
1. **File retention discrepancy:** The marketing privacy page (microsoft.com/en-us/microsoft-copilot/for-individuals/privacy) says "up to 30 days" for uploaded files. The Support FAQ and file upload help page say "up to 18 months." These are both current official Microsoft pages. For a compliance review, the longer period (18 months) should be assumed.
2. **Past vs. future conversations in model training opt-out:** The marketing page says opt-out covers "past, present, and future" conversations. The support page says only "future conversation activities." The more conservative reading (future only) should be assumed for compliance.
3. **30-day propagation window:** After opting out of training, the change takes "within 30 days" to propagate. Data may still be used for training during that window.
4. **No stated retention period specifically for training data.** Microsoft does not disclose how long data used for training is retained in training pipelines before deletion, or whether deletion of conversation history affects data already ingested into training pipelines.
5. **"Product improvement" carve-out persists after opt-out.** Even after opting out of model training, Microsoft retains and uses conversation data for "other general product or system improvements" -- this is a broad category with no defined scope.
6. **Anthropic subprocessor (Jan 2026) creates a new data flow.** Enterprise M365 Copilot data may now be processed by Anthropic models, which are explicitly out of scope for EU Data Boundary and in-country processing commitments.
---
### SOURCE URL INDEX
| # | Page | URL |
|---|------|-----|
| 1 | Privacy FAQ for Microsoft Copilot | https://support.microsoft.com/en-us/topic/privacy-faq-for-microsoft-copilot-27b3a435-8dc9-4b55-9a4b-58eeb9647a7f |
| 2 | Microsoft Copilot privacy controls | https://support.microsoft.com/en-us/topic/microsoft-copilot-privacy-controls-8e479f27-6eb6-48c5-8d6a-c134062e2be6 |
| 3 | Conversation history in Microsoft Copilot | https://support.microsoft.com/en-us/topic/conversation-history-in-microsoft-copilot-9a07325a-0366-4c2d-82cb-dab61be8287c |
| 4 | File upload in Microsoft Copilot | https://support.microsoft.com/en-us/topic/file-upload-in-microsoft-copilot-8b7bf432-9576-4b16-9dee-6c19a4169e62 |
| 5 | Protecting Your AI Security, Privacy, and Data (marketing) | https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/privacy |
| 6 | Take Charge of Your Copilot Experience | https://www.microsoft.com/en-us/microsoft-copilot/for-individuals/do-more-with-ai/general-ai/take-charge-of-your-copilot-experience |
| 7 | Data, Privacy, and Security for Microsoft 365 Copilot | https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy |
| 8 | Microsoft 365 Copilot Chat Privacy and Protections | https://learn.microsoft.com/en-us/copilot/privacy-and-protections |
| 9 | Transparency and Control in Consumer Data Use (blog) | https://www.microsoft.com/en-us/microsoft-copilot/blog/2024/08/16/transparency-and-control-in-consumer-data-use/ |
| 10 | Change history for Microsoft Privacy Statement | https://www.microsoft.com/en-us/privacy/updates |
| 11 | Microsoft Privacy Statement | https://www.microsoft.com/en-us/privacy/privacystatement |</result>
<usage><total_tokens>41353</total_tokens><tool_uses>28</tool_uses><duration_ms>236046</duration_ms></usage>
</task-notification>