TechCrunch

Microsoft says Office bug exposed customers’ confidential emails to Copilot AI

Read the full article on TechCrunch: "Microsoft says Office bug exposed customers' confidential emails to Copilot AI"

What Happened

Microsoft said the bug meant that its Copilot AI chatbot was reading and summarizing paying customers' confidential emails, bypassing data-protection policies.

Our Take

Gut reaction: Microsoft charged enterprises for privacy and delivered the opposite. Don't call it a "bug" — it's a design failure. The fact that confidential emails got fed to Copilot at all means nobody asked "where's this data going?" during planning.

This tanks enterprise trust in AI. GDPR auditors are already writing enforcement memos. Microsoft will patch it, but the damage sticks.

The lesson nobody's learning: encrypt or strip sensitive data at the application layer *before* anything hits the model. Stop asking LLMs to handle PII.

What To Do

If you're integrating LLMs into enterprise products, assume nothing is private unless it's encrypted at the source.
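One way to act on that advice is to sanitize data client-side so sensitive values never reach the model at all. Below is a minimal sketch of that idea: email addresses are swapped for opaque tokens before the text would be sent to an LLM, and the token-to-value mapping stays local so the response can be rehydrated afterward. The function names and the regex are illustrative, not any particular product's API; a real deployment would cover far more PII categories (or encrypt fields outright at the source).

```python
import re

def redact_emails(text: str) -> tuple[str, dict[str, str]]:
    """Replace email addresses with opaque tokens before the text
    leaves the trust boundary (e.g. before any LLM API call)."""
    mapping: dict[str, str] = {}

    def _sub(m: re.Match) -> str:
        token = f"[EMAIL_{len(mapping) + 1}]"
        mapping[token] = m.group(0)
        return token

    redacted = re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", _sub, text)
    return redacted, mapping

def restore(text: str, mapping: dict[str, str]) -> str:
    """Re-insert original values into the model's response, locally."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

msg = "Please loop in alice@contoso.com on the Q3 numbers."
safe, mapping = redact_emails(msg)
# Only `safe` would ever be sent to the model; `mapping` never leaves
# the client, so the model cannot see (or leak) the real address.
assert "@" not in safe
assert restore(safe, mapping) == msg
```

The key design point is the direction of trust: the sanitizer runs where the data already lives, so a misconfigured model-side policy (like the one in this incident) has nothing confidential to read.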
