
Microsoft confirmed that a bug allowed its Copilot AI to summarize customers’ confidential emails without permission. The issue, tracked as CW1226324, had been active since January and bypassed data loss prevention policies designed to keep sensitive information from being ingested into Microsoft’s large language model. The bug specifically affected Copilot Chat’s ability to read and summarize draft and sent emails marked with confidentiality labels in Microsoft 365. Microsoft began rolling out a fix in February.
The vulnerability was first reported by Bleeping Computer. Copilot Chat gives paying Microsoft 365 customers an AI-powered chat feature across Office products, including Word, Excel, and PowerPoint. Microsoft stated that draft and sent email messages “with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat.” A Microsoft spokesperson did not respond to a request for comment, which included a question about how many customers were affected; the company has not disclosed that figure.