Microsoft's AI-powered Recall feature for Windows 11 is facing renewed privacy concerns despite assurances from the company. Reintroduced last month in Windows 11 Insider Preview Build 26120.2415 (KB5046723) for Copilot+ PCs in the Dev Channel, the feature is designed to help users retrieve forgotten or misplaced information. However, a new report shows that the Filter Sensitive Information setting does not always work as intended, capturing data such as credit card details and Social Security numbers in some instances.
Sensitive Information Captured Despite Filters
According to a report by Tom's Hardware, Recall sometimes fails to block sensitive information from being saved in snapshots. The feature recorded private details even when the Filter Sensitive Information setting was enabled:
- Credit Card Details in Notepad: When credit card information, both fake and legitimate, was typed into Notepad alongside text such as "Capital One Visa," Recall captured screenshots of the data.
- Loan Application PDF in Microsoft Edge: While a user filled out a loan application PDF in Microsoft Edge containing contact information, a date of birth, and a Social Security number, Recall saved screenshots of the sensitive fields.
- HTML Web Form: An HTML page built to collect credit card details, with fields for card type, number, CVC, and expiration date, was also captured by Recall; a sketch of this kind of test page appears below.
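The exact page used in the report is not available, so the following is only a rough illustration of the kind of test described: a bare HTML form with card-type, card-number, CVC, and expiration fields, served locally so a tester can type dummy values into it while Recall is running and then check whether a snapshot was captured. Everything here, including the field names, the port, and the use of Python's built-in http.server, is an assumption for illustration and is not drawn from Tom's Hardware's methodology.

```python
# Hypothetical stand-in for the HTML test form described above.
# Serves a bare credit card form on localhost so Recall's
# "Filter sensitive information" behavior can be observed against it.
# Field names and port are illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

TEST_FORM = b"""<!DOCTYPE html>
<html>
  <body>
    <h1>Payment details (test form)</h1>
    <form>
      <label>Card type <input name="card-type"></label><br>
      <label>Card number <input name="card-number" autocomplete="cc-number"></label><br>
      <label>CVC <input name="cvc" autocomplete="cc-csc"></label><br>
      <label>Expiration date <input name="expiry" autocomplete="cc-exp"></label><br>
      <button type="submit">Submit</button>
    </form>
  </body>
</html>"""

class TestFormHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the same form for every requested path.
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(TEST_FORM)

if __name__ == "__main__":
    # Open http://localhost:8000 on the Copilot+ PC under test, enter dummy
    # card details, then review Recall's timeline for captured snapshots.
    HTTPServer(("localhost", 8000), TestFormHandler).serve_forever()
```

In principle, a correctly working filter should leave no snapshot showing the completed card fields, which is the behavior observed on the payment pages described next.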
Instances Where the Filter Worked
Despite these alarming findings, Recall successfully filtered sensitive information in two cases. On payment pages for Pimoroni and Adafruit, the tool captured snapshots only before and after the sensitive fields were filled in, and did not record the card details while they were being entered.
Microsoft’s Privacy Commitments
Microsoft first introduced the Recall feature at its Surface and AI event in May 2024. However, it was pulled from test builds a month later due to privacy and security concerns. With the recent reintroduction, Microsoft has emphasized that:
- Snapshots captured by Recall remain on-device.
- The data is not shared with Microsoft or third-party servers.
- Snapshot data is not used for AI model training or for any other features.
These commitments have not been enough to quell concerns, as the latest findings highlight the risks the feature can pose.
A Step Forward or a Step Back?
While the Recall feature aims to make it easier for users to keep track of important details, its inability to consistently filter sensitive information raises serious privacy questions. The implications of these failures could be significant, especially as users grow increasingly wary of how AI handles their personal data.
Conclusion
Microsoft’s Recall feature represents a promising innovation in AI-powered convenience, but its rollout has been marred by recurring privacy issues. While the company assures users that their data is secure and stays on-device, instances of sensitive information being captured underscore the need for further refinement. As Microsoft continues to develop and enhance Recall, addressing these privacy concerns will be critical to regaining user trust.