Windows AI screenshot feature 'Recall': A privacy concern?

Recall stores data in a plain-text database that could be vulnerable to malware attacks
An undated image displaying the Copilot logo. — Microsoft

Microsoft's new AI-powered feature, Recall, has raised eyebrows among privacy advocates. Recall uses on-device AI models to take screenshots of everything you do on your computer, letting you find and retrieve anything in seconds. Although Microsoft promises a secure and private experience, some experts are sceptical.

The feature stores its data in a plain-text database that can be vulnerable to malware attacks, raising fears that Recall could become a treasure trove for hackers and cybercriminals. "It's a privacy nightmare waiting to happen," says one cybersecurity expert.
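
To see why an unencrypted, plain-text store is so worrying, consider what any program running under the user's own account could do with it. The sketch below is illustrative only: the database path, table name, and column names are assumptions made for this example, although researchers have reported that Recall's store is an ordinary SQLite file kept under the user's AppData folder.

```python
import sqlite3
from pathlib import Path

# Hypothetical location and schema, for illustration only. The point is
# that a plain SQLite file inherits ordinary file permissions: any process
# running as the logged-in user can open and read it.
DB_PATH = Path.home() / "AppData" / "Local" / "CoreAIPlatform.00" / "ukg.db"

def dump_captured_text(db_path: Path) -> None:
    """Read OCR'd screen text straight out of the unencrypted database."""
    conn = sqlite3.connect(db_path)
    try:
        # 'WindowCapture', 'AppName', and 'OcrText' are assumed names.
        for app, text in conn.execute(
            "SELECT AppName, OcrText FROM WindowCapture LIMIT 10"
        ):
            print(f"[{app}] {text[:80]}")
    finally:
        conn.close()

if __name__ == "__main__":
    dump_captured_text(DB_PATH)
```

No exploit or privilege escalation is needed in this scenario; malware that simply copies the file off the machine would take the captured text with it.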

Microsoft maintains that Recall is an optional experience with built-in privacy controls. However, some argue that the feature's setup process and lack of content moderation raise concerns. Another expert warned that "Recall could potentially scrape sensitive information, such as passwords or financial data, without users realising it."

As Microsoft prepares to launch Recall, the company must address these privacy concerns and ensure the feature does not become a liability for users. The launch comes at a time when data privacy is under intense scrutiny, and Microsoft will need to be transparent about how Recall collects, stores, and uses user data.

Some experts suggest that Microsoft implement additional security measures, such as end-to-end encryption and secure data storage; others want users to have more control over their data, including the ability to opt out of Recall entirely.
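
What encryption at rest could look like in principle is easy to sketch, even though Microsoft has not published Recall's internals. The example below uses the third-party Python cryptography package and a freshly generated key purely for illustration; in a real design the key would be held in a hardware-backed store (for example, behind the TPM or Windows Hello) rather than alongside the data it protects.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Illustrative only: a production system would fetch the key from a
# hardware-backed secret store, never generate or keep it next to the data.
key = Fernet.generate_key()
cipher = Fernet(key)

snippet = "OCR'd screen text: account 1234-5678, balance ..."
ciphertext = cipher.encrypt(snippet.encode("utf-8"))

# Without the key, the stored blob is opaque to malware that copies the
# database file off the machine.
print(ciphertext[:40], b"...")
print(cipher.decrypt(ciphertext).decode("utf-8"))
```

Encrypting the captured text before it ever touches disk would mean that stealing the database file alone yields nothing readable, shifting the attacker's problem to extracting the key as well.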