Security experts say Microsoft's Recall feature is spyware and an obvious target for cybercriminals.

When Microsoft announced its new Recall feature at its annual developer conference on Monday, the news made waves beyond the AI industry. Cybersecurity and privacy experts also took note and expressed concern.

In the presentation, Microsoft said the new Copilot-enabled computers will be able to remember everything you see on your screen, including emails, websites, and applications, and let you find it later through AI-indexed snapshots stored on the device. Reaction to the announcement was mixed, with some security experts calling the feature spyware and a natural target for cybercriminals.

“This is a company that wants to record literally everything you do on your computer,” Geometric Intelligence founder and CEO Gary Marcus wrote on Twitter. “If you don’t think Microsoft Recall, local or otherwise, will be one of the biggest cyber targets in history, you’re not paying attention.”

“I’m really glad this helped me remember why I don’t use Windows: disabling all the ‘smart’ features it adds,” wrote Linus Tech Tips writer Emily Young.

“Back in the day, we called this spyware,” wrote Molly White, a software engineer, cryptocurrency researcher, and critic.

Recall captures everything that appears on your screen, including passwords, unless the app or form obscures them automatically.

Cybersecurity expert Katelyn Bowden told Decrypt that while it’s a simple premise, implementing Recall requires a lot of caution.

“Recall appears to work like a traditional search engine with an expanded history range,” Bowden said, noting that Recall only works on PCs with certain hardware configurations. “Your PC already has a browser history and a file content index, and this is functionally no different. And unlike many files, Microsoft says the Recall data is encrypted.”

“If Microsoft offloads training set processing or starts collecting data for use in recommendation engines or off-machine models, user privacy could be violated,” she said.

Bowden is a member of the hacker collective known as the Cult of the Dead Cow and is also the chief marketing officer of the open source privacy-focused Veilid Project. She said more transparency about how AI tools are used is essential.

“I always feel more comfortable when companies developing AI products are transparent about what datasets were used to train their software,” Bowden said. “I am concerned about Microsoft’s lack of transparency surrounding this. If people don’t know what data was used to train a model, they shouldn’t be submitting their data to it.”

While OpenAI, Google, and Microsoft are working hard to bring generative AI products to market, several projects and groups are providing decentralized, open source alternatives, including Venice AI, FLock, PolkaBot AI, and the Superintelligence Alliance.

Ethereum co-founder Vitalik Buterin wrote this morning that open source AI is the best way to avoid a future where “most human thoughts are read and arbitrated by a few central servers controlled by a few people.”

“People should assume that everything they write to OpenAI will be passed on to the company, which can keep it forever,” Venice AI founder and CEO Erik Voorhees previously told Decrypt. “The only way to solve this problem is to use a service where the information never moves to a central repository in the first place.”

Edited by Ryan Ozawa.
