AI Could Undermine Private Messaging Apps' Encryption

Artificial intelligence, a lack of awareness of data privacy, and regulatory pressures are among the biggest threats to the future of private messaging, say Alex Linton and Chris McCabe, executives from the decentralized messaging app Session.

The EU’s efforts to mandate the scanning of private messages through its Chat Control legislation have been heavily criticized by privacy advocates, but Linton, president of the Session Technology Foundation, told Cointelegraph that AI is another front on which pushback is needed.

AI’s capacity to analyze information on a device and store that data creates “huge privacy issues, huge security issues,” and the ability to communicate privately could basically be rendered “impossible to do on an average mobile phone or an average computer,” Linton said.

“If it’s integrated at the operating system level or higher, it might also be able to completely bypass the encryption on your messaging app, that information could be fed off to a black box AI, and then from there, God knows what happens to it,” he added. 

“It’s important that we push back against this type of deep integration of AI into all of our devices, because at that point, you just don’t know what is happening on your device anymore.”

Linton said the problem can often be exacerbated when lawmakers take advice on addressing these privacy concerns from the tech giants who are responsible for pushing the technology onto users in the first place.

How your online data is used 

McCabe, Session’s co-founder, said that many people are unaware of how their online data is stored and used, as well as the dangers of mass data collection by big tech companies.


ChatGPT creator OpenAI disclosed last month that a third-party data analytics provider was breached by an attacker, exposing some of its user data, which it warned could be used for phishing or social engineering attacks.

A now-deactivated feature of the chatbot was also found to be sharing chat histories on the open web.

“A lot of people are unconscious of what’s going on with their data, what you can actually do with someone’s data, and how much money you can make off that,” McCabe said.

He added that data can be used to “manipulate people through things like advertising, or doing things they don’t even realize they do or don’t want to do based on their data.”

Linton added that raising awareness of privacy as an issue and helping people understand the available tools is a key part of their work.

“There is a lot of pressure if you’re in the business of building encrypted messengers or making encrypted tools in general,” Linton said, pointing to regulations proposed or enacted in many jurisdictions. “There’s a lot of negative media attention that can come with it.”

“The literal people working on this technology feel that pressure, so it’s important for the general public to understand these tools are trying to help. They’re trying to safeguard your information. They’re trying to make the online space a better place.”

Part-time tech nerds to full-time privacy advocates 

McCabe said the idea for Session was born from a desire to use decentralized technology in a meaningful way and to combat privacy-related issues.

He worked as an electrician and was “a part-time tech nerd” in his spare time, but being made redundant opened the door to going “all in on Web3,” and he started building Session in 2018.