Our personal messages are just waiting to be made public

If a message isn’t end-to-end encrypted, you should assume it will leak at some point. People still treat text messages as if they were whispering secrets into a vault. They send confessions, complaints, flirtations, and outbursts of rage through platforms owned by trillion-dollar surveillance heavyweights – and apparently believe this data will vanish into a benevolent digital black hole. In fact, these messages sit on the servers of companies whose business model is built on knowing more about you than your own family does.


The illusion of privacy is kept alive by branding. Pretty icons, lock emojis, and reassuring statements about “security” and “protection” – these are marketing tools, not security guarantees. If a service doesn’t use end-to-end encryption, messages are visible to the company, its employees, and anyone with the right credentials or a court order. Platforms like Gmail or Slack are nothing more than filing cabinets with glass doors. Anyone who shares something remotely sensitive through them is playing a risky game – and the house always wins. These companies store messages in readable form, analyze them for advertising purposes, hand them over in litigation, and grant access to any number of internal teams under labels like “moderation” or “data analysis.”

Ignoring all this, people keep sending sensitive texts, trade secrets, and personal crises through channels that might as well carry an “available on request” warning label. And then they are shocked when something leaks: when a celebrity’s private messages appear on X, when a court case exposes a CEO’s DMs, when a disgruntled former employee posts internal chats.

End-to-end encryption is the last line of defense between privacy and the digital panopticon. Apps like Signal or SimpleX Chat actually keep conversations locked: inaccessible to everyone except the sender and the recipient. And this is exactly why governments and tech companies hate these tools. Their traffic cannot be observed, monetized, or manipulated. Once a message is sent, it is out of their reach – and that is unacceptable in an age of insatiable data greed.

“I have nothing to hide,” say people who have never read a privacy policy or considered that their inbox could one day become evidence. Privacy is not the same as secrecy. Under the banner of a “research project,” researchers at the Federal University of Minas Gerais collected over 2 billion Discord messages sent between 2015 and 2024 and published everything across the internet as a structured JSON file. According to the researchers, all of this was legally sound and fully ethical under Discord’s Terms of Service. They combed through 3,167 servers, analyzed 4.7 million users, and published a 118 GB database. Now academics can rummage through the remnants of strangers’ digital intimacy, looking for themes like “mental health” and “political radicalization.” “All data comes from groups that are explicitly considered public under Discord’s Terms of Service.” This sentence tries to justify what is in fact a violation of user expectations.

Yes, Discord has “public servers” – but no one expects their chats to end up in searchable research databases hosted by universities on another continent. The researchers say they have “anonymized” the data. A bold claim at a time when even an emoji plus a timestamp can be enough to identify someone.
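To see why that claim is so fragile, here is a minimal, purely hypothetical sketch – every name, hash, and timestamp below is invented. If any auxiliary source (a screenshot, another scrape) shares even two coarse fields with the “anonymized” dump, a trivial join maps pseudonyms back to people:

```python
# Hypothetical toy data: an "anonymized" dump where usernames were
# replaced by hashes, and a second, non-anonymized source that happens
# to overlap on (timestamp, emoji).
anonymized = [
    {"user_hash": "a3f1", "timestamp": "2023-06-01T23:14:02", "emoji": "🐍"},
    {"user_hash": "b7c9", "timestamp": "2023-06-01T23:14:02", "emoji": "🔥"},
    {"user_hash": "a3f1", "timestamp": "2023-06-02T00:01:45", "emoji": "🐍"},
]

auxiliary = [  # e.g. a public screenshot of the same reaction
    {"username": "midnight_coder", "timestamp": "2023-06-01T23:14:02", "emoji": "🐍"},
]

def reidentify(anon, aux):
    """Map pseudonymous hashes back to usernames via shared (timestamp, emoji) pairs."""
    keyed = {(r["timestamp"], r["emoji"]): r["username"] for r in aux}
    return {
        r["user_hash"]: keyed[(r["timestamp"], r["emoji"])]
        for r in anon
        if (r["timestamp"], r["emoji"]) in keyed
    }

print(reidentify(anonymized, auxiliary))  # → {'a3f1': 'midnight_coder'}
```

And once one hash is de-anonymized, every other message carrying that hash – here, the midnight message on June 2 as well – is attributed to the same person.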

While the academics were busy publishing their mega-dataset, another developer released “Searchcord” – a searchable Discord archive built on non-anonymized data.

This is how the data collection worked:

  • Discord’s “Discovery” feature was used to enumerate all public servers (over 31,000).
  • 10% of them were randomly selected.
  • Using the Discord API, the researchers collected years’ worth of conversations, memes, angry outbursts, and midnight crises.
  • No malware installed, nothing hacked. Just bulk data collection – entirely legal.

It is as if every handwritten note in a public library had been photocopied and then exhibited in a museum of “academic intentions.” The researchers say, “This will help us study disinformation.” Or, “You can use it to train chatbots.” Or, “You can use it to better understand harmful behavior.” The real scandal is that the platform was built in a way that made this kind of harvesting possible by design.

If a message isn’t end-to-end encrypted, it exists as a copy on someone else’s computer – available to a curious intern, a hacker, or a bored PhD student. Someone will look if they can. And most of the time, they can.

Translated and edited by L.Earth
