Did anyone expect otherwise? Telegram is a mostly unencrypted chat application; of course it's going to cooperate if local law enforcement comes knocking on their door with a warrant. If you don't want your chats to end up in the hands of law enforcement, then you should consider using an end-to-end-encrypted messenger application.
Signal will hand over your data too if the police show up, but they don't have any data to hand over.
In a sense, yes. In case you don't speak German, here's the relevant passage from the article [0], translated:
> That Telegram provides information about users to authorities at all marks at least a cautious turnaround in the course of the company, which was founded in 2013. For a long time, German investigators got no answers whatsoever when they wanted to know who was behind Telegram accounts spreading criminal content online. The operators still declare on their site: "To this day, we have disclosed 0 bytes of user data to third parties, including governments."
In a nutshell, this is considered a turning point because Telegram's official stance is (even today) that they don't share data, not even with any government. And now they apparently did. So yes, this is definitely news, and even a bit surprising.
The German government has even hinted that it will seek to ban Telegram from app stores if the company continues to refuse to comply with law enforcement [1].
This is something I find surprisingly difficult to get through to people at both a personal and professional level (as I often work in security-related projects).
People speak of whether they trust this or that entity. The reality is that it's an irrelevant discussion. Trusting an organization is always misplaced, no matter who they are. If you hand over data to an organization, you must immediately assume that data is compromised. Yes, that includes all SaaS and all cloud providers, for example. For any party that could see the data in unencrypted form, you need to assume they have it and can potentially abuse it at some point.
This has nothing to do with the ethics of whoever is running the given organization at the moment. There are many points in time at which various organizations truly do protect the data, and I thank them. But it's not a stable situation. Things can change. Less ethical management can come in during a reorg, and suddenly everything is fair game, and the consumer won't know until it's too late. And there is always the threat of governments demanding the data, at which point it doesn't matter how ethical the organization is, because there is nothing they can do.
For any data that could be even potentially sensitive, you must make sure it is either not sent through third parties at all, or encrypted with keys that only you control. Otherwise, assume it is compromised and sold to every bidder.
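A minimal sketch of what "keys that only you control" can look like in practice, assuming the Python "cryptography" package is available; the provider only ever sees ciphertext, and the key never leaves your side (the upload call is a placeholder, not a real API):

```python
# Minimal sketch, assuming the Python "cryptography" package.
# The key stays with you; only ciphertext ever reaches the provider.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store this yourself; never upload it
box = Fernet(key)

ciphertext = box.encrypt(b"anything you'd rather not have sold to every bidder")
# upload(ciphertext)                 # hypothetical call to whatever SaaS/cloud API you use

plaintext = box.decrypt(ciphertext)  # only holders of `key` can recover the data
```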
>> In a nutshell, this is considered a turning point because Telegram's official stance is (even today) that they don't share data, not even with any government.
If you care about your privacy, the "official stance" alone is close to worthless. That goes for governments and corporations alike; I think we've already learned that. Just like with security, you need defense in depth for privacy (i.e. hardware, software, and "official policy" as well).
Their stance has always been that they don't protect terrorism suspects, they even had a channel where they reported how many ISIS groups/channels got shut down.
If I'm a decent human being, I'd rather no company helps criminals do crime stuff.
The standards to turn over such information should not be decided by a private entity but by democratically enacted and enforced laws. Is enforcement of child porn prosecution "absolutely necessary"? Are death threats ok? Where's the line? I don't think a private company sitting in an unaccountable jurisdiction should make that decision. I trust German courts a lot more in that regard.
I'm inclined to agree, partly. If companies have the information, they should not have the last word about whether law enforcement gets access. That said, I do consider properly secure communication tools desirable and am very concerned about ongoing attempts to ban them.
The uncomfortable truth is that if a communication method isn't secure for child molesters and terrorists, it isn't really secure for anyone.
In the case of Telegram, they do require you to use a mobile phone number to sign up and use their service, mainly so they can be sure you don't abuse THEM. So any safety criminals get from law enforcement comes from Telegram refusing to give up information it has.
And this concerns an area of crime where the perpetrators don't very actively evade detection; there are other means of doing so. And in the case of Germany, openly and publicly criticizing the state is no problem as long as you don't propose violently overthrowing it.
> If I'm a decent human being, I'd rather no company helps criminals do crime stuff.
Does that include gas stations, supermarkets, utility companies, sporting goods stores, book stores, clothing stores? Each sells things that are used to aid in crimes. That doesn't mean the government should spy on everybody who visits them.
But different jurisdictions have different rules? For instance, are you OK with a country that outlaws homosexual acts firing hundreds of subpoenas at a company to be able to target its gay users? CSAM is also a wide spectrum offense, with many jurisdictions now banning cartoon images, CG images and pictures of dolls.
I'm not for total chaos. It's just that the world is a very complicated place.
It's less complicated if you don't do business in countries whose jurisdictions you don't trust. The European Union largely outlaws any such abuse by authorities.
Depending on tech companies to protect your data from authorities is a shitty strategy. At best it works the other way around. If not, you're screwed.
The vast majority of users of Telegram come from (and reside in) a country whose jurisdiction you probably wouldn't trust. Most of its developers, too (though they have relocated).
"Never changed"? What a ridiculous and stupid standard. Germany's constitution is one of the best at guarding individual rights, and as far as nations go, it has one of the best chances for staying that way. German courts are a much better instance for judging what is a reason to turn over personal information, than any private company ever could be, especially Telegram.
No, don't do business in Russia, Saudi Arabia or China, unless you plan on turning in your clients. Greed makes companies overlook these problems. They tend to eventually regret it, because they can't even make a decent profit when, for example, Russia does what it just did.
They collect everything, illegally and without any ethics.
Then they’re incompetent as well, so action doesn’t happen when it should.
Instead we get capricious, haphazard enforcement, and a dark future with all of this stored in a permanent record to be retrieved using the search tools of 2040 and weaponized when we later make nuisances of ourselves in response to future legislation.
I know, and I agree. MTProto v1 criticisms aside, the E2EE system Telegram uses is perfectly safe.
It's just disabled by default, unavailable in group chats or channels, and enabling it reduces usability (i.e. you can't use multiple devices to chat if you enable E2EE).
Telegram has the best UX of any chat app out there, in my opinion, so the lack of proper E2EE is simply disappointing. I don't really trust either, but I consider WhatsApp more secure than Telegram, despite Meta mining my metadata.
This makes no sense. There is nothing to prove that WhatsApp is really using the aforementioned Signal code. It's closed source, so it could be anything inside.
Your objection to a perfectly cromulent answer is just namecalling. People take apart closed-source software and find vulnerabilities, including cryptographic ones, all the time.
It's a closed source app. Translation: you and I don't know what it does. Take it apart all you want, but you're not going to find any backdoors or learn about how good the E2EE implementation is. Claiming otherwise is ridiculous.
> It's a closed source app. Translation: you and I don't know what it does.
This just isn't true. It's a claim trivially disproved in public by, say, gazillions of detailed P0 posts. Again, all you have is namecalling and confidently stated things with obvious counter-examples.
> Claiming otherwise is ridiculous.
Yours is an extraordinary claim that requires some evidence, never mind extraordinary evidence.
This is my point exactly. It doesn't matter as long as the app is closed source. You could do the world's most secure E2EE implementation, but then send a copy of all keys to Facebook servers.
It's trivially easy to encrypt the traffic and then send a copy of the private key to Facebook's servers. You would not be able to decrypt it, but they would.
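To make that concrete, here's a toy Python sketch of the concern, not anyone's actual implementation; every function name below is a hypothetical placeholder:

```python
# Toy sketch: nothing about encryption itself stops a client from quietly
# copying the key elsewhere. All transport functions here are made up.
from cryptography.fernet import Fernet

def deliver_to_recipient(blob: bytes) -> None:
    ...  # hypothetical transport to the other end of the "E2EE" chat

def phone_home(blob: bytes) -> None:
    ...  # hypothetical side channel back to the vendor's servers

def send_message(plaintext: bytes) -> None:
    key = Fernet.generate_key()
    deliver_to_recipient(Fernet(key).encrypt(plaintext))  # looks end-to-end encrypted
    phone_home(key)  # ...but a closed client could ship the key elsewhere too
```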
If your message in WhatsApp gets reported, approximately 1,000 Facebook/Meta employees will be able to read the last 5 messages you sent in WhatsApp.
What? It's not E2EE by default? This is honestly disappointing. So most of their privacy-focused stance is just a marketing ploy? A lot of my peers now use Telegram to move away from the Facebook ecosystem, and now you're telling me they're just as bad?
Everyone will likely hand over whatever they can... which means the most decentralized options which leave as little to hand over as possible are best.
Both are important. When governments mandate back doors, such as with the EU's Chat Control or the US EARN IT Act, centralized services can be targeted much more easily than the thousands of XMPP and Matrix servers running around the world.
What distinction do you mean? Federation allows for interoperable decentralization. Without federation, we would have thousands of chat/mail/social media servers that can't talk to each other. Some may choose not to federate, but most want to federate to create a useful protocol.
I always believed decentralized meant each user doubled as a server and required no external setup (example: Scuttlebutt), whereas federated meant a plurality of servers whose users communicate with each other but with no central authority (email, Mastodon, Matrix, etc.). However, reading some peer-to-peer literature like that 1,500-page behemoth of a book, "Handbook of Peer-to-Peer Networking", it seems the terms are used relatively interchangeably.
> Telegram to temporarily refuse data requests from Hong Kong courts amid security law
The headline you linked makes it clear it was temporary, which means they do of course cooperate in normal situations. Otherwise they would be blocked everywhere; you cannot maintain a service such as a chat application without cooperating with governments.
There is no story here, Telegram shared information with BKA in cases of terrorism and child abuse, as every service operating in Germany would and should do.
It’s easy to refuse requests from Chinese law enforcement if you don’t have offices, employees, or assets in China. Meanwhile, European countries recognize the legitimacy of each other’s court systems and will enforce judgments and orders across borders.
It really doesn't matter; in general, you can never be sure that any app that stores any data won't eventually meet the requests of officials or people with guns.
The case of Telegram in particular was quite simple: since the SEC filed a complaint, we could clearly see who the main investors are. It's not necessarily about the country or the investors either.
You can only choose where things are stored and expect the company to act according to local laws (e.g. Protonmail doing its Proton things under Swiss jurisdiction).
And, I guess, the thing we have to teach people is something vague and unclear like a post-privacy mindset: how one has to operate knowing that pigeon post can always be spoofed, no matter how encrypted the conversation is.
These days, any popular messaging app that won't cooperate with local law (by choice, or by design in the case of E2EE) would just be banned or removed from app stores in Germany or most other countries.
Strong encryption for a wide audience cannot exist today.
They also have the user's phone number and, in most cases, their phone's address book.
It's not really about breaking E2E encryption but rather "reveal the identities of the following users, please". That could be due to all sorts of illegal activities (e.g. hate speech, sale of banned substances, terrorism, child abuse, etc.).
Signal cannot in fact provide your address book to German (or any other) authorities. The whole point of Signal's design, and the reason it's less featureful than things like Telegram, is that it's designed not to collect serverside metadata about who's talking to who.
The client has access to the address book, and it is hard to verify what the client does in reality. I receive client updates every other day, and who knows what they bring with them.
I haven't tried to compare a local Android build to the published version myself, so can't directly confirm the accuracy of this document.
Either way, I agree that a released build can slip by unnoticed by most users. This is not a problem unique to Signal though.
At least with Signal you have the option to verify a build before updating. You can also build and run the entirely open source client yourself, which makes verification redundant.
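As a rough illustration of the simplest piece of that comparison, here is a sketch that checks whether a locally built APK and the distributed one are byte-identical. File names are placeholders, and real reproducible-build verification for Android also has to account for signing metadata, so identical digests are sufficient but not strictly necessary:

```python
# Sketch of the comparison step only; file names are placeholders.
import hashlib

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

local = sha256_of("app-built-from-source.apk")
published = sha256_of("app-from-the-store.apk")
print("match" if local == published else "differs -- inspect further")
```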
What you write is, I think, the main point in this case: there is no hint in the article that Telegram is providing access to the communications themselves, but rather data about the account holders which might let the authorities determine their identity. The communications themselves are already in the possession of the authorities, either because they were direct recipients as members of the groups the messages were sent to, or because a recipient provided them to the officials.
Mate, I love Telegram, but that's just plain incorrect.
Telegram has transport layer encryption, like literally everything else in 2022. For all intents and purposes, Telegram can read and access the majority of your conversations on it.
This isn't a super big deal, because Telegram is aiming to be a social media platform rather than an encrypted comms platform, and E2EE on groups over a certain size is pretty useless.
I think Telegram can still improve by making private messages E2EE by default.
> Messages in iCloud, which keeps a user’s entire message history updated and available on all devices, also uses CloudKit end-to-end encryption with a CloudKit service key protected by iCloud Keychain syncing. If the user has enabled iCloud Backup, the CloudKit service key used for the Messages in iCloud container is also backed up to iCloud to allow the user to recover their messages, even if they have lost access to iCloud Keychain and their trusted devices. This iCloud service key is rolled whenever the user turns off iCloud Backup.
If you are in this ecosystem, and feel your potential loss from disclosure is greater than your potential loss by losing/damaging your device, go turn off iCloud Backup — and make sure your keychain is secured to your needs.
HQ location can be quite irrelevant. Legal intercept laws can be quite old-fashioned and might make the case that two German citizens having a conversation while on German soil makes the conversation fall under German jurisdiction. There can be a surprisingly large number of ways jurisdiction can be determined for all parties involved, and without an analysis of German law I would not readily make assumptions as to whether they have a legal basis to talk to Signal or not. And if they do, I'm sure Signal is a law-abiding company.
That's what the other user said and it is still incorrect. [0] People either don't read the basic FAQ or conflate E2EE to being the only encryption in the world, which is ridiculous.
Encryption in transit is assumed, and rightfully so. That still means that telegram gets full access to the plaintext and as such is able to give that information to anyone, and do with it as they wish.
I suppose there are some people out there who think "unencrypted" here means everyone can listen in, but certainly not the Hacker News crowd.
I read the FAQ and even skimmed the MTProto 2.0 docs, but from where I stand this Server-Client encryption sounds like encryption in transit where the server still has the ability to decrypt.
This, from a privacy against law enforcement perspective (which is what the article and comments are about), is more or less the same as no encryption.
Edit: s/transport/transit/, add "perspective" to the last paragraph.
It’s true that Telegram only uses encryption for data in transit for normal person-to-person chats and group chats. Data at rest is stored in a way the server can read. That’s one of the things that makes Telegram search so fast.
The encryption part [1] is covered in the FAQ, along with more details.
Also see the question and answer on “Do you process data requests?” [2]
Telegram has a feature called secret chats, which are only person-to-person. That uses end-to-end encryption.
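For anyone skimming, here is a toy sketch of the difference being discussed, with Fernet from the Python "cryptography" package standing in for the real protocols; names and key handling are deliberately simplified assumptions, not Telegram's or Signal's actual code:

```python
# Toy contrast between server-readable chats and end-to-end encrypted chats.
from cryptography.fernet import Fernet

message = b"meet at six"

# Cloud-chat model (default Telegram chats): the service holds the key,
# so it can decrypt, store, index and search the plaintext.
service_key = Fernet.generate_key()
stored = Fernet(service_key).encrypt(message)
service_can_read = Fernet(service_key).decrypt(stored)

# End-to-end model (Secret Chats, Signal): only the two endpoints hold the key;
# the service relays ciphertext it cannot open.
endpoint_key = Fernet.generate_key()   # agreed between the two clients only
relayed = Fernet(endpoint_key).encrypt(message)
recipient_reads = Fernet(endpoint_key).decrypt(relayed)
```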
I'm aware of Secret Chats, but there's extra friction to enable them, and I suspect most Telegram users are not aware of them at all - or are unwilling to use them for almost everything.
Also they should now update that FAQ answer where they say:
> To this day, we have disclosed 0 bytes of user data to third parties, including governments.
In fact, if the OP is indeed true, they should probably update the entire answer, since it's misleading at best and an outright lie at worst.
You don’t understand what it means.
Server-side encryption does not matter from the user's perspective. Telegram has all the keys and can access all the data, so there is no real privacy.
For E2EE, you need to open a separate one-on-one chat, which is optional, not the default.
And when it comes to group chats or channels, none of them supports E2EE.
Server-side encryption = encryption. The fact that you don't find it sufficient, and other opinions, are irrelevant when it comes to people just plain wrongly stating things, such as "unencrypted" for clearly encrypted data.
It's like going outside in the rain, getting wet and saying "Well, it's not actually raining, I didn't get a pint of water in my boots."
> Server-side encryption = encryption. The fact that you don't find it sufficient, and other opinions, are irrelevant when it comes to people just plain wrongly stating things, such as "unencrypted" for clearly encrypted data.
We have clearly been talking about E2EE (end-to-end encryption), and server-side encryption is not that. E2EE means the data is encrypted between you and the message's recipient. The server is the middleman, which should not have access.
Almost everything in the world today is already encrypted with TLS during transmission, and regulations require server-side encryption. That is not even our main interest to talk about anymore; we are past that.
The main issue on the original post is the lack of E2EE.
Look up Grice's Maxims sometime. Conversations have context. The context here is a comment section for an article about a nation state requesting chats from Telegram. The only relevant kind of encryption that would be able to prevent this is end-to-end encryption; in such a context, 'Telegram is unencrypted' is easily and near-universally understood to refer to E2E encryption, even if absent such context the meaning would be less clear.
A better rain analogy would be someone saying 'I'd like to go for a smoke, is it raining', and you reply 'yes' because there is somewhere in the world where it is raining (just not there). You would be technically correct, but in the context of the question, the person was clearly interested in whether it was raining _there_.
But that's not "real" encryption. You're just abusing language — as most are in this thread — to get a result you want.
If you want to discuss E2EE, do so but it does not make it more "real" than other encryption.
Unencrypted is false. Not E2EE is true. Most use the former to wage war against an app they don't like because they prefer an app like Signal that satisfies their desirable qualities. Moxie actually started this trend and it is despicable. I'd say the exact same thing if Durov started referring to E2EE as "pedo-encryption" or anything else that distorts meaning.
Useless encryption is the same as no encryption.
If you put the key next to the lock, it's not locked.
It's an abuse of language to call that encryption because if you say encryption you imply security.
But this is not secure, and if it's not secure the encryption is useless, because security is the reason for encryption.
Encryption is not used for the sake of encryption but to protect the content of a message from unwanted access.
> Encryption is not used for the sake of encryption but to protect the content of a message from unwanted access.
Yes, that is what Telegram is doing. It may not be protecting the contents from whom you want it protected from (everyone but you and the message recipient), but it does protect the contents from other (notice I did not say all) adversaries Telegram and its users don't want accessing them.
It is still encrypted, so please use correct language and do not weaponize words for your own designs.
The context doesn't change the definition of encryption.
> It is more likely that you are trying to weaponize the words for your own designs.
Please point to where I have weaponized a word because on its face that accusation doesn't make any sense. I have not decided encryption means unencrypted. I have doggedly insisted words be used appropriately and even went so far as to give an example of mischaracterization of E2EE where I would call someone out.
If we go by definitions, it is not encrypted.
Ideally, encryption means the process of encoding information so that only authorized parties can understand it.
During the transport of the information to the target recipient, the data in this case is in plaintext at some point on Telegram's server, and therefore it is not encrypted for the whole duration, going against the idea of transferring or holding information in ciphertext form so that only authorized parties can read it.
If we consider Telegram the targeted party, then it would be encrypted, since the data is transferred and held in ciphertext format for the whole process. However, Telegram is not the target, and the encryption is removed in the middle of the process.
> Please point to where I have weaponized a word because on its face that accusation doesn't make any sense. I have not decided encryption means unencrypted. I have doggedly insisted words be used appropriately and even went so far as to give an example of mischaracterization of E2EE where I would call someone out.
You brought it up in the first place with a twisted definition.
From Wikipedia which you quoted bits from: "In cryptography, encryption is the process of encoding information. This process converts the original representation of the information, known as plaintext, into an alternative form known as ciphertext. Ideally, only authorized parties can decipher a ciphertext back to plaintext and access the original information."
> You brought it up in the first place with a twisted definition.
I did no such thing. You appear to be confusing idealism with the definition of encryption.
In any case we already have words for transport encryption, encryption at rest, and end to end encryption when referring to modes of encrypted data. Those are sufficient to cover the spectrum of encryption which exists. Calling encryption of one mode "unencrypted" which is not your ideal mode of encryption is disingenuous at best.
> conflate E2EE to being the only encryption in the world
It is the only relevant one. Nobody who cares about protected messages would be satisfied with untrustworthy encryption.
Sure, technically even a messenger using Caesar cipher is encrypted, but most people expect more than a ticked checkbox.
No real user cares about what technically still counts as encryption, just like nobody outside of biology cares whether walnuts are actually nuts.
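For example, a Caesar cipher is "technically encryption" in a few lines of Python; it turns plaintext into ciphertext, so it ticks the checkbox, while protecting essentially nothing:

```python
# "Technically encryption": plaintext becomes ciphertext, and anyone can undo it.
def caesar(text: str, shift: int) -> str:
    return "".join(
        chr((ord(c) - ord("a") + shift) % 26 + ord("a")) if c.islower() else c
        for c in text
    )

ciphertext = caesar("attack at dawn", 3)   # 'dwwdfn dw gdzq'
recovered = caesar(ciphertext, -3)         # trivially reversed by any onlooker
```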
Hop-by-hop encryption is practically useless in a secure messaging setting, and people shouldn't take the "TLS counts as encryption" argument seriously. But it's good Telegram advocates keep making it, because it's an easy way to sum up their security posture.
Not GP, but I think your comment would be more meaningful if you elaborated on “people” (like which people you’re referring to). Telegram markets itself as a secure messenger, and its CEO has written many times about WhatsApp being worse for security and privacy. I don’t think a non-tech person can differentiate well between these.
Telegram is outright lying, of course. I can't remember whether WhatsApp still uses E2E encryption by default or not. If not, they are equivalent, but Telegram isn't better in any meaningful way.
Nah, when someone calls it an unencrypted messenger, one can assume they mean it's unencrypted on the server, as in-transit encryption is ubiquitous and thus a meaningless signifier.
Yes it can. If anyone reads "encrypted messenger" they're assuming only they and the intended recipients can decrypt it.
Rather, this is more of a debate of what the layman expects, and frustration with misleading marketing. A great example of this is the whole Zoom debacle; they claimed it was encrypted, people assumed it was E2EE, and got a lot of blowback for that to the point that they ended up implementing E2EE.
Another great example: a few of my friends were using Telegram for a while, and thought it was E2EE until I pointed out that only their "Secret Chat" feature is E2EE.
Even if that were the case, I'd still agree with OP's wording that it's a mostly unencrypted chat. It's encrypted in transit for the milliseconds it takes to reach the server. Once on the server, a third party has access to the plaintext until the end of time. It's a minimally encrypted chat.
And even if the wording wasn't precise enough, context still matters more in this case. I'm sure everyone here knew what was meant, despite their familiarity with cryptography. Telegram claims your messages are "heavily encrypted", which is just false aside from their very limited Secret Chat feature.
HN prefers substantive discussion, not nitpicking over semantics.
> is a mostly unencrypted … of course it's going to cooperate if local law enforcement comes knocking on their door with a warrant
I don’t see how these are connected. All messengers will hand over all the metadata they have in order to comply. The chat groups in focus are mostly public groups; you don’t have to play James Bond to read what’s there. LEAs are arriving at the scene from that vector, not the other way round. “The NGO CeMAS monitors 3,000 German-language channels & groups for 'disinformation, antisemitism, and right-wing extremism'” - it’s literally in the tweet, man.
Metadata logging is unrelated to encryption; I'm not sure what the sensation is without comparing what messengers would actually have on hand in case of a warrant, minus publicly accessible info.
It's not end-to-end encrypted. Of course the communication between the client app and the server is encrypted using TLS.
But this allows Telegram itself to see the content of the conversation when it arrives on their servers. This has piqued the interest of LEAs, who want continuous, real-time access to that information.