Telegram, Privacy, and Mycroft

Originally published at: http://mycroft.ai/blog/telegram-privacy-and-mycroft/

If you follow Open Source news you may have heard recently about the protests in Russia in support of Telegram. We particularly enjoyed the paper airplanes.

What is Telegram and why are Russians protesting?

Telegram is an open source messaging app that claims to have been the first to use end-to-end encryption, back in 2013. It has also fully open-sourced its client code.

In March Telegram announced they had reached 200,000,000 users. Very shortly after, the Russian government banned the use of Telegram for not turning over the keys to decrypt data on the secure platform.

Pavel Durov, the Telegram founder, has said that encryption happens on the client side and that there is no universal key. For this reason he says access by the Russian government will never work.
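
For readers wondering what "client-side encryption with no universal key" means in practice, here is a minimal, hypothetical sketch using the PyNaCl library (my own illustration, not Telegram's actual MTProto protocol): each client holds its own private key, so a server that only relays ciphertext has no master key it could hand over.

```python
# Illustrative only: PyNaCl public-key "box" encryption, NOT Telegram's MTProto.
from nacl.public import PrivateKey, Box

# Each client generates a keypair; the private half never leaves the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only public keys are exchanged (a server may relay them).
alice_box = Box(alice_private, bob_private.public_key)
bob_box = Box(bob_private, alice_private.public_key)

ciphertext = alice_box.encrypt(b"meet at the fountain")
# A relaying server sees only this ciphertext; without a client's private key
# there is no "universal key" it could surrender to anyone.
print(bob_box.decrypt(ciphertext))  # b'meet at the fountain'
```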

It is also interesting to note that the Russian government has used Telegram widely.

After the ban, Telegram switched IPs, prompting Russian officials to go on a banning spree, blocking swathes of Google Cloud and Amazon Web Services, where Telegram had shifted some of its services. At the height of the ban Russia was blocking roughly 16 million IP addresses, causing access issues for a large number of sites Russians use daily.

To show support and solidarity for Open Source projects, we have added Telegram to our communications channels. If you would like to chat with us there we would love to see you.


Photo by Christian Wiediger on Unsplash


It's misleading to tout Telegram as open source. Only the clients are open source; the server is proprietary and the encryption has not been vetted.

Telegram’s client-side code is open-source software but the source code for recent versions is not always immediately published,[18] whereas its server-side code is closed-source and proprietary.[19] The service also provides APIs to independent developers. In March 2018, Telegram stated that it had 200 million monthly active users.[20] According to its CEO, as of April 2017, Telegram has more than 50% annual growth rate.[21]

Telegram’s security model has received notable criticism by cryptography experts. They criticized the general security model of permanently storing all contacts, messages and media together with their decryption keys on its servers by default and by not enabling end-to-end encryption for messages by default.[22][23][24] Pavel Durov has argued that this is because it helps to avoid third-party unsecure backups, and to allow users to access messages and files from any device.[25] Cryptography experts have furthermore criticized Telegram’s use of a custom-designed encryption protocol that has not been proven reliable and secure.
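
To make that criticism concrete, here is a toy sketch (my own illustration using the `cryptography` package's Fernet recipe, not Telegram's actual code) of the default "cloud chat" model the quote describes: the server stores messages together with the keys needed to decrypt them, so anyone with access to that storage can read everything, no matter how strong the cipher is.

```python
# Toy model of server-side ("cloud chat") storage -- not Telegram's real code.
from cryptography.fernet import Fernet

# The server generates and keeps the key...
server_key = Fernet.generate_key()

# ...and stores the encrypted message right next to it.
server_store = {
    "key": server_key,
    "message": Fernet(server_key).encrypt(b"hello from a cloud chat"),
}

# Anyone with access to the server's storage (operator, attacker, court order)
# can decrypt at will -- which is why this is not end-to-end encryption.
print(Fernet(server_store["key"]).decrypt(server_store["message"]))
```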

[Written early in the morning on not much sleep, so please excuse the bad writing. :sleepy:]

Telegram gets a lot of hype in the media and among FOSS enthusiasts who don’t know much about crypto(**). I’m a cryptography implementer, and as far as end-to-end confidentiality goes, I believe Telegram is a harmful influence, and that it’s particularly reckless to suggest using their system against nation-state adversaries. Those are technical judgments.

Confidentiality is extremely hard for a variety of reasons (you’re literally pitting yourself against adversaries from decades into the future, for one thing)—and it’s clear that they either don’t take the challenge seriously, or they’re so full of hubris that they think they know better than all the people who are advancing the state of the art, who are telling them that their protocol needs work.

Also, over the last 15-20 years or so, the state of the art has been advancing, and we’ve learned that most of our design intuitions from the 1990s and early 2000s were catastrophically wrong:

  • We thought we should sign messages before encrypting them, and in 2001 Hugo Krawczyk proved that only the exact opposite is guaranteed to be secure (see the sketch after this list).
  • We thought timing attacks wouldn’t work over a network—in 2003, David Brumley and Dan Boneh at Stanford published a paper, “Remote Timing Attacks are Practical”.
  • We thought we were good at designing hash functions, then Xiaoyun Wang and her team broke 5 big-name hashes in 10 years, including SHA-1. (Fun trivia: When she published MD5 collisions, she initially got the endianness wrong, and people couldn’t reproduce her work. When informed of this, she basically said “oops, hold on” and updated her paper with newly-generated collisions. She wasn’t using months of computations on supercomputers; MD5 can be broken on a laptop).
  • We thought compressing data before encrypting it would make things more secure; the CRIME attack taught us that it’s actually very dangerous if an attacker can control some of the input.
  • We thought encrypting without authenticating wouldn’t leak plaintext. It does.
  • We thought side-channel attacks weren’t a big deal. Now they’re everywhere—Spectre and Meltdown, the CPU design flaws in the news recently, are timing attacks.
  • We thought using lots of randomness in our designs was safer than trusting block ciphers. Decades later, nobody’s even broken 3DES (except for the generic problem that 64-bit blocks are too small). Block ciphers have held up extremely well (except for timing side-channels, which are being addressed in new designs like ChaCha20). Several real-world systems being broken by bad randomness has led to the push for more determinism.
  • We thought classical Diffie-Hellman was more trustworthy than elliptic curves because of “maturity”. Then we found out that the NSA had probably broken some standardized 1024-bit DH primes (i.e. entire cryptosystems that relied on them). Meanwhile, the IETF is standardizing Curve25519 because it’s the best thing out there.
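
To make a couple of these lessons concrete, here is a minimal sketch (my own illustration, not anyone's production design) of encrypt-then-MAC with a constant-time tag check, using Python's stdlib `hmac` module and AES-CTR from the `cryptography` package. In real code you'd normally reach for a vetted AEAD mode like AES-GCM or ChaCha20-Poly1305 rather than composing primitives by hand.

```python
# Sketch of encrypt-then-MAC with constant-time verification (illustrative only).
import hashlib
import hmac
import os

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    encryptor = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).encryptor()
    ciphertext = nonce + encryptor.update(plaintext) + encryptor.finalize()
    # The MAC is computed over the ciphertext (encrypt-then-MAC), so the
    # receiver can reject forgeries before touching any decryption code.
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def decrypt(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    ciphertext, tag = blob[:-32], blob[-32:]
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    # Constant-time comparison: a naive `==` can leak how many leading bytes
    # matched, which is exactly the kind of remote timing oracle noted above.
    if not hmac.compare_digest(tag, expected):
        raise ValueError("bad MAC -- refusing to decrypt")
    nonce, body = ciphertext[:16], ciphertext[16:]
    decryptor = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).decryptor()
    return decryptor.update(body) + decryptor.finalize()

enc_key, mac_key = os.urandom(32), os.urandom(32)
blob = encrypt_then_mac(enc_key, mac_key, b"attack at dawn")
assert decrypt(enc_key, mac_key, blob) == b"attack at dawn"
```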

The list goes on and on. I’ve been casually following the developments for a bit more than a decade. It’s become clear that the old wisdom of “old cryptography is better because it’s more mature & has had more review” just isn’t very useful right now, because it’s only true if the old cryptography survives review. Not many things from that era have survived review.

Since about 2010, when Eric Butler released Firesheep, and then 2013 when Edward Snowden made us all aware of global mass surveillance, the big wall separating academia and industry has come crashing down. Security is no longer considered merely an issue of “compliance”; we now know that there are real, competent, and motivated attackers out there who are nation-states or have the resources of nation-states.

Telegram’s problem isn’t one or two particular technical flaws, but their entire approach to information security in general, which naturally results in technical flaws. They’ve dedicated a whole “Advanced FAQ” page on their site to defending their design weaknesses, while the big names who matter—Google, Facebook, Mozilla, Apple, the Python community (and other language communities, but I know several pyca developers personally), and even Microsoft nowadays—just fix theirs.

Take a look at this (“Reactions to stages in the life cycle of cryptographic hash functions”): http://valerieaurora.org/hash.html

And compare it to this: https://core.telegram.org/techfaq

… and their reaction here: https://twitter.com/bascule/status/834902630585384961

Instead of joining the growing community of crypto implementers who are learning from the world’s experts on this stuff, they’ve isolated themselves and are hostile toward everyone who has any kind of crypto expertise.

I trust Telegram less than Google or Facebook, despite the mismatch of incentives, because unlike Telegram, Google has folks like Adam Langley who have really good judgment and are busy securing the Internet. Meanwhile, the Telegram developers seem to be trying to project some ridiculous image of infallibility, regardless of the cost to end-user privacy. It’s bad and they should feel bad.

It’s a pattern of behavior that in my professional judgment, has almost no chance of producing a secure system. There are too many little details and gotchas and new developments.

tl;dr - If you want to support an open-source end-to-end encrypted messaging service, support Signal. They’re more ethical, and have already done more for privacy of real human beings (hence their current focus on mobile phones) than Telegram ever will, at this rate.

Seeing Mycroft express support for Telegram is disappointing. I hope you’ll reconsider, given their shady ethics. If you must use them, please at least refrain from promoting their “encryption”. Assume they have no end-to-end encryption, and evaluate them on that basis.

–
(**) https://twitter.com/sarahjeong/status/955651919279722496


I was going to say the same as dlitz, but not in as much technical detail. The server is not Open Source, so nobody knows what’s happening there.
Signal is completely FOSS (server and client), and even if moxie made some strange choices, it’s right now the best communication choice imho.

I recently discovered Riot https://about.riot.im/ - it seems like a good approach, a bit like an open source Slack, as some say. Imho it’s a bit like Discord without the voice chat. It’s slowly growing and I’m thinking about setting up my own server.

Wire is another one that I’m more comfortable with than Telegram. Wire’s server and clients are open source, and they are working on making it self-hostable and federated.

@natrius I’ve heard good things about Mastodon.


Yes, I love Mastodon, but it isn’t a messaging platform. It’s more of a microblogging, Twitter-like platform.
