this post was submitted on 30 Aug 2024
170 points (80.8% liked)

Pavel Durov's arrest suggests that the law enforcement dragnet is being widened from private financial transactions to private speech.

The arrest of the Telegram CEO Pavel Durov in France this week is extremely significant. It confirms that we are deep into the second crypto war, where governments are systematically seeking to prosecute developers of digital encryption tools because encryption frustrates state surveillance and control. While the first crypto war in the 1990s was led by the United States, this one is led jointly by the European Union — now its own regulatory superpower.

Durov, a former Russian, now French citizen, was arrested in Paris on Saturday, and has now been indicted. You can read the French accusations here. They include complicity in drug possession and sale, fraud, child pornography and money laundering. These are extremely serious crimes — but note that the charge is complicity, not participation. The meaning of that word “complicity” seems to be revealed by the last three charges: Telegram has been providing users a “cryptology tool” unauthorised by French regulators.

[–] einkorn@feddit.org 94 points 2 months ago (3 children)

Well, except Telegram isn't a good tool for privacy.

There is no E2EE by default. End-to-end encryption ("secret chats") is only available for 1:1 chats and has to be enabled manually. Telegram's server-side code is closed, so there is no way to verify how messages are actually handled. Telegram is also able to block channels from their end, so there is no privacy from their end either.

[–] Libb@jlai.lu 11 points 2 months ago (2 children)

Well, except Telegram isn’t a good tool for privacy.

That's not the point. The hunting down of tools and their creators (and of our right to privacy) is the issue here. At least, imho.

[–] Rose@lemmy.zip 46 points 2 months ago* (last edited 2 months ago) (2 children)

It has nothing to do with privacy. Telegram is an old-school social network in that it doesn't even require you to register to view the content pages. It's also a social network taken to the extreme of free-speech absolutism: it doesn't mind people talking openly about every kind of crime, or using its tools to make the related services easier to obtain. All that with no encryption at all.

[–] istanbullu@lemmy.ml -1 points 2 months ago (2 children)

Free speech is good. Government-regulated speech is bad.

[–] pupbiru@aussie.zone 8 points 2 months ago (1 children)

free speech can be good. free speech can also be bad. overall it's more good than bad; however, society seems to agree that free speech has limits - you can't defame someone, for example

free speech absolutism is fucking dumb; just like most other absolutist stances

this also isn’t even about free speech - this is about someone having access to information requested by investigators to solve crimes, and then refusing to give that information

[–] istanbullu@lemmy.ml -1 points 2 months ago (2 children)

This is pure nonsense.

Western governments hate Telegram because, until now, Telegram didn't cooperate with Western intelligence services the way American social media companies do. Everything on Meta or Google gets fed to the NSA, but Telegram has been uncooperative.

This will likely change after Durov's arrest, but it was nice while it lasted.

[–] pupbiru@aussie.zone 2 points 2 months ago (1 children)

we don’t disagree about that: governments don’t like that telegram doesn’t cooperate; that’s not in dispute

where the disagreement comes in is the part after that. telegram (and indeed meta, google, etc) have that data at their disposal. when served with a legal notice to provide information to authorities or shut down illegal behaviour on their platforms, they comply - sometimes that's a bad thing if the government is overreaching, but sometimes it's also a good thing (in the case of CSAM and other serious crimes)

there are plenty of clear cut examples of where telegram should shut down channels - CSAM etc… that’s what this arrest was about; the rest is academic

[–] istanbullu@lemmy.ml 1 points 2 months ago (1 children)

there are plenty of clear cut examples of where telegram should shut down channels - CSAM etc… that’s what this arrest was about; the rest is academic

Was it? The French authorities did not provide any convincing evidence, just accusations.

[–] pupbiru@aussie.zone 0 points 2 months ago (1 children)

you think they’re going to link to still available (that’s the point - they’re still available) sources of CSAM?

if that’s your burden of proof then buddy i’m sorry to say there’s no way anyone’s going to convince you, and that’s not a good thing

[–] istanbullu@lemmy.ml 0 points 2 months ago (1 children)

This is the standard excuse for authoritarian governments. Use a crime category no one can object to.

[–] pupbiru@aussie.zone 0 points 2 months ago (1 children)

and this is called the slippery slope fallacy and is either a flaw in your logic or a way of arguing in bad faith. either way, it’s just fearmongering. if that’s all you’ve got then i have nothing more to say

https://en.wikipedia.org/wiki/Slippery_slope

[–] istanbullu@lemmy.ml 0 points 2 months ago

You are the one making up a fantasy scenario to satisfy your authoritarian urges.

[–] octopus_ink@lemmy.ml 0 points 2 months ago (1 children)

This will likely change after Durov’s arrest, but it was nice while it lasted.

Why use a tool that relies on the goodwill of the operator to secure your privacy? It's foolish in the first place.

The operator of that tool tomorrow may not be the operator of today, and the operator of today can be compromised by blackmail, legally compelled (see OP), or physically compelled to break that trust.

ANYONE who understood how Telegram works and still felt it was a tool for privacy doesn't really understand privacy in the digital age.

Quoting @possiblylinux127@lemmy.zip :

Other encrypted platforms: we have no data so we can’t turn over data

Telegram: we collect it all. No you can’t know who is posting child abuse content

And frankly, if they have knowledge of who is sharing CSAM, it's entirely ethical for them to be compelled to share it.

But what about when it's who is questioning their sexuality or gender? Or who is organizing a protest in a country that puts down protests and dissent violently? Or.. Or... Or.... There are so many examples where privacy IS important AND ethical, but in zero of those does it make sense to rely on the goodwill of the operator to safeguard that privacy.

[–] istanbullu@lemmy.ml -1 points 2 months ago (3 children)

ANYONE who understood how Telegram works and still felt it was a tool for privacy doesn't really understand privacy in the digital age.

Telegram is the most realistic alternative for breaking Meta's monopoly. You might like Signal very much, but nobody uses it and the user experience is horrible.

[–] pupbiru@aussie.zone 1 points 2 months ago

if meta's monopoly is literally the only thing you care about then maybe, but replacing a terrible platform with another platform that lacks privacy protections is not much of an upgrade

[–] octopus_ink@lemmy.ml 0 points 2 months ago* (last edited 2 months ago)

Telegram is the most realistic alternative for breaking Meta's monopoly. You might like Signal very much, but nobody uses it and the user experience is horrible.

Joke's on you: I use nothing from Meta, nor Signal, nor Telegram. My comment had nothing whatsoever to do with what I like or not.

[–] possiblylinux127@lemmy.zip 1 points 2 months ago

That apparently applies to child abuse and CSAM

[–] Dark_Arc@social.packetloss.gg -5 points 2 months ago (1 children)

Questionable interpretation. Privacy doesn't mean mathematically proven privacy. A changing booth in a store provides privacy, but it's only private because the store owner agreed not to monitor it (and in many cases is required by law not to).

Effectively what you and the original commenter are saying (collectively) is that mathematically proven privacy is the only privacy that matters on the Internet, and that operators who can't mathematically prove privacy should just do whatever government officials ask of them.

We only have the French government's word to go on right now. Maybe Telegram's refusals are totally unreasonable, but maybe they're not.

A smarter route probably would've been to fight through the court system in France on a case-by-case basis rather than ignore prosecutors (assuming the French narrative is the whole story). Still, I think this is all murkier than you'd like to think.

[–] Rose@lemmy.zip 5 points 2 months ago* (last edited 2 months ago) (1 children)

It's a street, not a changing booth. Also, I'm familiar with every charge against Durov and I personally have seen the illegal content I talked about. If it's so easily accessible to the public and persists for years, it has nothing to do with privacy and there is no moderation - though Durov's own words also underscore the latter.

[–] Dark_Arc@social.packetloss.gg 2 points 2 months ago (3 children)

Who said it's a street? What makes it a street?

personally have seen the illegal content I talked about.

Did you seek it out? Neither I nor anyone I know personally has ever encountered anything like what was described on that platform, and I've been on it for years.

Was it the same "channel" or "group chat" that persisted for years?

What gives them the right or responsibility to moderate a group chat or channel more than, say, Signal or Threema? Just because their technical back end lets them?

I mean, by that argument Signal could do client-side scanning on everything (that's enforcement at the platform level that fits their technical limitations). Is that where we're at? "If you can figure out how to violate privacy in the name of looking for illegal content, you should."
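
For what "client-side scanning" would actually mean in practice, here's a minimal sketch - the blocklist and hash are made up, and real deployments use perceptual hashing (e.g. PhotoDNA) rather than plain SHA-256:

```python
import hashlib

# Hypothetical blocklist of hashes of known illegal files, shipped inside the client.
# Real systems use perceptual hashes (e.g. PhotoDNA) so altered copies still match;
# plain SHA-256 is used here only to keep the sketch short.
BLOCKED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # placeholder
}

def may_send(attachment: bytes) -> bool:
    """Return True if the attachment may be sent, False if it matches the blocklist."""
    return hashlib.sha256(attachment).hexdigest() not in BLOCKED_HASHES

# The client would run this on every outgoing file *before* encryption, which is
# exactly why client-side scanning is a privacy problem even on an E2EE messenger.
print("send" if may_send(b"example attachment bytes") else "blocked / reported")
```

The scan happens on the user's own device before the message is ever encrypted, so the E2EE guarantee stops meaning "nobody but the recipient can inspect my content."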

Nothing Telegram offers is equivalent to the algorithmic feeds that require moderation on YouTube, Twitter, Instagram, or Facebook; everything has to be sought out.

Make no mistake, I'm not defending the content. The people who used the platform to share that content should be arrested. However, I'm not sure I agree with the moral dichotomy we've gotten ourselves into where e.g., the messenger is legally responsible for refusing service to people doing illegal activity.

[–] possiblylinux127@lemmy.zip 0 points 2 months ago (1 children)

Telegram is in the news often for public groups with lots of crime

[–] Dark_Arc@social.packetloss.gg 2 points 2 months ago

"The news" is too vague a source to dispute.

[–] Rose@lemmy.zip 0 points 2 months ago* (last edited 2 months ago) (1 children)

I won't go into the specific channels so as not to promote them or what they do, but we can talk about one known example, which is how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data. They used two highly popular bots called H****a and the E** ** G**, which let you get everything known to the government and other social networks about any citizen of Russia for about $1 to $5. They use the Telegram API and have been there for years. How do you moderate that? You don't. You take it down as the illegal, privacy-violating, and doxing-enabling content that it is.

Edit: "Censored" the names of the bots, as I still don't want to make them even easier to find.

[–] Dark_Arc@social.packetloss.gg 2 points 2 months ago

which is how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data

Was that a bad thing? I've never heard the name Bellingcat before, but it sounds like this would've been partially responsible for the reporting about the Navalny poisoning?

They used two highly popular bots called H****a and the E** ** G**, which let you get everything known to the government and other social networks about any citizen of Russia for about $1 to $5.

Ultimately, that sounds like an issue the Russian government needs to fix. Telegram bots are also trivial to launch and duplicate so ... actually detecting and shutting that down without it being a massive expensive money pit is difficult.
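
To give a sense of how low that barrier is, here's a rough sketch of a complete bot against the public Telegram Bot API - the token and the "look up and reply" logic are hypothetical, but registering a new token with @BotFather takes about a minute:

```python
import time
import requests

# Hypothetical token; @BotFather issues one to any Telegram account on request.
TOKEN = "123456:ABC-DEF"
API = f"https://api.telegram.org/bot{TOKEN}"

def main() -> None:
    offset = 0
    while True:
        # Long-poll the Bot API for new messages sent to the bot.
        updates = requests.get(
            f"{API}/getUpdates", params={"offset": offset, "timeout": 30}
        ).json()
        for update in updates.get("result", []):
            offset = update["update_id"] + 1
            message = update.get("message")
            if not message or "text" not in message:
                continue
            # A data-selling bot would do its database lookup here; this sketch just echoes.
            requests.post(
                f"{API}/sendMessage",
                params={"chat_id": message["chat"]["id"],
                        "text": f"echo: {message['text']}"},
            )
        time.sleep(1)

if __name__ == "__main__":
    main()
```

Ban the token and the operator just registers a fresh one and points subscribers at the new handle, so takedowns alone turn into whack-a-mole.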

It's easy to say "oh they're hosting it, they should just take it down."

https://www.washingtonpost.com/politics/2018/10/16/postal-service-preferred-shipper-drug-dealers/

Should the US federal government hold itself liable for delivering illegal drugs via its own postal service? I mean, there's serious nuance in what's reasonable liability for a carrier ... and personally holding the CEO criminally liable is a pretty extreme instance of that.

[–] einkorn@feddit.org 26 points 2 months ago (1 children)

I am going to quote myself here:

The issue I see with Telegram is that they retain a certain control over the content on their platform, as they have blocked channels in the past. That's unlike, for example, Signal, which only acts as a carrier for encrypted data.

If they have control over what people are able to share via their platform, the relevant laws should apply, imho.

[–] Libb@jlai.lu -5 points 2 months ago (2 children)

I am going to quote myself here:

Allow me to quote myself too, then:

That’s not the point.

I do not disagree with your remarks (I do not use Telegram); I simply consider that it's not the point, or that it should not be.

Obviously, laws should be enforced. What those laws are, and how they are used to erode things that were considered fundamental rights not so long ago, is the sole issue, once again, im(v)ho ;)

[–] einkorn@feddit.org 24 points 2 months ago

It IS the point. If Telegram were designed and set up as a pure carrier of encrypted information, no one could/should fault them for how the service is used.

However, this is not the case, and they are able to monitor and control the content that is shared. This means they have a moral and legal responsibility to make sure the service is used in accordance with the law.

[–] Serinus@lemmy.world 1 points 2 months ago

The point is that if you're going to keep blackmail material, you have to share it with the government.

The easy answer is to stop keeping blackmail material.

[–] clot27@lemm.ee -3 points 2 months ago (2 children)

Signal fans being edgy cool kids

[–] possiblylinux127@lemmy.zip 6 points 2 months ago

Signal has its own issues, but at least it has proper encryption.

[–] einkorn@feddit.org 2 points 2 months ago

Yay, let's all hate on the one crypto messenger that is independently verifiably secure.