this post was submitted on 06 Mar 2025
65 points (87.4% liked)

Privacy


Like, there's a lot of people freaking out about Apple ending end-to-end encryption in iCloud in the UK. I'm just like: so what? It was probably backdoored from the beginning.

So is Big Tech's E2E actually not backdoored? Or is that just a PR stunt to trick people into trusting iCloud, and this is a secret honeypot? 🤔

What are your thoughts?

top 16 comments
[–] [email protected] 59 points 3 days ago (2 children)

There is a difference between "probably backdoored" and "we're not even trying to look secure anymore."

[–] [email protected] 14 points 3 days ago* (last edited 3 days ago)

There's also a big difference between published specifications and threat models for the encryption, which professionals can investigate in the code delivered to users, versus no published security information at all, where pure reverse engineering is the only option.

Apple at least has public specifications. Experts can dig into the code and compare it against the specs, which is far easier than digging into that kind of code blindly. The spec describes what the system does, when, and why, so you don't have to figure that out through reverse engineering; instead you can focus on looking for discrepancies.

Proper open source with deterministic builds would be even better, but we aren't getting that out of Apple. Specs are the next best thing.
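To make the deterministic-builds point concrete: if a build is reproducible, anyone can rebuild the binary from source and check that it matches what was shipped, so a single digest comparison detects tampering. A minimal sketch (hypothetical helper name, Python for illustration):

```python
import hashlib


def digests_match(rebuilt_binary: bytes, shipped_binary: bytes) -> bool:
    """With deterministic builds, an independent rebuild from source
    should be byte-for-byte identical to the shipped binary, so
    comparing SHA-256 digests of the two is enough to detect a
    backdoored or tampered release."""
    rebuilt = hashlib.sha256(rebuilt_binary).hexdigest()
    shipped = hashlib.sha256(shipped_binary).hexdigest()
    return rebuilt == shipped
```

Without determinism (timestamps or build paths baked into the binary), the digests differ even for honest builds, which is why closed, non-reproducible binaries can only ever be checked against specs, not rebuilt and verified.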

BTW, plugging our cryptography community: [email protected]

[–] [email protected] 4 points 3 days ago (1 children)

I'd say the difference is minimal though.

[–] [email protected] 10 points 3 days ago

The difference is closer to maximal. The only way to be worse is not just on purpose, but expertly on purpose.

[–] [email protected] 17 points 3 days ago (1 children)

In any case, a service that implements E2EE is always more secure than one sending data in the clear. Even with a backdoor, it's a bit more secure than a regular service.

But you should be angry about the UK law because of its probable future consequences, which will impose even further surveillance on other services. The people defending it are probably mainly Apple fans. Apple poses as "against" it to maintain its image as a "privacy-friendly" company versus Google and Meta, but it's bullshit.

[–] [email protected] 9 points 3 days ago

Apple's datacenters have been untrustworthy for a long time; at this point any breach or sale of user data is as much the user's fault. But yeah, from what I've seen, corporations never make moves primarily to benefit the consumer, society, or the law. Only their own profit. Sometimes aligning with demand improves profit.

[–] [email protected] 24 points 3 days ago

E2EE is, theoretically, secure. It certainly prevents a government from hoovering up your data when they casually cast too wide of a dragnet while "chasing a criminal". ...At least, when it is implemented honestly and correctly.

Now if governments wanted to properly backdoor some E2EE implementation, all they really need to do is compromise one end of the conversation. Of course, they want to be able to do it automagically, by delivering a court order to a single point, not by busting down the door of one end or the other of the conversation and compromising the device, or capturing its user.

Therein lies the question: do you, as a person, want the government to be forced to bust down a door? Some people think they should be forced to break doors; others do not feel that is necessary. There are many diverse stances on this question, all with unique reasons.

It's clear to me that E2EE works properly... the governments would not be trying to "end encryption" if it did not work. If a government is forced to pass a law in order to compromise the encryption or turn it off entirely, it stands to reason that the encryption itself is not already compromised. That proves it works.

I just logically argued that encryption works, without even taking a stance on the matter. For the record, however, I do support encryption. I think this law undermining it is a massive governmental overreach that will quickly lead to that same government finding out how critical encryption actually is to its people. Just give it time.

[–] [email protected] 32 points 3 days ago

If they tell law enforcement they can’t produce an unencrypted copy and it’s later proven that they could, the potential penalty would likely be more severe than anything they could have gained by using the data themselves. And any employee (or third party they tried to sell the data to) could rat them out—so they’d have to keep the information within a circle too small to make use of it at scale. And even if it never leaked, hackers would eventually find and exploit the backdoor, exposing its existence. And in either case they’d also have to face lawsuits from shareholders (rightly) complaining that they were never warned of the legal risk.

[–] [email protected] 5 points 2 days ago

Sure it's E2EE, it's encrypted with your key and the company's key and the government's key.... SAAAFE ....
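For readers unfamiliar with how that joke maps onto real designs: many E2EE systems encrypt each message with a one-off content key and then wrap that key once per recipient. A toy Python sketch (XOR-with-hash "wrapping", made-up names, NOT real cryptography) shows why quietly adding an extra recipient key is an invisible backdoor:

```python
import hashlib
import secrets


def _wrap(content_key: bytes, recipient_secret: bytes) -> bytes:
    # Toy key wrap: XOR the content key with a hash of the recipient's
    # secret. Illustration only -- real systems use proper public-key
    # encryption here, but the structure is the same.
    mask = hashlib.sha256(recipient_secret).digest()
    return bytes(a ^ b for a, b in zip(content_key, mask))


def encrypt_for_all(recipient_secrets: list[bytes]) -> tuple[bytes, list[bytes]]:
    # One random content key, wrapped once per "recipient". If the
    # provider or a government quietly holds one of these secrets,
    # the E2EE is nominal only.
    content_key = secrets.token_bytes(32)
    return content_key, [_wrap(content_key, s) for s in recipient_secrets]


def unwrap(wrapped_key: bytes, recipient_secret: bytes) -> bytes:
    return _wrap(wrapped_key, recipient_secret)  # XOR is its own inverse
```

Nothing visible to the user says how many wrapped copies of the content key exist, which is exactly the "extra key" (or "ghost user") concern the comment is joking about.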

[–] [email protected] 11 points 3 days ago (1 children)

I’m no expert, but given the repeated efforts from governments around the world to get backdoors added to encryption, and the frequent pushback from big tech, or at least Apple, I’m more inclined to think there currently, or recently, aren’t backdoors. At least, not easy ones, not official ones.

As an example, recall a few years ago there was a terror-related attack in the U.S. where someone tied to Muslim extremists went on a shooting spree before taking his own life (I’m not bothering to look up the details and my recollection could be flawed). The attacker used an iPhone, and the U.S. government took the opportunity of strong public outrage to try to force Apple to create a tool to break the encryption on the iPhone so they could examine its contents. Apple resisted and the effort went to court, with the decision eventually being that Apple did not have to break the encryption. The government then revealed that they had access to a third-party tool that they used to break into the phone and recover its contents.

That’s pretty much been the pattern before and since: a government will try to find a cause that seems likely to gather widespread support and use that to get a backdoor they promise not to abuse, and the companies push back to varying degrees. All the while there seem to be third-party tools that exploit various flaws, including zero-day flaws, to gain the access the companies won’t provide. My impression is that at least a couple times a year there’s a story about an Apple security update patching these holes and notifying certain users if they may have been targeted.

It’s possible that’s all just theater put on by the U.S. and allies to help Apple or Google tell governments the U.S. doesn’t trust, “see, we can’t even give the U.S. government we’re subject to access, so we certainly can’t give you access.” Given some of the cases that have been used to try to force access, though, I’m more inclined to think the government really doesn’t have the easy access some might like.

Of course, it’s also possible that some of the flaws used by zero-day exploits to gain access are intentionally planted, either by the software companies or by an individual programmer acting at a government’s behest. The later patches could be to maintain appearances to outsiders, since there always seem to be additional flaws. Still, programming is hard enough and operating systems are complex enough that I’m more inclined to say that usually these really are just human error and not something malicious.

None of that is to say that anyone should fully trust these encryption systems. Used properly, they’re probably good enough against ordinary hackers, people just looking for financial rewards. You can keep your family photos, important records, school notes, etc. on them without worrying too much. Financial records you might want to doubly encrypt, just so they’re not so easy to exploit if there is a breach and data dump. If you’re doing something any government cares enough about to really investigate, they’re probably going to find a way into your computer, phone, or cloud service, depending on how motivated they are. Maybe not some impoverished “third-world” governments, but most of the big ones have some resources. I’d be extremely cautious about things that could actually send someone to jail, either in your own country or one that is less friendly.

[–] [email protected] 1 points 3 days ago

The government then revealed that they had access to a third party tool that they used to break into the phone and recover its contents.

I'm not sure if we're thinking of the same case, but I also remember that the tool wasn't ready at the beginning, which is why they tried the court route until it was.

[–] [email protected] 2 points 3 days ago* (last edited 3 days ago)

Software that fails to include a libre software license text file is anti-libre software, and we do not control it. 🚩

They will shill all kinds of excuses to cope. The winners have already escaped.

[–] [email protected] 2 points 3 days ago

I'd assume that most governments (other than the US) don't have easy access even if E2EE in WhatsApp etc. is a sham - these companies are definitely intent on keeping the pretense up, and smaller countries aren't able to disprove it.