this post was submitted on 14 Oct 2023
425 points (98.9% liked)

Privacy


cross-posted from: https://lemmy.ml/post/6469594

How to contact your MEP.

[–] [email protected] 103 points 11 months ago (2 children)

Terrorists will have no problem writing their own encryption programs, and more ordinary citizens will install malicious apps from unofficial app stores.

[–] [email protected] 23 points 11 months ago (1 children)

And everyone else will have their shit dumped out in the open when AI starts breaking through all the backdoors and manipulating officials into clearing them.

[–] [email protected] 7 points 11 months ago

You'll never decode me and my brother's Pig Latin

[–] [email protected] 80 points 11 months ago (8 children)

I have helped a little with some ongoing research on the subject of client-side scanning at a European research center. Only some low-level stuff, but I have a solid background in IT security and can explain a little of what has been proposed to the EU. I am by no means defending what is proposed here; based on what experts have explained, I myself am against the whole idea because of the slippery slope it creates for authoritarian governments and how easily it can be abused.

The idea is to use perceptual hashing to build a local or remote database of known abuse material (basically creating an approximation of already-known CP content and hashing it) and then comparing every image accessible to the messaging app against this database, using the same perceptual-hashing process on them.
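To make that concrete, here is a minimal sketch of the matching step in Python, assuming the Pillow imaging library and a toy 64-bit difference hash; the database contents, threshold, and hash function are placeholders, not those of any real system (which typically use proprietary, more robust perceptual hashes).

```python
# Minimal sketch of perceptual-hash matching (illustration only, not a real scanner).
# Assumes the Pillow imaging library; the hashes and threshold below are placeholders.
from PIL import Image

def dhash(path: str) -> int:
    """Compute a 64-bit difference hash: visually similar images
    produce hashes that differ in only a few bits."""
    img = Image.open(path).convert("L").resize((9, 8))  # grayscale, 9x8 pixels
    pixels = list(img.getdata())
    bits = 0
    for row in range(8):
        for col in range(8):
            left = pixels[row * 9 + col]
            right = pixels[row * 9 + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of perceptual hashes of known abuse material.
known_hashes = {0x8F3A5C7E1B2D4F60, 0x00FF00FF00FF00FF}

def matches_database(path: str, threshold: int = 10) -> bool:
    """Flag the image if its hash is 'close enough' to any known hash.
    The threshold trades false negatives against false positives."""
    h = dhash(path)
    return any(hamming(h, known) <= threshold for known in known_hashes)
```

Because the comparison is a fuzzy threshold on Hamming distance rather than exact equality, visually unrelated images can occasionally land inside the threshold, which is where the false-positive issues below come from.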

It's called client-side scanning because it simply circumvents the encryption process. Circumvention here means that the scanning happens outside of the communication protocol, either before or after the images, media, etc. are sent. It does not matter that you use end-to-end encryption if the scanning happens on your data at rest on your device rather than in transit. In that sense it wouldn't directly have an adverse effect on end-to-end encryption itself.

Some of the most obvious issues with this idea, outside of the blatant privacy violation are:

  1. Performance: how big is the database going to get? Do we ever stop including stuff?
  2. Ethical: Who is responsible for including hashes in the database? Once a hash is in there it's practically impossible to tell what it represents, which can obviously be abused by unscrupulous governments.
  3. Personal: There is heavy social stigma associated with CP and child abuse. Because of how they work, perceptual hashes are going to create false positives. How are these false positives going to be addressed by the authorities? Because when the police come knocking on your door looking for CP, your neighbors might not care or understand that it was a false positive.
  4. False positives: the false-positive rate for a single hash stays roughly the same, but the bigger the database gets, the more false positives there will be (see the back-of-envelope sketch after this list). This will quickly lead to problems managing them.
  5. Authorities: local authorities are generally stretched thin and have limited resources. Who is going to deal with the influx of reports coming from this system?
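As a rough illustration of point 4: with a fixed per-comparison false-positive rate, the expected number of false flags grows linearly with both the database size and the volume of scanned images. A minimal sketch in Python, with all numbers invented purely for illustration:

```python
# Back-of-envelope estimate of how false positives scale with database size.
# All numbers here are invented for illustration, not real measurements.
per_comparison_fp_rate = 1e-12     # assumed chance one innocent image matches one hash
images_scanned_per_day = 1e9       # assumed daily volume of scanned images

for database_size in (100_000, 1_000_000, 10_000_000):
    expected_false_positives = images_scanned_per_day * database_size * per_comparison_fp_rate
    print(f"{database_size:>12,} hashes -> ~{expected_false_positives:,.0f} false flags per day")
```

Every one of those false flags is a report someone has to handle, which feeds directly into the resourcing problem in point 5.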
[–] [email protected] 18 points 11 months ago (1 children)

This is a really nice summary of the practical issues surrounding this.

There is one more that I would like to call out: how does this client-side scanning code end up running on your phone? That is, who pushes it there and keeps it (and, by extension, the database) up to date?

I can think of a few options:

  1. The messaging app owner includes this as part of their code, and every message/image/etc. is checked before being sent (or received?).
  2. The phone OS vendor puts it there, bakes it as part of the image store/retrieval API - in a sense it works more on your gallery than your messaging app
  3. The phone vendor puts it there, just like they already do for their branded apps.
  4. Your mobile operator puts it there, just like they already do for their stuff

Each of these has its own problems and challenges: how to compel them to insert this (ahem, "backdoor"), and the different risks that come with each of them.

[–] [email protected] 12 points 11 months ago

Another problem: legislation like this cements the status quo. It's easy enough for large incumbents to add features like this, but for a handful of programmers trying to launch an app from their garage, it adds another hurdle to the process. Remember: Signal and Telegram are only about a decade old, and we've seen new (and better) apps launch recently. Is that going to stop?

It's easy to say "this is just a simple hash lookup, it's not that big a deal!", but (1) it opens the door to client-side requirements in legislation, and it's unlikely to stop here, (2) if other countries follow suit, devs will need to implement a bunch of geo-dependent (?) lookups, and (3) someone is going to have to monitor compliance and make sure images are actually being verified, which also opens small companies up to difficult legal actions. How do you prove your client is complying? How can you monitor to make sure it's working without violating user privacy?

Also: doesn't this close the door on open software? How can you allow users to install open source message apps, or (if the lookup is OS-level) Linux or a free version of Android that they're able to build themselves? If they can, what's to stop pedophiles from just doing that--and disabling the checks?

If you don't ban user-modifiable software on phones, you've just added an extra hurdle for creeps: they just need to install a new version. If you do, you've handed total control of phones to corporations, and especially big established corporations.

[–] [email protected] 53 points 11 months ago (4 children)

What is wrong with the EU? Why do they always need to ban end-to-end encryption?

[–] [email protected] 7 points 11 months ago (2 children)

As I remember, at the moment it's partly von der Leyen, the current Commission president. She is a German Christian Democrat, and apparently a bit with a capital C, meaning she has a bit of a moral-panic streak of the "won't somebody think of the children" variety. As I understand it, this current proposal is very much driven by her.

However, her driving it doesn't mean it will sail through and pass as legislation. Some entire member-state governments are against the encryption-busting idea.

[–] [email protected] 5 points 11 months ago

And the fact that Ylva Johansson is technologically illiterate, as well as a close bed buddy of companies in the surveillance industry that stand to earn a crapload of money, doesn't help...

[–] [email protected] 3 points 11 months ago (1 children)

Five Eyes. Politicians are puppets.

[–] [email protected] 3 points 11 months ago

Wait, you have a choice to vote for puppet 1, puppet 2, or puppet 3. Your choice matters! ...As long as the politicians' podiums are provided by the rich, we don't have a real say.

[–] [email protected] 3 points 11 months ago

I'm sure they will tell you it's weighing the security (against terrorists, criminals, etc) of the many against the security (from seeing dick pics or messaging a mistress) of the few.

[–] [email protected] 42 points 11 months ago (1 children)

And the EU is wondering why they're having an image problem

[–] [email protected] 41 points 11 months ago (1 children)

Making it illegal only hampers those that follow the law.

Criminals, by definition, already don't follow the law.

[–] [email protected] 10 points 11 months ago

Exactly. When privacy is criminal, only criminals will have privacy.

[–] [email protected] 38 points 11 months ago (1 children)

People on Reddit, and sometimes here, always praise the EU as some bastion of privacy, and I always got downvoted when I said that this isn't always true. And now here we are. I hope people don't forget this after a month, like they always do.

[–] [email protected] 8 points 11 months ago

They will, and you're screaming into the wind sadly.

What you can do is never forget, and make this a priority in your voting decisions going forward. Endorse and support companies that protect privacy.

It's a long uphill battle, and every little bit helps, no matter how small.

[–] [email protected] 28 points 11 months ago (14 children)

Citizens have the right to private communication.

[–] [email protected] 26 points 11 months ago (1 children)

While this would be terrible if it passes, a part of me hopes a silver lining would be a massive surge in open-source development focused on privacy-respecting software that does not follow or enable this disgusting behavior by the EU.

[–] [email protected] 17 points 11 months ago (1 children)

Software which may be made illegal.

[–] [email protected] 9 points 11 months ago (5 children)

How would such a ban ever be enforceable?

[–] [email protected] 10 points 11 months ago (3 children)

If you are using Windows or macOS, they will be first in line to implement "protection" against "insecure software" :)

[–] [email protected] 7 points 11 months ago (2 children)

Or Android with Google Play. It already does this BS, even if you disable scanning.

Lineage/Graphene/DivestOS here I come.

[–] [email protected] 3 points 11 months ago

That's basically already the case: if I modify my little "secure encrypted sharing protocol", Windows will flag the software as a virus when I send it to someone... then after some time it's OK. I guess it's those antivirus heuristics, but it still puts a big stick in my bicycle wheel.

[–] [email protected] 5 points 11 months ago

Ostensibly via TCPA.

[–] [email protected] 19 points 11 months ago

Just imagine the headline we'd see in the west if this was happening in China.

[–] [email protected] 17 points 11 months ago (1 children)

This is almost definitely not getting past the ECJ. If they pass this directive, I'm gonna take my chances.

Thanks to the Matrix protocol there is no chance of getting rid of E2EE communication anyway. There is no feasible way to stop decentralized communication like that, not without killing the internet.

[–] [email protected] 5 points 11 months ago

Also, I would add, it's not like this is unanimously supported in the EU among member states. So this isn't a done deal; it's a legislative proposal. Of course everyone should get active and campaign on this, but it's not a "privacy activists vs. all of the EU and all the member-state governments" situation. Some official government positions on this one are "this should not pass as it is; breaking encryption is a bad idea".

It wouldn't be the first time an EU Commission proposal falls. Plus, as you said, the ECJ would most likely rule it to be against the Charter of Fundamental Rights of the European Union as too broad a breach of the right to privacy.

[–] [email protected] 14 points 11 months ago

If apps were to turn off E2E encryption, how would it work? Would it affect bordering regions? Users of VPNs inside the EU?

My country proposed a ban on VPN software (targeting app stores providing them), and it could also target messengers. If I get an EU version of such an app, or if I use a European VPN to connect through it, would I be less safe sending political memes?

[–] [email protected] 10 points 11 months ago

@makeasnek
Imagine a data leak in such a situation.

[–] [email protected] 10 points 11 months ago (1 children)

I wonder if OpenPGP will ever gain popularity.

The only ones I have seen that even publish a key for me to use are a few famous internet individuals (people like Richard Stallman, though I don't know if he specifically uses it), a few companies like Mullvad, a few orgs like the EFF, whistleblowers, and a few governmental organisations like the Financial Supervisory Authority in my country.

[–] [email protected] 5 points 11 months ago* (last edited 11 months ago)

@lud @makeasnek With more government controls and intervention, it's possible. I learned how to use PGP pretty efficiently, but there is absolutely no one in my daily life who also uses it.

Manual encryption with personal keys may become the norm if fewer and fewer services are able to use it.

[–] [email protected] 8 points 11 months ago (1 children)

If they can scan it, they can edit it.

[–] [email protected] 6 points 11 months ago (3 children)

Can this be circumvented somehow? And how would apps with end-to-end encryption work if a person in a non-EU state spoke to someone inside the EU?

[–] [email protected] 9 points 11 months ago (2 children)

Anything can be circumvented: VPNs, Tor, I2P, and some lesser-known apps like Briar. The issue will be whether using those services becomes illegal too, and whether the barrier to entry becomes too high for those outside the technical world. Signal will definitely just pull support for the EU, so you'll have to trick it into thinking you're not in the EU. But then you're at risk of running afoul of the law.

[–] [email protected] 5 points 11 months ago (1 children)

Would a way to legally bypass this be an app that can "encrypt" your text before you send it? The government would be able to see all of your messages, but they would be scrambled in a way they couldn't read.

Something where both people install the same text-scrambling app and generate the same key to scramble all text (which would need to be done in person). They would then type their text into the app and it would scramble it. The user would copy the scrambled text and send it over any messaging platform they want. The recipient would need to copy the text and put it back into the scrambling app to descramble it.
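That's essentially manual end-to-end encryption layered over a copy-paste channel. A minimal sketch of the idea, assuming Python's third-party cryptography package and a symmetric key the two people exchanged in person (the message and variable names are just placeholders):

```python
# Sketch of the "scramble before sending" idea using a pre-shared symmetric key.
# Assumes the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

# Done once, in person: generate a key and give a copy to the other person.
key = Fernet.generate_key()          # keep this secret
box = Fernet(key)

# Sender: scramble the message, then paste the result into any messenger.
scrambled = box.encrypt("meet at the usual place".encode()).decode()
print(scrambled)                     # opaque base64 text, safe to copy-paste anywhere

# Recipient: paste the scrambled text back into the same app to read it.
original = box.decrypt(scrambled.encode()).decode()
print(original)                      # "meet at the usual place"
```

As the reply below notes, PGP implements the same workflow with public/private key pairs instead of a shared secret; either way, the messaging platform only ever sees opaque ciphertext.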

[–] [email protected] 4 points 11 months ago

This is how PGP works and is pretty widely used. https://en.m.wikipedia.org/wiki/Pretty_Good_Privacy

[–] [email protected] 5 points 11 months ago

"Although some US corporations such as Meta are already scanning European messages for previously classified CSAM 'only'"

This is news to me, does anyone have any more detail?

[–] [email protected] 4 points 11 months ago (1 children)

I wonder if projects like Signal could make a community-run and certified hash database that could be included in Signal et al. without the threat of governments and self-interested actors putting malicious entries in. It definitely doesn't solve every problem with client-side scanning, but it does solve some.

But... an open, verifiable database of CSAM hashes has its own serious problems :-S Maybe an open, audited AI tool that in turn makes the database? Perhaps there's some clever trick to make it verifiable that all the hashes are for CSAM without requiring extra people to audit the CSAM itself.

[–] [email protected] 4 points 11 months ago (1 children)

You're unfortunately also handing people distributing CSAM a way to verify whether their content would be detected, by checking it against the database.

[–] [email protected] 3 points 11 months ago

Yeah... Because hampering legal encryption will totally hamper all those who just continue to use the methods we have today.
