this post was submitted on 26 Sep 2023
13 points (100.0% liked)

privacy


Cloudflare-free link for Tor/Tails users: https://web.archive.org/web/20230926042518/https://balkaninsight.com/2023/09/25/who-benefits-inside-the-eus-fight-over-scanning-for-child-sex-content/

It would introduce a complex legal architecture reliant on AI tools for detecting images, videos and speech – so-called ‘client-side scanning’ – containing sexual abuse against minors and attempts to groom children.

If the regulation undermines encryption, it risks introducing new vulnerabilities, critics argue. “Who will benefit from the legislation?” Gerkens asked. “Not the children.”

Groups like Thorn use everything they can to put this legislation forward, not just because they feel that this is the way forward to combat child sexual abuse, but also because they have a commercial interest in doing so.

they are self-interested in promoting child exploitation as a problem that happens “online,” and then proposing quick (and profitable) technical solutions as a remedy to what is in reality a deep social and cultural problem. (…) I don’t think governments understand just how expensive and fallible these systems are.

the regulation has […] been met with alarm from privacy advocates and tech specialists who say it will unleash a massive new surveillance system and threaten the use of end-to-end encryption, currently the ultimate way to secure digital communications.

A Dutch government official, speaking on condition of anonymity, said: “The Netherlands has serious concerns with regard to the current proposals to detect unknown CSAM and address grooming, as current technologies lead to a high number of false positives.” “The resulting infringement of fundamental rights is not proportionate.”
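
To see why a high false-positive rate matters at this scale, here is a back-of-the-envelope sketch. Every figure below is a hypothetical assumption chosen only to illustrate the base-rate problem, not a number from the article:

```python
# Hypothetical figures, purely illustrative; none come from the article.
messages_scanned_per_day = 10_000_000_000   # assumed EU-wide daily volume
prevalence = 1e-6            # assumed fraction of traffic that is abusive
false_positive_rate = 1e-3   # assumed 99.9% specificity, generous for AI

true_hits = messages_scanned_per_day * prevalence
false_alarms = messages_scanned_per_day * (1 - prevalence) * false_positive_rate

print(f"genuine detections per day: {true_hits:,.0f}")    # 10,000
print(f"false flags per day: {false_alarms:,.0f}")        # ~10,000,000
# Roughly a thousand innocent flags for every genuine hit, which is
# the disproportionality the official describes.
```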

top 3 comments
[email protected] 5 points 1 year ago

That's the thing. CSAM filtering can be useful when images are posted to public websites. But scanning private files, no matter what the scan is for, is a major privacy concern.
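
For contrast, here is a minimal sketch of what hash-based filtering of public uploads can look like. The blocklist digest is a placeholder, and production systems (PhotoDNA, for example) use perceptual hashes that survive re-encoding rather than exact cryptographic hashes:

```python
import hashlib

# Placeholder blocklist of SHA-256 digests of known abuse imagery.
# Real deployments use perceptual hashes so resized or re-encoded
# copies still match; exact hashing keeps this sketch self-contained.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def is_known_image(file_bytes: bytes) -> bool:
    """True if the upload exactly matches a blocklisted digest."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

# Run server-side at upload time: the file was already being published,
# so nothing extra leaves the user's device. Client-side scanning
# inverts this by running the same check on private files instead.
if is_known_image(b"...uploaded bytes..."):
    print("match: block upload and report")
```

The privacy argument turns on where this check runs: on a server receiving an already-public upload, or on the user's own device against private files.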

[email protected] 1 point 1 year ago (last edited 1 year ago)

CSAM, like pictures, is one thing, but how can they detect “attempts to groom children”? If someone says on Mastodon that they’re sad at school and I comment something nice to help them feel better, would that count as potential grooming? Will AI read and analyze every private message like that? And for better AI “predictions”, will every user be required to verify their age, sex, sexual orientation, hobbies, etc.?

Btw Happy Birthday GNU! 🎂

[email protected] 0 points 1 year ago

So they were able to shut down the whole world over the flu, but they can’t catch and arrest child abusers without scanning everyone’s private communications? Sounds legit.