this post was submitted on 28 Aug 2024
220 points (91.7% liked)
Technology
Telegram: "Man, fuck them kids bruh!"
Those programs are about mass surveillance and wrap themselves in the sheep's clothing of "protecting kids."
Doesn't mean they shouldn't moderate.
Why should they? Should every piece of mail (physical or digital) you receive be opened and read? Should the government have access to everything you do on your phone or PC? Should the government moderate your house? This is straight out of 1984.
Even Facebook doesn't allow CSAM in public profiles. You can't just pull up Facebook and see that on your regular feed. Closed groups are a different story. Why should this be different?
Mind you I'm not saying that the CEO should be criminally responsible for what users on the platform post. I'm pointing out that moderation is a thing even on some of the worst offenders in the space.
You didn't answer my questions.
What moderation do you want? And how would you prevent "moderation" from becoming censorship?
Aren't there people whose job it is to prevent crimes? Why does some IT person who has no idea about crime need to do their job?
Because your questions aren't germane to the point I was making. In fact, your first question, "how would you prevent 'moderation' from becoming censorship," is literally answered by my second comment. Facebook already does this with Facebook Messenger. But even if it didn't, Telegram has features that allow encryption.
So what you're saying is that criminals who aren't using encryption (on a platform where encryption features are readily available) don't deserve to be moderated, even though their messages are using a company's cloud bandwidth. Does the company not have rights? And if we agree that the company has rights, then it also has to follow the law.
Yes, there are people whose jobs are to investigate and try criminals in a court of law (not prevent crime, because police and policing are reactionary, not preventative). That was a poor question to ask. You're acting like we don't already employ thousands of people across various social media and messaging platforms to review and moderate things like CSAM.
The gist for me: criminals are gonna do criminal things, but at the end of the day these are our public spaces. Just because I don't want to be surveilled in public or live in a police state doesn't mean I want criminals to go unprosecuted for crimes they commit, simply because someone cares more about their bottom line than about moderating a messaging platform they provide to the public.
We aren't talking about end-to-end encrypted messages here. We're talking about messages with no such encryption that can be viewed by anyone. There are literally public groups on Telegram being used by terrorist organizations. And while Telegram has repeatedly refused to give up encryption keys for the chats that are encrypted (as it should), any criminal who isn't using encryption is not protected by it and should be moderated.