this post was submitted on 25 Jul 2023
13 points (100.0% liked)

Fediverse


This magazine is dedicated to discussions on the federated social networking ecosystem, which includes decentralized and open-source social media platforms. Whether you are a user, developer, or simply interested in the concept of decentralized social media, this is the place for you. Here you can share your knowledge, ask questions, and engage in discussions on topics such as the benefits and challenges of decentralized social media, new and existing federated platforms, and more. From the latest developments and trends to ethical considerations and the future of federated social media, this category covers a wide range of topics related to the Fediverse.

founded 2 years ago

This is entirely the fault of the IWF and Microsoft, who create "exclusive" proprietary CSAM prevention software and then license it only to big tech companies.

[–] [email protected] 12 points 1 year ago* (last edited 1 year ago) (1 children)

Publishing a list of hashes would make it trivial for abusers to know when their images are being flagged. It would be better to get M$ to do the scanning work themselves.

[–] [email protected] 6 points 1 year ago

Bingo. It would also make it trivial to alter images just enough so that it wouldn't match the hash, and then they can post shit that would need to be manually flagged and removed.

I already see things like this with pirated media; pirates will include extraneous material bundled with the target media so that it's not automatically flagged and removed.
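A minimal sketch of why evasion against exact hashes is so easy: with a cryptographic hash like SHA-256, flipping a single bit of a file produces a completely different digest, so any image tweaked even slightly would sail past a published blocklist. (Real systems like Microsoft's PhotoDNA use perceptual hashes that tolerate small edits, which is part of why those algorithms and hash lists are kept closed; this example only illustrates the naive case the comments above describe.)

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of the data as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for an image's raw bytes (hypothetical placeholder data).
original = b"example image bytes"

# Flip one bit -- a change no human would ever notice in a real image.
modified = bytearray(original)
modified[0] ^= 0x01

h1 = sha256_hex(original)
h2 = sha256_hex(bytes(modified))

# The two digests share no resemblance, so an exact-hash blocklist
# would no longer match the modified file.
print(h1 != h2)
```

Perceptual hashing closes this particular loophole by hashing visual features rather than raw bytes, but it introduces the opposite risk (adversarial near-misses and false positives), which is another reason the matching is done server-side by the parties holding the database.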