this post was submitted on 31 Aug 2023
596 points (97.9% liked)
Technology
you are viewing a single comment's thread
You seem to be assuming that they're not. And that "helping society" is anything more than a happy accident that results from "making big profits".
A pretty big "what if", when every single model that's been tried for the purpose you suggest so far has made its predictions based on either the age of the medical imaging scan or the doctor's signature in the corner of it.
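The failure mode described here is usually called shortcut learning: a classifier latches onto an artifact that happens to correlate with the label (scan age, a signature in the corner) instead of the actual disease signal, so it looks accurate in testing and collapses in deployment. A minimal toy sketch of this, using a synthetic dataset and a plain logistic regression (all feature names and numbers here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, marker_follows_label):
    """Toy 'scans': a weak genuine disease signal plus a spurious marker
    (standing in for scan age or a doctor's signature)."""
    y = rng.integers(0, 2, n)
    signal = y + rng.normal(0.0, 2.0, n)               # weak real feature, very noisy
    if marker_follows_label:
        marker = y.astype(float)                       # artifact perfectly correlated with label
    else:
        marker = rng.integers(0, 2, n).astype(float)   # artifact decoupled from label
    X = np.column_stack([signal, marker, np.ones(n)])  # bias column
    return X, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train plain logistic regression by gradient descent on data
# where the artifact tracks the label, as it would in one hospital's archive.
X_tr, y_tr = make_data(2000, marker_follows_label=True)
w = np.zeros(3)
for _ in range(2000):
    p = sigmoid(X_tr @ w)
    w -= 0.3 * X_tr.T @ (p - y_tr) / len(y_tr)

def accuracy(X, y):
    return float(((sigmoid(X @ w) > 0.5) == y).mean())

X_iid, y_iid = make_data(2000, marker_follows_label=True)
X_shift, y_shift = make_data(2000, marker_follows_label=False)

iid_acc = accuracy(X_iid, y_iid)        # looks great: the model is reading the artifact
shift_acc = accuracy(X_shift, y_shift)  # collapses once the artifact is decoupled
print(iid_acc, shift_acc)
```

On the in-distribution test set the model scores near-perfectly; once the marker is decoupled from the label it falls toward chance, because the "diagnosis" was mostly the artifact all along.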
Are you asking me whether it's a good idea to give up the concept of "Privacy" in return for an image classifier that detects how much film grain there is in a given image?
It's not an assumption. There are academic researchers at universities working on developing these kinds of models as we speak.
I'm not wasting time responding to straw men.
Where does the funding for these models come from? Why are they willing to fund those models? And in comparison, why does so little funding go towards research into how to make neural networks more privacy-compatible?
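For context on what "privacy-compatible" research looks like: one established line of work is differentially private training (DP-SGD), where each example's gradient is clipped to a fixed norm and Gaussian noise is added before the update, bounding how much any single patient's record can influence the model. A minimal sketch of that one update step, in plain numpy (function name and hyperparameters are illustrative, not from any particular library):

```python
import numpy as np

rng = np.random.default_rng(1)

def dp_sgd_step(w, per_example_grads, clip_norm=1.0, noise_mult=1.1, lr=0.1):
    """One DP-SGD-style update: clip each example's gradient to clip_norm,
    average the clipped gradients, then add Gaussian noise scaled to the
    clipping bound so no single record dominates the update."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    avg = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_mult * clip_norm / len(per_example_grads),
                       size=w.shape)
    return w - lr * (avg + noise)

# Even wildly outlying per-example gradients have a bounded effect:
w = np.zeros(2)
grads = [np.array([10.0, 0.0]), np.array([0.0, 10.0])]
new_w = dp_sgd_step(w, grads)
print(new_w)
```

The point of the demo: without clipping, those two gradients would move the weights by about 0.5 each; with clipping plus noise, the update stays small and no individual example's data is faithfully memorized, which is exactly the kind of technique that underfunded "privacy-compatible" research works on.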