I'm rather curious to see how the EU's privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn't have a paywall)

[–] [email protected] 28 points 1 year ago (1 children)

> It's more like the law is saying you must draw seven red lines, all of them strictly perpendicular, some with green ink and some with transparent ink.

No, it's more like the law is saying you have to draw seven red lines and you're saying, "well I can't do that with indigo, because indigo creates purple ink, therefore the law must change!" No, you just can't use indigo. Find a different resource.

> It's not "virtually" impossible, it's literally impossible. If the law requires that it be possible then it's the law that must change.

There's nothing that says AI has to exist in a form created from harvesting massive user data in a way that can't be reversed or retracted. It's not technically impossible to do that at all; we just haven't done it because it's inconvenient and more work.

The law sometimes makes things illegal because they should be illegal. It's not like you run around saying we need to change murder laws because you can't kill your annoying neighbor without going to prison.

> Otherwise it's simply a more complicated way of banning AI entirely

No it's not; AI is way broader than this. There are tons of forms of AI besides ones that consume raw existing data. And there are ways you could harvest only data you could then "untrain"; it's just more work.
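
To make that concrete, here's a minimal hypothetical sketch (my own illustration, roughly in the spirit of sharded training / machine unlearning, not anything from the article): route each user's records to a known shard, train one sub-model per shard, and when someone asks to be forgotten, drop their rows and retrain only that one shard rather than the whole ensemble. All the names here (ShardedEnsemble, forget_user, etc.) are made up for the example.

```python
# Hypothetical sketch: shard-based training so one user's data can be
# "untrained" by retraining only the shard that contained it.
from collections import defaultdict
import hashlib

import numpy as np
from sklearn.linear_model import LogisticRegression


def shard_for(user_id, num_shards):
    # Stable hash so the same user always maps to the same shard.
    digest = hashlib.sha256(str(user_id).encode()).hexdigest()
    return int(digest, 16) % num_shards


class ShardedEnsemble:
    """One sub-model per data shard; a deletion request only forces a
    retrain of the shard that actually saw that user's data."""

    def __init__(self, num_shards=4):
        self.num_shards = num_shards
        self.shard_data = defaultdict(list)  # shard_id -> [(user_id, x, y), ...]
        self.models = {}                     # shard_id -> fitted model

    def add_record(self, user_id, x, y):
        self.shard_data[shard_for(user_id, self.num_shards)].append((user_id, x, y))

    def _fit_shard(self, shard_id):
        records = self.shard_data[shard_id]
        if not records:                      # shard emptied by deletions
            self.models.pop(shard_id, None)
            return
        X = np.array([x for _, x, _ in records])
        labels = np.array([label for _, _, label in records])
        self.models[shard_id] = LogisticRegression().fit(X, labels)

    def fit_all(self):
        for shard_id in list(self.shard_data):
            self._fit_shard(shard_id)

    def forget_user(self, user_id):
        # Honour a deletion request: drop the user's rows, then retrain
        # only the affected shard instead of the whole ensemble.
        shard_id = shard_for(user_id, self.num_shards)
        self.shard_data[shard_id] = [r for r in self.shard_data[shard_id]
                                     if r[0] != user_id]
        self._fit_shard(shard_id)

    def predict(self, x):
        # Simple majority vote across the shard models.
        votes = [int(m.predict(np.array([x]))[0]) for m in self.models.values()]
        return max(set(votes), key=votes.count)
```

The cost is some accuracy and a lot of bookkeeping compared to training one big model on everything, which is basically what "it's just more work" amounts to.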

Some things, like user privacy, are actually worth protecting.

[–] [email protected] 1 points 1 year ago (2 children)

> There’s nothing that says AI has to exist in a form created from harvesting massive user data in a way that can’t be reversed or retracted. It’s not technically impossible to do that at all; we just haven’t done it because it’s inconvenient and more work.

What if you want to create a model that predicts, say, diseases or medical conditions? You have to train that on medical data or you can't train it at all. There's simply no way that such a model could be created without using private data. Are you suggesting that we simply not build models like that? What if they can save lives and massively reduce medical costs? Should we scrap a massively expensive and successful medical AI model just because one person whose data was used in training wants their data removed?

[–] [email protected] 1 points 1 year ago (1 children)

I guarantee the person you're arguing with would rather see people die than let an AI help them and be proven wrong.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Well then you'd be wrong. What a fucking fried and delusional take. The fuck is wrong with you?

[–] [email protected] 1 points 1 year ago

This is an entirely different context - most of the talk here is about LLMs. Health data is entirely different: the regulations and legalities around it are entirely different, people don't publicly post their health data to begin with, and it isn't obtained without consent - it already has tons of red tape around it. It would be much easier to obtain "well sourced" medical data than the broad swaths of stuff LLMs are sifting through.

But the point still stands - if you want to train a model on private data, there are different ways to do it.
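
For instance (a rough, hypothetical numpy sketch of DP-SGD-style training on made-up "patient" data - it leaves out the formal privacy accounting): clip each individual's gradient contribution and add noise before each update, so no single record can dominate what the model ends up memorizing.

```python
# Hypothetical sketch: differentially-private-style training of a tiny
# logistic regression model. Data and hyperparameters are made up.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def dp_sgd_train(X, y, epochs=50, lr=0.1, clip_norm=1.0, noise_multiplier=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Per-example gradients of the logistic loss.
        preds = sigmoid(X @ w)
        per_example_grads = (preds - y)[:, None] * X          # shape (n, d)

        # Clip each person's gradient to a fixed L2 norm ...
        norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
        scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
        clipped = per_example_grads * scale

        # ... then add calibrated Gaussian noise before averaging.
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=d)
        w -= lr * (clipped.sum(axis=0) + noise) / n
    return w


# Toy usage on synthetic features standing in for medical records.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
w = dp_sgd_train(X, y)
print("train accuracy:", ((sigmoid(X @ w) > 0.5) == y).mean())
```

Techniques like this are one version of what "different ways to do it" looks like in practice: the records still need consent and governance, but the finished model reveals far less about any one person.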