The organizers of a high-profile open letter last March calling for a "pause" in work on advanced artificial intelligence lost that battle, but they could be winning a longer-term fight to persuade the world to slow AI down.

[email protected] 2 points 1 year ago

Yeah, exactly. And to expand further: everyone should focus on the work they're trained for and good at. The people trained for and good at exploring potential fallout are lawyers, philosophers, historians, and I suppose doctors. I've probably missed a few.

Those are different folks from the people building the actual things. Builders are specialized in building, not in exploring potential ramifications. It's a different skillset, and while the two aren't mutually exclusive, they're certainly distinct. Having one doesn't come with the other.

This is why it's not zero-sum. The people deciding what is right and wrong to build (with laws) and the people doing the building are not, and should not be, the same people. Since the "teams" are different, the work of one doesn't need to slow the other. Nor should we really slow down: there's heavy international competition in this field, and we frankly can't afford to fall behind in capability. That would almost certainly create an even greater risk than blundering ahead, since others would just blunder ahead without us. That gets us nothing.

We as citizens have work in this field too: to discuss these things around water coolers, dinner tables, and forums, and in articles, books, and conferences, so we can decide how we ourselves feel about the issue and how it'll affect our fields. Shit's changing fast.