this post was submitted on 20 Nov 2023
618 points (94.4% liked)

Technology


This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago
[–] [email protected] 13 points 1 year ago

It's not AGI that's terrifying, but how willingly people let anything take over their control. LLMs are "just" predictive text generation with a lot of extras that make the output come out really convincing sometimes, and yet so many individuals and companies have basically handed over the keys without even second-guessing the answers.
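To make the "just predictive text generation" point concrete, here's a deliberately crude sketch: a bigram model that repeatedly emits the most frequent next word. Real LLMs are vastly more sophisticated (deep networks trained on huge corpora, sampling over probabilities), but the core loop of "predict the next token, append it, repeat" is the same shape. The training sentence here is made up purely for illustration.

```python
from collections import Counter, defaultdict

# Toy "training data" (hypothetical sentence, just for the demo).
corpus = "the model predicts the next word and the next word follows the model".split()

# Count which word follows which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def generate(start, length=6):
    """Greedily append the most frequent next word at each step."""
    out = [start]
    for _ in range(length):
        candidates = following.get(out[-1])
        if not candidates:
            break  # dead end: this word was never followed by anything
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))
```

The output is fluent-looking but mindless repetition of the statistics it saw, which is exactly the gap between "sounds convincing" and "understands what it's saying".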

These past few years have shown that if (and it's a big if) AGI/ASI ever arrives, we are so screwed, because we can't even handle dumber tools responsibly. LLMs in the hands of willing idiots can be a disaster in themselves, and it's possible we're already there.