But a human asking how to make a bomb is somehow the LLM's fault.
Or the LLM is somehow supposed to verify that you are who you say you are, to stop you from writing scam e-mails.
The guy you initially replied to was talking about hooking an LLM up to a virus replication machine. Is that the level of safety you're asking for? A machine so safe we could hand it to supervillains?