this post was submitted on 08 Jun 2024
364 points (98.7% liked)

[–] 5C5C5C 18 points 5 months ago (2 children)

Yeah that's my big takeaway here: If the people who are rolling out this technology cannot make these assurances then the technology has no right to exist.

[–] [email protected] 1 points 5 months ago (1 children)

Show me a computer that can only run benign programs.

[–] 5C5C5C 2 points 5 months ago (1 children)

A computer will run whatever software you put on it. As long as we're putting benign software on our computers, the computer will be benign.

If you knowingly put criminal software on a computer then you are committing a crime. If someone tricks you into putting criminal software onto a computer then the person who tricked you is committing a crime.

If you are developing software and can't be sure whether the software you're developing will commit crimes, then you are guilty of a criminal level of negligence.

[–] [email protected] 1 points 5 months ago (1 children)

Nah, if the computer manufacturer can't stop you from running evil software, the technology has no right to exist. Demand these assurances!

[–] 5C5C5C 1 points 5 months ago* (last edited 5 months ago) (1 children)

You're being pretty dense if you can't wrap your head around a basic concept of accountability.

A human can choose to commit crimes with any product, including .. I don't know .. a fork. You could choose to stab someone with a fork, and you'd be a criminal. We wouldn't blame the fork manufacturer for that, because the person who chose to commit the crime was the person holding the fork. That's who's accountable.

But if a fork manufacturer starts selling forks that might start stabbing people on their own, without any human user intending for the stabbing to take place, then the manufacturer who produced and sold the auto-stabbing forks is absolutely guilty of criminal negligence.

Edit: But I'll concede that a law against the technology assisting humans in criminal activity in a broad sense is unrealistic. At best there would need to be bounds on the degree of criminal help the tool is able to provide.

[–] [email protected] 1 points 5 months ago

But a human asking how to make a bomb is somehow the LLM's fault.

Or the LLM has to know that you are who you say you are, to prevent you from writing scam e-mails.

The guy you initially replied to was talking about hooking up an LLM to a virus replication machine. Is that the level of safety you're asking for? A machine so safe, we can give it to supervillains?