this post was submitted on 18 Jul 2023
536 points (97.5% liked)

Memes

45673 readers
875 users here now

Rules:

  1. Be civil and nice.
  2. Try not to repost excessively; as a rule of thumb, wait at least 2 months if you have to.

founded 5 years ago
The picture is a screenshot of a job posting for a Killswitch Engineer at OpenAI, located in San Francisco, CA. The listed salary is $300,000 - $500,000 per year.

About the Role
Listen, we just need someone to stand by the servers all day and unplug them if this thing turns on us. You'll receive extensive training on "the code word" which we will shout if GPT goes off the deep end and starts overthrowing countries.

We expect you to:

  • Be patient.
  • Know how to unplug things. Bonus points if you can throw a bucket of water on the servers too. Just in case.
  • Be excited about OpenAI's approach to research.
[–] [email protected] 15 points 1 year ago (3 children)

You get a job here and next thing you know, AI is targeting you.

[–] [email protected] 7 points 1 year ago (2 children)

The serious question is: with what? I doubt the AI is going to kill you with disgusting pics or existential philosophy.

That means they hooked it up to something that could be used as a weapon: an industrial printer, or a t-shirt cannon, or a gunship at port.

[–] [email protected] 12 points 1 year ago (1 children)

> The serious question is with what? I doubt the AI is going to kill you with disgusting pics or existential philosophy.

We live in a digital world now. There's no need to physically harm someone. Or maybe the AI will file a fake complaint against someone and the cops will take care of that individual.

[–] [email protected] 1 points 1 year ago

Actually, the possibility of socially engineered SWAT attacks on targets is a valid point. I noted some years ago that there are hospital devices connected to the internet while in active use (such as devices that administer medications intravenously based on timing and user input). While such a setup could kill a patient by reprogramming the module, we've not yet seen an attack affect one.