this post was submitted on 20 Nov 2023
109 points (100.0% liked)

technology

[–] [email protected] 22 points 11 months ago* (last edited 11 months ago) (2 children)

If I understand the conflict in OpenAI correctly, it's a schism between the folks who actually believe in the Skynet threat (led by chief scientist Ilya Sutskever) and those who (correctly) understand that the Skynet fears are a marketing tool (Altman and Microsoft).

I always knew that Microsoft was going to cannibalize OpenAI, but I assumed it would be by taking over the infrastructure/IP and booting Altman. It looks like it's the other way around, with them cannibalizing OpenAI's staff: Hundreds of OpenAI employees threaten to resign and join Microsoft. According to that article, OpenAI has 700 employees, and nearly all of them are threatening to join Microsoft.

[–] [email protected] 4 points 11 months ago* (last edited 11 months ago) (1 children)

If I understand the conflict in OpenAI correctly, it's a schism between the folks who actually believe in the Skynet threat (led by chief scientist Ilya Sutskever)

I'm personally quite glad that the scientists have a serious perspective on this. At the very least it means they might withhold their labour if they deem it unsafe at any time.

It's far too early for it to be unsafe, and you're correct that it's marketing at the moment, but it's still good that they take it seriously enough to be willing to break companies over it. That they're already fighting this hard bodes well for when things get into genuinely dangerous territory.

[–] [email protected] 5 points 11 months ago

The problem, though, is that this is pure idealism on the part of the scientists. There's no way anything approaching AGI can be kept under wraps by some scientists, no matter how benevolent they consider themselves. And given our current economic structure, once that cat is out of the bag it'll be hell for the rest of us.

I've seen a lot of talk on the orange site about AI doomerism. I'm not a doomer about AI; I'm a doomer about our society being the wrong structure to handle it.

[–] [email protected] 1 points 11 months ago

On one side we have eugenicist doomsday cultists, and on the other we have just your normal opportunist techbro (who's also a eugenicist).