this post was submitted on 01 Sep 2023
11 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 2 years ago

Does anyone here know what exactly happened to lesswrong to become so cult-y? I had never seen or heard anything about it for years. Back in my day it was seen as that funny website full of strange people posting weird shit about utilitarianism, nothing cult-y, just weird. The article on TREACLES and this sub's mentions of lesswrong made me very curious: how did it go from people talking out of their ass for the sheer fun of "thought experiments" to a straight-up doomsday cult?
The one time I read lesswrong was probably in 2008 or so.

[–] [email protected] 8 points 1 year ago* (last edited 1 year ago) (1 children)

Only half joking: there was this one fanfic you see...

Mainly I don't think there was any one inciting incident beyond its creation: Yud was a one-man cult way before LW, and the Sequences actively pushed all the cultish elements required to lose touch with reality. (Fortunately, my dyslexic ass only got as far as the earlier bits he mostly stole from other people rather than the really crazy stuff.)

There was definitely a step-change around the time CFAR was created; it was basically a recruitment mechanism for the cult, and part of the reason I got anywhere physically near those rubes myself. An organisation made to help people be more rational seemed like a great idea, except it literally became EY/MIRI's personal sockpuppet. They would get people together in these fancy-ass mansions for their workshops and then tell them nothing other than AI research mattered. I think it was 2014/15 when they decided internally that CFAR's mission was to create more people like Yudkowsky. I don't think it's a coincidence that most of the really crazy cult stuff I've heard about happened after then.

Not that bad stuff didn't happen before either.^___^

[–] [email protected] 8 points 1 year ago (1 children)

I think it was 2014/15 when they decided internally that CFAR’s mission was to create more people like Yudkowsky

the real AI doom is an Eliezer cloning facility

[–] [email protected] 10 points 1 year ago

Truer words were never spoken, probably.

CFAR is the mind killer (because they kill you and replace you with a Yud clone).