this post was submitted on 30 Nov 2024
31 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


https://nonesense.substack.com/p/lesswrong-house-style

Given that they are imbeciles given, occasionally, to dangerous ideas, I think it’s worth taking a moment now and then to beat them up. This is another such moment.

top 39 comments
[–] [email protected] 19 points 3 weeks ago (2 children)

This is obviously insane, the correct conclusion is that learning models cannot in fact be trained so hard that they will always get the next token correct. This is provable, and it’s not even hard to prove. It’s intuitively obvious, and a burly argument that backs the intuition is easy to build.

You do, however, have to approach it through analogies, through toy models. When you insist on thinking about the whole thing at once, you wind up essentially just saying things that feel right, things that are appealing. You can’t actually reason about the damned thing at all.

this goes a long way towards explaining why computer pseudoscience — like a fundamental ignorance of algorithmic efficiency and the implications of the halting problem — is so common and even celebrated among lesswrongers and other TESCREALs who should theoretically know better
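The quoted claim is easy to make concrete. Here is a minimal toy sketch (mine, not from the linked essay): if the same context ever appears in a corpus with two different continuations, then no deterministic predictor, however hard it is trained, can achieve perfect next-token accuracy on that corpus.

```python
from collections import Counter, defaultdict

# Toy corpus: the same five-word context occurs twice with different next tokens.
corpus = [
    ("the cat sat on the", "mat"),
    ("the cat sat on the", "sofa"),
]

def best_possible_accuracy(pairs):
    """Upper bound on next-token accuracy for ANY deterministic context -> token map."""
    by_context = defaultdict(Counter)
    for context, nxt in pairs:
        by_context[context][nxt] += 1
    # The best a deterministic predictor can do is always guess the most common
    # continuation for each context.
    correct = sum(counts.most_common(1)[0][1] for counts in by_context.values())
    return correct / len(pairs)

print(best_possible_accuracy(corpus))  # 0.5 -- perfection is unreachable here
```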

[–] [email protected] 9 points 3 weeks ago* (last edited 3 weeks ago)

I am reminded of this excellent essay that I saved a while back: "your brain does not process information and it is not a computer"

[–] [email protected] 4 points 3 weeks ago (3 children)

I'm out of the loop: what is lesswrong and why is it cringe?

[–] [email protected] 16 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

It's complicated.

It's basically a forum created to venerate the works and ideas of that guy who, in the first wave of LLM hype, had an editorial published in TIME where he called for a worldwide moratorium on AI research and GPU sales, to be enforced with unilateral airstrikes, and whose core audience got there by being groomed by one of the most obnoxious Harry Potter fanfictions ever written, by said guy.

Their function these days tends to be to provide an ideological backbone of bad scifi justifications for deregulation and the billionaire takeover of the state, which among other things has made them hugely influential in the AI space.

They are also communicating vessels with Effective Altruism.

If this piques your interest, check the links in the sidebar.

[–] [email protected] 4 points 3 weeks ago (1 children)

They are also communicating vessels with Effective Altruism.

I have a basic understanding of what EA is but what do you mean by communicating vessels?

[–] [email protected] 12 points 3 weeks ago* (last edited 3 weeks ago)

EA started as an offshoot of LessWrong, and LW-style rationalism is still the main gateway into EA, as it's pushed relentlessly in those circles, while EA contributes vast amounts of money back into LW goals. The air-strikes-against-datacenters guy is basically bankrolled by Effective Altruism, and is also the reason EA considers magic AIs (so-called Artificial Super Intelligences) by far the most important risk to humanity's existence; they consider climate change mostly survivable and thus of far less importance, for instance.

Needless to say, LLM peddlers loved that (when they aren't already LW/EAs or adjacent themselves, like the previous OpenAI administrative board before Altman and Microsoft took over). edit: also the founders of Anthropic.

Basically you can't discuss one without referencing the other.

[–] [email protected] 13 points 3 weeks ago (3 children)

Rationalwiki (not affiliated with LW Rationalists, the opposite actually, op is a mod there) has a page on it. https://rationalwiki.org/wiki/Less_wrong

[–] [email protected] 7 points 3 weeks ago (3 children)

That sounds like a religion insisting it isn’t one

[–] [email protected] 10 points 3 weeks ago (1 children)

I think it is a little bit more complicated. I'm one of the few mentioning this, however, so it isn't a common idea I think. I think it isn't directly a cult/religion but, stealing the language of Silicon Valley, it is a cult incubator. Reading these things, having these beliefs about AGI and rationality, makes you more susceptible to joining or starting cult-like groups. The LessWrong article "every cause wants to be a cult" doesn't help, for example, and neither does it when they speak highly of the methods of Scientology. The various spinoffs and how many of these groups act cultlike and use cultlike shit make me think this.

So it is worse in a way.

[–] [email protected] 11 points 3 weeks ago (1 children)

There's also the communal living, the workplace polyamory along with the prominence of the consensual non-consensual kink, the tithing of the bulk of your earnings and the extreme goals-justify-the-means moralising, the emphasis on psychedelics and prescription amphetamines, and so on and so forth.

Meaning, while calling them a cult incubator is actually really insightful and well put, I have a feeling that the closer you get to TESCREAL epicenters like the SFB the more explicitly culty things start to get.

[–] [email protected] 9 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Yeah, but TESCREAL is a name we give them; they organise themselves into different groups (which fit into the term, yes). They have different parts of the TESCREAL, but it all ends up in culty behaviour, just a different cult.

Btw, see also the love bombing with Quantum Scott. There were also the weird LW people who ended up protesting other LW people in the crazy way (didn't it include robes or something? I don't recall much). Or calling Scottstar the rightful caliph when Yud was posting less.

So my point is more that they morph into different cults, and I wonder how much they use this lack of a singular cult as a way to claim they are not a cult. Or whatever rot13ed word they used for cult.

E: not that all this really matters in the grand scheme of things. just a personal hangup.

[–] [email protected] 12 points 3 weeks ago* (last edited 3 weeks ago)

whatever rot13ed word they used for cult.

It's impossible to read a post here without going down some weird internet rabbit hole, isn't it? This is totally off topic, but I was reading the comments on this old phyg post, and one of the comments said (seemingly seriously):

It's true that lots of Utilitarianisms have corner cases where they support action that would normally considered awful. But most of them involve highly hypothetical scenarios that seldom happen, such as convicting an innocent man to please a mob.

And I'm just thinking, riight highly hypothetical.

[–] [email protected] 7 points 3 weeks ago (1 children)

It is a peculiar sort of faith movement, where the central devotional practice is wandering around pulling made-up probability estimates out of one's ass

[–] [email protected] 4 points 3 weeks ago

and then posting walls of text about them not merely burying the lede but quite fully conspiring to eliminate the evidence and all witnesses in the same go, as a starting condition

[–] [email protected] 6 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

They do seem to worship Bayes

Edit: I want to qualify that I'm a big fan of Bayes Theorem — in my field, there's some awesome stuff being done with Bayesian models that would be impossible to do with frequentist statistics. Any scorn in my comment is directed at the religious fervour that LW directs at Bayesian statistics, not at the stats themselves.

I say this to emphasise that LWers aren't cringe for being super enthusiastic about maths. It's the everything else that makes them cringe.

[–] [email protected] 10 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

The particular way they invoke Bayes' theorem is fascinating. They don't seem to ever actually use it in any sort of rigorous way; it's merely a way to codify their own biases. It's an alibi for putting a precise percentage point on your vibes. It's kind of beautiful in a really stupid sort of way.
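For contrast, here is what an actual application of the theorem looks like, with made-up illustrative numbers: you commit to a prior and to likelihoods up front, and the posterior then follows mechanically, with no step where you get to assert whatever percentage feels right.

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E), with invented example numbers.
prior = 0.01            # P(H): base rate of the hypothesis
p_e_given_h = 0.90      # P(E | H): chance of seeing the evidence if H is true
p_e_given_not_h = 0.05  # P(E | not H): false-positive rate

# Total probability of the evidence, then the posterior.
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_e
print(round(posterior, 3))  # 0.154 -- strong-looking evidence, still far from certainty
```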

[–] [email protected] 10 points 2 weeks ago (1 children)

They take a theory that is supposed to be about updating one's beliefs in the face of new evidence, and they use it as an excuse to never change what they think.

[–] [email protected] 4 points 2 weeks ago

It's the Bayesian version of Zeno's paradox. Before one can update their beliefs, one must have evidence of an alternative proposition. But no one piece of evidence is worth meaningfully changing your worldview and actions. In order to be so it would need to be supported. But then that supporting evidence would itself need to be supported. And so on ad infinitum.

[–] [email protected] 6 points 2 weeks ago (2 children)

They seem to believe that stereotypes often have a grain of truth to them, and it's thus ok to believe stereotypes.

[–] [email protected] 9 points 2 weeks ago

"which stereotypes?"
"oh, you know the ones"

[–] [email protected] 6 points 2 weeks ago (1 children)

I would say it goes further and that they have a (pseudo?)magical trust in their own intuitions, as if they are crystal-clear revelations from the Platonic realms.

[–] [email protected] 6 points 2 weeks ago (1 children)

I will always remember Sam Bankman-Fried saying it's obvious that Shakespeare can't be the greatest author ever because it's unlikely. Just because something's unlikely doesn't mean it's impossible! You need to independently evaluate the evidence!

[–] [email protected] 3 points 2 weeks ago

Also I feel like the logic he based that on was just dumb. Like, some writer out of the last several centuries is going to be the best by whatever given metric. We shouldn't be surprised that any particular individual is the best any more than another. If anything, the fact that people still talk about him after all these centuries is probably the strongest argument in favor of his writing that you could make.

But of course Sam's real goal was to justify the weird rationalist talking point that reading is overrated because podcasts exist or something.

[–] [email protected] 5 points 3 weeks ago (1 children)

Ok rationalwiki actually seems like a really useful resource for reading up on which sexy new movements are bullshit and which aren't

[–] [email protected] 6 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

It is, but I would say that, as it is aligned with what I think about these folks. It is also a funny site in the way that a lot of these weirdos go "RationalWiki sucks, is not rational and lies!" before reading the pages they are mad about, and afterwards go "yeah, no, that is fair" after reading them. Happened quite a few times with the "skeptic" YT people in the YT'er-to-alt-right funnel/pipeline from a decade ago. (A few of these people have really lost the plot now; Armored Skeptic is now some believer in aliens, for example. I don't think anyone has cared enough about him to update his page, however.)

[–] [email protected] 8 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

RationalWiki really hits that sweet spot where everybody hates it, and you know that means it's doing something right:

From Prolewiki:

RationalWiki is an online encyclopedia created in 2007. Although it was created to debunk Conservapedia and Christian fundamentalism,[1] it is also very liberal and promotes anti-communist propaganda. It spreads imperialist lies about socialist states including the USSR[2] and Korea[3] while uncritically promoting narratives from the CIA and U.S. State Department.

From Conservapedia:

RationalWiki.org is largely a pro-SJW atheists website.

[ . . . ]

RationalWikians have become very angry and have displayed such behavior as using profanity and angrily typing in all cap letters when their ideas are questioned by others and/or concern trolls (see: Atheism and intolerance and Atheism and anger and Atheism and dogmatism and Atheism and profanity).[33]

From WikiSpooks (with RationalWiki's invitation for anyone to collaborate highlighted with an emotionally vulnerable red box for emphasis):

Although inviting readers to "register and engage in constructive dialogue", RationalWiki appears not to welcome essays critical of RationalWiki[3] or of certain official narratives. For example, it is dismissive of the Journal of 9/11 Studies, terming it, as of 2017, a "peer- crank-reviewed, online, open source pseudojournal".[4]

And a little bonus:

"Can I have Google discount my rationalwiki entry, has errors posted out of spite 10 years ago"

https://support.google.com/websearch/thread/106033064/can-i-have-google-discount-my-rationalwiki-entry-has-errors-posted-out-of-spite-10-years-ago?hl=en

My site questions Darwinism but that's become quite mainstream. But my rationalwiki page has over 20 references to me being a creationist, and is tagged "pseudoscience." Untrue

[–] [email protected] 4 points 3 weeks ago* (last edited 2 weeks ago)

Perfect.

Damn librals!

E: Saying Darwinism when you mean evolution is quite something btw. Ow god he also is ancient, from 1939.

[–] [email protected] 6 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Happened quite a few times with the “skeptic” YT people in the YT’er-to-alt-right funnel/pipeline from a decade ago. (A few of these people have really lost the plot now

I would love a separate thread on this, more generally a “late 2000s/early 2010s skeptic youtubers, where are they now?”. The only example (sorta*) I have is thunderf00t, whose yt career track is: anti-christianity, anti-anita sarkeesian, and now anti musk.

*he is not alt right, at least by any mainstream definition of alt-right, afaict.

[–] [email protected] 5 points 2 weeks ago

@swlabr

"anti-christianity, anti-anita sarkeesian, and now anti musk."

2 out of 3 ain't bad.

[–] [email protected] 5 points 2 weeks ago* (last edited 2 weeks ago)

I would love a separate thread on this, more generally a “late 2000s/early 2010s skeptic youtubers, where are they now?”

Personally I'm not going to waste much time on it; every time I see somebody post/make a vid about one of the older people it gets really sad and weird. Shadiversity (while not a skeptic) turned into a big weirdo (or well, went mask off), stuff like that.

E: I'm also not sure if YT is even still big, income-wise, for people, or if people go more to Twitch for livestreaming shit and then double dip by uploading edited streams to YT.

[–] [email protected] 10 points 3 weeks ago

They're basically fanboys of whatever the latest cult coming out of Silicon Valley is.

[–] [email protected] 15 points 3 weeks ago (2 children)

This is an interesting companion to that other essay castigating Rationalist prose, Elizabeth Sandifer's The Beigeness. The current LW style indulges in straight-up obscurantism and technobabble, which is probably better at hiding how dumb the underlying argument is and cloaking unsupported assertions as meaningful arguments. It also doesn't require you to be as widely-read as our favorite philosophy major turned psychiatrist turned cryptoreactionary, since you're not switching contexts every time it starts becoming apparent that you're arguing for something dumb and/or racist.

[–] [email protected] 8 points 3 weeks ago* (last edited 3 weeks ago)

This has always been the case. I think I first stumbled across less wrong in the early two thousands when I was a maths undergrad.

At this point it was mostly Eliezer writing extremely long blog posts about Bayesian thinking, and my take home was just, wow these guys are really bad at maths.

A good mathematician will carefully select the right level of abstraction to make what they're saying as simple as possible. LessWrong has always done the complete opposite: everything is full of junk details and needless complexity, in order to make it feel harder than it really is.

Basically, Eliezer needs an editor, and everyone who copies his style needs one too.

[–] [email protected] 6 points 3 weeks ago

Oh, nice! I stumbled across this essay ages ago and misplaced it due to forgetting to bookmark it. Thanks for bringing it back to my attention.

It is quite a beautiful thing to see Scott Alexander's beige technobabble eviscerated by such vibrant and incisive prose.

[–] [email protected] 11 points 3 weeks ago (1 children)

Such a good post. LWers are either incapable of critical thought or self scrutiny, or are unwilling and think verbal diarrhea is a better choice.

[–] [email protected] 11 points 3 weeks ago (1 children)

It's an ironic tragedy that the average LWer claims to value critical thought far more than most people do, and this causes them to do themselves a disservice by sheltering in an echo chamber. Thinking of themselves as both smart and special helps them to make sense of the world and their relative powerlessness as an individual ("no, it's the children who are wrong" meme.jpeg). Their bloviating is how they maintain the illusion.

I feel comfortable speculating because in another world, I'd be one of them. I was a smart kid, and building my entire identity around that meant I grew into a cripplingly insecure adult. When I wrote, I would meander and over-hedge my position because I didn't feel confident in what I had to say; post-graduate study was especially hard for me because it required finding what I had to say on a matter and backing myself on it. I'm still prone to waffling, but I'm working on it.

The LW excerpts that are critiqued in the OP are so sad to me because I can feel the potential of some interesting ideas beneath all the unnecessary technobabble. Unfortunately, we don't get to see that potential, because dressing up crude ideas for a performance isn't conducive to the kinds of discussions that help ideas grow.

[–] [email protected] 5 points 2 weeks ago (1 children)

In the Going Clear documentary an author says that because Scientology was built by and for L. Ron Hubbard, people who follow Scientology are gradually moulded in his image and pick up his worst traits and neuroses. LessWrong was founded by a former child prodigy...

[–] [email protected] 6 points 2 weeks ago

...with a huge chip on his shoulder about how the system caters primarily to normies instead of specifically to him, thinks he has fat-no-matter-what genes and is really into rape play.