Eliezer addressed this in part with his “Death Spiral” essay, but there are some features of LW/SI that are strongly correlated with cultishness, beyond the ones Eliezer mentioned, such as fanaticism and following the leader:
Having a house where core members live together.
Asking followers to completely adjust their thinking processes, adopting new essential concepts and terminology down to the most basic level of understanding reality.
Claiming that only if you carry out said mental adjustment can you really understand the most important parts of the organization’s philosophy.
Asking for money for a charity, particularly one which does not quite have the conventional goals of a charity, and claiming that one should really be donating a much larger percentage of one’s income than most people donate to charity.
Presenting an apocalyptic scenario including extreme bad and good possibilities, and claiming to be the best positioned to deal with it.
[Added] Demanding that you leave any (other) religion.
Sorry if this seems over-the-top. I support SI. These points have been mentioned, but has anyone suggested how to deal with them? Simply ignoring the problem does not seem to be the solution; nor does loudly denying the charges; nor changing one’s approach just for appearances.
Perhaps consider adding the high fraction of revenue that ultimately goes to paying staff wages to the list.
Oh yes, and the fact that the leader wants to SAVE THE WORLD.
About a third in 2009, the last year for which we have handy data.
Practically all of it goes to them or their “associates”, by my reckoning. In 2009 some was burned on travel expenses and accommodation, some was invested, and some was stolen.
Who was actually helped? Countless billions in the distant future—supposedly.
What else should it go to? (Under the assumption that SI’s goals are positive.)
As Larks said above, they are doing thought work: they are not trying to ship vast quantities of food or medical supplies. The product of SI is the output of its researchers; the only way to get more output is to employ more people (modulo improving the output of the current researchers, but that is limited).
So, to recap, this is a proposed part of a list of ways in which the SIAI resembles a cult. It redistributes economic resources from the “rank and file” members up the internal hierarchy without much expenditure on outsiders, just like many cults do.
(Eh. Yes, I think I lost track of that a bit.)
Keeping that in mind: SI has a problem, because acting to avoid the appearance of existing merely to funnel money to the upper ranks means that it can’t pay its researchers. There are three broad classes of solutions to this (that I can see):
Give staff little to no compensation for their work
Use tricky tactics to try to conceal how much money goes to the staff
Try to explain to everyone why such a large proportion of the money goes to the staff
All of those seem suboptimal.
Why was this downvoted instead of responded to? Downvoting people who are simply stating negative impressions of the group doesn’t improve impressions of the group.
Most organizations spend most of their money on staff. What else could you do with it? Paying fellowships for “external staff” is a possibility. But in general, good people are exactly what you need.
Often goods or needy beneficiaries are also involved. Charity spending is sometimes classified into:
Program Expenses
Administrative Expenses
Fundraising Expenses
This can be used as a heuristic for identifying good charities.
Not enough in category 1 and too much in categories 2 and 3 is often a bad sign.
But they’re not buying malaria nets, they’re doing thought-work. Do you expect to see an invoice for TDT?
Quite apart from the standard complaint about how awful a metric that is.
And yet there are plenty of things that don’t cost much money that they could be doing right now, that I have previously mentioned to SIAI staff and will not repeat (edit: in detail) because it might interfere with my own similar efforts in the near future.
Basically I’m referring to public outreach, bringing in more members of the academic community, making people aware that LW even exists (I wasn’t except when I randomly ran into a few LWers in person), etc.
What’s the reason for downvoting this? Please comment.
As I’ve discussed with several LWers in person, including some staff and visiting fellows, one of the things I disliked about LW/SIAI was that so much of the resources of the organization go to pay the staff. They seemingly wouldn’t even consider proposals to spend a few hundred dollars on other things because they claimed it was “too expensive”.
Add:
Leader(s) are credited with expertise beyond that of conventional experts, in subjects they are not conventionally qualified in.
Studying conventional versions of subjects is deprecated in favour of in-group versions.
Also:
Associated with non-standard and non-monogamous sexual practices.
(Just some more pattern-matching on top of what you see in the parent and grandparent comment. I don’t actually think this is a strong positive indicator.)
The usual version of that indicator is “leader has sex with followers”
One fundamental difference between LW and most cults is that LW tells you to question everything, even itself.
Most, but not all. The Randians come to mind. Even the Buddha encouraged people to be critical, but that doesn’t seem to have stopped the cults. I was floored to learn a few weeks ago that Buddhism has formalized even the point at which you stop doubting! When you stop doubting, you become a Sotāpanna; a Sotāpanna is marked by abandoning ‘three fetters’, the second fetter according to Wikipedia being
Skeptical Doubt—Doubt about the Buddha and his teaching is eradicated because the Sotāpanna personally experiences the true nature of reality through insight, and this insight confirms the accuracy of the Buddha’s teaching.
As well, as unquestioningness has become a well-known trait of cults, cults tend to try to hide it. Scientology hides its craziest dogmas until you’re well and hooked, for example.
If the Randians are a cult, LW is a cult.
Like the others, the members just think it’s unique in being valid.
If a person disagrees with Rand about a number of key beliefs, do they still count as a Randian?
If they don’t count as an Orthodox Randian, they can always become a Liberal Randian
That depends for the most part on what “a number of key beliefs” is.
Could you elaborate on this?
So there comes a point in Buddhism where you’re not supposed to be skeptical anymore. And Objectivists aren’t supposed to question Ayn Rand.
Would it be productive to be skeptical about whether your login really starts with the letter “M”? Taking an issue off the table and saying, we’re done with that, is not in itself a bad sign. The only question is whether they really do know what they think they know.
I personally endorse the very beginning of Objectivist epistemology—I mean this: “Existence exists—and the act of grasping that statement implies two corollary axioms: that something exists which one perceives and that one exists possessing consciousness, consciousness being the faculty of perceiving that which exists.” It’s the subsequent development which is a mix of further gemlike insights, paths not taken, and errors or uncertainties that are papered over.
In the case of Buddhism, one has the usual problem of knowing, at this historical distance, exactly what psychological and logical content defined “enlightenment”. One of its paradoxes is that it sounds like the experience of a phenomenological truth, and yet the key realization is often presented as the discovery that there is no true self or substantial self. I would have thought that achieving reflective consciousness implied the existence of a reflector, just as in the Objectivist account. Then again, reflection can also produce awareness that traits with which you have identified yourself are conditioned and contingent, so it can dissolve a naive concept of self, and that sounds more like the Buddhism we hear about today. The coexistence of a persistent observing consciousness, and a stream of transient identifications, in certain respects is like Hinduism; though the Buddhists can strike back by saying that the observing consciousness is not eternal and free of causality, it too exists only if it has been caused to exist.
So claims to knowledge, and the existence of a stage where you no longer doubt that this really is knowledge, and get on with developing the implications, do not in themselves imply falsity. In a systematic philosophy based on reason, a description which covers Objectivism, Buddhism, and Less-Wrong-ism, there really ought to be some notion of a development that occurs as you learn.
The alternative is Zen Rationalism: if you meet a belief on the road (of life), doubt it! It’s a good heuristic if you are beset by nonsense, and it even has a higher form in phenomenological or experiential rationalism, where you test the truth of a proposition about consciousness by seeing whether you can plausibly deny it, even as the experience is happening. But if you do this, even while you keep returning to beginner’s mind, you should still be dialectically growing your genuine knowledge about the nature of reality.
There seems to be some detailed substructure there—which I go over here.
Not just a cult—an END OF THE WORLD CULT.
My favourite documentary on the topic: The End of The World Cult.