Defending oneself from the cult accusation just makes it worse. Did you write a long explanation of why you are not a cult? Well, that’s exactly what a cult would do, isn’t it?
To be accused is to be convicted, because the allegation is unfalsifiable.
Trying to explain something draws more attention to the topic, and from the explanation people will notice only the keywords. The more complex an explanation you make, especially if it requires reading some of your articles, the worse it gets.
The best way to win is to avoid the topic.
Unfortunately, someone else can bring up this topic and be persistent enough to make it visible. (Did it really happen on a sufficient scale, or are we just inventing it in our own imagination?) Then, the best way is to make some short (not necessarily rational, but cached-thought convincing) answer and then avoid the topic. For example: “So, what exactly is that evil thing people on LW did? Downvote someone’s forum post? Seriously, guys, you need to get some life.”
And now, everybody stop worrying and get some life. ;-)
It could also help to make the site seem a bit less serious. For example, put more emphasis on instrumental rationality on the front page. People discussing the best diet habits don’t seem like a doomsday cult, right?
The Sequences could be recommended somewhat differently, for example: “In this forum we sometimes discuss some complicated topics. To make the discussion more efficient and avoid endlessly repeating the same arguments about statistics, evolution, quantum mechanics, et cetera, it is recommended to read the Sequences.” Not like ‘you have to do this’, but rather like ‘please read the FAQ’. Also in discussion, instead of “read the Sequences” it is better to recommend one specific sequence, or one article.
Relax, be friendly. But don’t hesitate to downvote a stupid post, even if the downvotee threatens to accuse you of whatever.
I’m having trouble thinking up examples of cults, real or fictional, that don’t take an interest in what their members eat and drink.
I don’t think the best way to win is to avoid the topic. A healthy discussion of false impressions and how to correct them, or other failings a group may have, is a good indication to me of a healthy community. This post for example caused my impression of LW to increase somewhat, but some of the responses to it have caused my impression to decrease below its original level.
Then let’s discuss “false impressions”, or even better “impressions” in general, not focusing on cultishness, which cannot even be defined (because there are so many different kinds of cults). If we focus on making things right, we do not have to discuss a hundred ways they could go wrong.
What is our community (trying to be) like?
Friendly. In more senses of the word: we speak about ethics, we are trying to make a nice community, we try to help each other become stronger and win.
Rational. Instead of superstition and gossip, we discuss how and why things really happen. Instead of happy death spirals, we learn about the world around us.
Professional. By that I do not mean that everyone here is an AI expert, but that the things we do and value here (studying, politeness, exactness, science) are things that for most people correlate with their jobs rather than their free time. Even when we have fun, it’s adult people having fun.
So where exactly in the space of human organizations do we belong? Which of the cached thoughts can best be applied to us? People will always try to fit us to some existing model (for example: cult), so why not choose this model rationally? I am not sure, but “educational NGO” sounds close. Science, raising the sanity waterline, et cetera. By seeming like something well-known, we become less suspicious, more normal.
This.
Seriously, we need to start doing all the stuff recommended here, but this is perhaps the simplest and most immediate. Someone go do it.
No. The best way is to not be a cult. Since cults are the most efficient known ways of engendering irrational bias, and since LW’s mission statement is the avoidance of irrationality and bias, that is something LW should be doing anyway.
Here’s how:
Don’t have a leader
Don’t have a gospel
Don’t have a dogma
Don’t have quasi-religious “meetups”
Don’t have quasi-religious rituals (!)
Don’t have an eschatology
Don’t have a God.
WELCOME CRITICISM AND DISSENT
Summary:
Don’t be a religion of rationality. Be the opposite of a religion.
Reversed stupidity is not intelligence.
Worship of Yud-Suthoth is not rationality.
… and?
Intoning that “reversed stupidity is not intelligence” is not going to switch any LessWronger’s brain back on.
You misunderstand me. No-one is worshiping Yud-Suthoth and calling it rationality. You proposed, in essence, that we disregard everything connected with religion—this is precisely the fallacy “reversed stupidity is not intelligence” is intended to address. When this was pointed out, you responded with what can only be charitably interpreted as meaning “but religions are irrational!” which isn’t really addressing his point, or indeed any point, since no-one is proposing starting a religion or for that matter joining one.
EDIT: sodding karma toll.
Something like that is happening. For instance, sending people off to the Sequences to find The Answer, when the Sequences don’t even say anything conclusive.
That’s just a slogan, not some universal law.
The opposite of most sorts of stupid is still stupid. Particularly most things that are functional enough to proliferate themselves successfully.
If you meant “Have more than one leader” you’d be on to something. That isn’t what you meant though.
There is a difference between the connotations you are going with for ‘gospel’ and what amounts to a textbook that most people haven’t read anyway.
I sometimes wish people would submit to references to rudimentary rational, logical, decision-theoretic or scientific concepts as if they were dogma. That is far from what I observe.
Socialize in person with rudimentary organisation? Oh the horror!
Actually, I don’t disagree at all on this one. Or at least I’d prefer that anyone who was into that kind of thing did it without it being affiliated with lesswrong in any way except partial membership overlap.
Are you complaining (or shaming with labels the observation) that an economist and an AI researcher attempted to use their respective expertise to make predictions about the future?
Don’t. Working on it...
Most upvoted post. Welcome competent, sane or useful criticism. Don’t give nonsense a free pass just because it is ‘dissent’.
How do you know? Multiple leaders at least dilute the problem.
I’ve read it. There’s some time I’ll never get back.
Not what I meant. Those can be studied anywhere. “MWI is the correct interpretation of QM” is an example of dogma.
Other rationalists manage without it.
No, I am referring to mind-killing aspects of the mythos: it fools people into thinking they are Saving the World. This sense of self-importance is yet another mind-killer. Instead of examining ideas dispassionately, as they should, they develop a mentality of “No, don’t take my important world-saving role away from me! I cannot tolerate any criticism of these ideas, because then I will go back to being an ordinary person.”
Is this nonsense?
It contains five misspellings in a single paragraph: “utimately”, “canot”, “statees”, “hvae”, “ontoogical”, which might themselves be enough for a downvote, regardless of content.
As for the is-ought problem, if we accept that “ought” is just a matter of calculations in our brain returning an output (and reject that it’s a matter of e.g. our brain receiving supernatural instruction from some non-physical soul), then the “ought” is describable in terms of the world-that-is, because every algorithm in our brain is describable in terms of the world-that-is.
It’s not a matter of “cramming” an entire world-state into your brain: any approximation that your brain is making, including any self-identified deficiency in the ability to make a moral evaluation in any particular situation, is also encoded in your brain (your current brain, not some hypothetical superbrain).
But we shouldn’t accept that, because we can miscalculate an “ought” or anything else. The is-ought problem is the problem of correctly inferring an ought from a tractable amount of “is’s”.
It perhaps might be one day given sufficiently advanced brain scanning, but we don’t have that now, so we still have an is-ought gap.
The is-ought problem is epistemic. Being told that I have an epistemically inaccessible black box in my head that calculates oughts still doesn’t lead to a situation where oughts can be consciously understood as correct entailments of is’s.
One way to miscalculate an “ought” is the same way that we can miscalculate an “is”—e.g. lack of information, erroneous knowledge, false understanding of how to weigh data, etc.
And also, because people aren’t perfectly self-aware, we can mistake mere habits or strongly-held preferences for the outputs of our moral algorithm, the same way that e.g. a synaesthete might perceive the number 8 to be colored blue, even though there’s no “blue” light frequency striking the optical nerve. But that sort of thing doesn’t seem like a very deep philosophical problem to me.
We can correct miscalculations where we have a conscious epistemic grasp of how the calculation should work. If morality is a neural black box, we have no such grasp. Such a neural black box cannot be used to plug the is-ought gap, because it does not distinguish correct calculations from miscalculations.
Leaders are useful. Pretty much every cause/movement/group has leadership of some kind.
I’m not really sure how the sequences map onto the Christian Gospel. A catechism, maybe.
Assuming we don’t excommunicate people for disagreeing with it (politely), I’m not sure why not. I mean, we mostly agree that there’s no God, for example; rationality should, presumably, move us closer to the correct position, and if most of us agree that we’ve probably found it, why shouldn’t we assume members agree unless they indicate otherwise?
Or did you have a different meaning of “dogma” in mind?
Because meeting people with similar interests and goals is only done via religion.
Has anyone who’s not a member of this site actually used those rituals as evidence of phygishness? Genuinely asking here.
Because any idea that predicts the end of the world must be discarded a priori?
Because any idea you place in the reference class “god” must be discarded a priori?
An excellent suggestion! In theory, we already do (we could probably do better on this.) Trolling, however, is not generally considered part of that.
I’m not even going to bother linking to the appropriate truism, but reversed stupidity etc.
EDIT: dammit stupid karma toll cutting off my discussions.
Leaders cause people to lapse into thinking “The Guru has an answer, even if I don’t understand it”. This is already happening on LW.
People say “The answer is in the Sequences” without bothering to check that it is.
Rationalists should think and argue. However, LWers just say “this is wrong” and downvote.
Other rationalists manage without them. LWers aren’t aware of how religious they seem.
Because it fools people into thinking they are Saving the World. This sense of self-importance is yet another mind-killer. Instead of examining ideas dispassionately, as they should, they develop a mentality of “No, don’t take my important world-saving role away from me! I cannot tolerate any criticism of these ideas, because then I will go back to being an ordinary person.”
See above. It leads to over-estimation of individual importance, and therefore emotional investment, and therefore mind-killing.
I’ll say
“Trolling” is the blind dogmatist’s term for reasoned criticism.
I’m not even going to bother linking to the appropriate truism, but reversed stupidity etc.
Stupidity is stupidity, too.