You are clearly not capable of thinking rationally with respect to a fundamental belief where evidence makes the question overdetermined. Why should I listen to you? Especially since, if you do start thinking coherently without discarding the absurd premise, it will lead you to do, and advocate, things that are potentially significantly detrimental to my goals.

To make it easier to answer, we could rephrase the question in the third person: “Wedrifid believes fundamental premise X. Calcsam has a very different fundamental premise Y which gives him different goals and different conclusions. This being the case, how should wedrifid respond to behavioural exhortations given by calcsam on a rationalist blog? If wedrifid believed that all calcsam’s reasoning was sound except that which produced belief Y, how would that change wedrifid’s incentives?”

(‘Why should I listen to you?’ is still the basic question. The above just gives background detail on how it is relevant.)
People who hold obviously incorrect beliefs can still be highly intelligent and productive:
Peter Duesberg (a professor of molecular and cell biology at the University of California, Berkeley) “claimed that AIDS is not caused by HIV, which made him so unpopular that his colleagues and others have — until recently — been ignoring his potentially breakthrough work on the causes of cancer.”
Francisco J. Ayala, who has been called the “Renaissance Man of Evolutionary Biology”, is a geneticist ordained as a Dominican priest. His “discoveries have opened up new approaches to the prevention and treatment of diseases that affect hundreds of millions of individuals worldwide…”

Francis Collins (geneticist, Human Genome Project), noted for his landmark discoveries of disease genes and his leadership of the Human Genome Project (HGP), and described by the Endocrine Society as “one of the most accomplished scientists of our time”, is an evangelical Christian.
Georges Lemaître (a Belgian Roman Catholic priest) proposed what became known as the Big Bang theory of the origin of the Universe.
Kurt Gödel (logician, mathematician and philosopher), who suffered from paranoia and believed in ghosts. “Gödel, by contrast, had a tendency toward paranoia. He believed in ghosts; he had a morbid dread of being poisoned by refrigerator gases; he refused to go out when certain distinguished mathematicians were in town, apparently out of concern that they might try to kill him.”
There are many more examples. All of them are outliers indeed, and I don’t think that calcsam has been able to prove that his achievements and general capability to think clearly in some fields outweigh the heavy burden of being religious. Yet there is evidence that such people do exist, and he offers you the chance to challenge him.

Generally I agree with you, but I also think that calcsam provides a fascinating example of the internal dichotomy of some human minds and a case study that might provide insights into how the arguments employed by Less Wrong fail in some cases.
I think these kinds of list should always include Donald E. Knuth.
Maybe we should make a list on the wiki? E.g. I’m tempted to add Aumann, but as pointed out, ‘There are many more examples’ and XiXiDu made his point with the short list.
I made the list at http://wiki.lesswrong.com/wiki/Irrationalists
More suggestions welcome. I think I’m going to make a Discussion article on this to get a little more visibility.
And one of the concerns I detected in wedrifid’s comment (one I share myself) is that if highly intelligent and productive people start doing what obviously incorrect beliefs indicate they should, the world is going to be optimised in a direction I won’t like.
I kind of think that’s already happening. All over the place. All the time. What kind of policy implications did you want to draw from it in this particular instance?
My inclination would be to discourage posts with undertones of religious propaganda on this site.
Hmm, what policy...
No amount of clear thinking elsewhere can excuse you from being wrong about this one thing. To think so is to treat being right and wrong like a social game, where people with high status get a free pass on questions with actual answers.
Could you please be more specific? What sort of action is being taken here as a result of your worry?
Not voting for religious candidates for Australian Parliament elections.
And one of the concerns I detected in wedrifid’s comment (one I share myself) is that if highly intelligent and productive people start doing what obviously incorrect beliefs indicate they should, the world is going to be optimised in a direction I won’t like.

Exactly! If beliefs like this are just used as verbal symbols for navigating the social world they do relatively minor harm. Once someone comes along with the intelligence, productivity and otherwise rational thinking necessary to follow the belief to its logical conclusion, things start exploding. Or rationalist communities become modified in a direction that makes them either less pleasant or less effective than I would prefer.
Good reminder that reversed stupidity is not intelligence.
Adding to the list: Hans Berger invented the EEG while trying to investigate telepathy, which he was convinced was real. Even fools can make important discoveries.
But increasing one’s foolishness does not increase the expected rate of discovery.
I don’t think that examples of people with fundamental, irrational beliefs being good at other things are relevant—calcsam has invited questions specifically about the belief whose rationality is being examined. If he were starting a discussion about mathematics and his points were dismissed due to his Mormon affiliation, your comment would make more sense to me.
I think, though, that holding crazy beliefs is Bayesian evidence for the hypothesis that a person is not a remarkable intellectual contributor to humanity. Wedrifid’s “why should I listen to you?” is thus not addressed head-on by a list of crazy people who happened to achieve other worthy stuff.

If we had no other information about calcsam besides eir religious beliefs, and e were only one of many people potentially worth listening to, and we were processing those many in bulk to try to decide which of them to investigate more closely (and more expensively), then this would be a useful low-cost filter.

However, I don’t think it’s enough evidence to overcome the other things we do know about em: that e’s posting on LW, that e’s responding in a generally clear and intelligent manner, etc.
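To put rough numbers on the filter-versus-evidence reasoning above, here is a toy odds calculation (a minimal sketch in TypeScript; every figure in it is an invented assumption, not a measurement):

    // Toy Bayesian odds update: a crazy belief is evidence against
    // "worth listening to", but further evidence can outweigh it.
    function oddsToProbability(odds: number): number {
      return odds / (1 + odds);
    }

    // All numbers below are illustrative assumptions.
    const priorOdds = 0.05;     // prior odds that a random commenter merits close attention
    const crazyBeliefLR = 0.2;  // likelihood ratio for holding an obviously false belief
    const clearPostingLR = 30;  // likelihood ratio for posting clearly and intelligently

    const afterBulkFilter = priorOdds * crazyBeliefLR;           // odds after the low-cost filter
    const afterOtherEvidence = afterBulkFilter * clearPostingLR; // odds after the rest of the evidence

    console.log(oddsToProbability(afterBulkFilter).toFixed(2));    // "0.01": screened out in bulk
    console.log(oddsToProbability(afterOtherEvidence).toFixed(2)); // "0.23": worth a closer look

The point of the arithmetic is only that a single unfavourable likelihood ratio makes a sensible bulk filter, yet it is easily overwhelmed once individual evidence is available.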
A policy of ignoring people who disagree with you seems like a good way to never notice that you’re wrong. And you are wrong—not necessarily about this particular question, but of all the things you believe there’s pretty much guaranteed to be at least one false idea. I’d even go so far as to say that there’s probably at least one very important wrong idea in there.
In my opinion, listening to people like calcsam—intelligent people who disagree with me—is one of the most plausible vectors for finding out that I’m wrong about something.
So XiXiDu’s negative quotes file is not limited to just Eliezer.
Too adversarial.
No, and I take a mild degree of offence at the accusation. This is ‘Ask Me Anything’ taken literally; it is exactly what the ‘elephant in the room’ is. I am being frank, not adversarial, and given calcsam’s experiences and the emotional resilience that he would have needed to develop while evangelizing, I know I don’t have to tiptoe through a minefield to protect his feelings.

If I am obliged to maintain a social facade even in a thread specifically created for asking this question, then the only real recourse I would have is to do whatever is appropriate to eliminate the necessity for me to speak bullshit (or act in a misleading way that is analogous to bullshit).
I do not object to the subject of your question, but the way you put it. I think this

You are clearly not capable of thinking rationally with respect to a fundamental belief where evidence makes the question overdetermined.

Is what I was reacting to.
Presumably, he disputes that, so for the purposes of your conversation it is not ‘clear’. Phrasing this same sentiment as ‘I do not believe you are capable of thinking rationally …, and you will have to convince me otherwise before I listen to you’ or something along those lines would be a less adversarial way of asking this question. For example, I think Costanza asks roughly the same question below in a frank way.
I differ in that I do object to the subject of User:wedrifid’s question, in particular, the part you just excerpted.
If being B1 refuses to update to being B2’s beliefs on account of B2 being stupid, and this judgment of B2’s stupidity, in turn, is based solely on B2 satisfying B1 =/= B2, then B1 is “begging the question” (assuming a conclusion in order to prove it).

There are very good arguments to reject religious beliefs; however, when one uses the argument that an exponent of one of them is stupid because they so believe and therefore must not be worth listening to, then one has desensitized one’s worldmodel to evidence, locking in any errors one currently subscribes to—and this remains true even if B2 is pure error.
No belief system or decision theory can be judged solely relative to itself; otherwise, it would be impossible to change one’s beliefs or decision theory. Because the fact that one possesses a belief system is not definitive evidence of its truth, any belief system must permit situations in which it would update, or else it will indefinitely reproduce the same errors under reflection.
User:wedrifid makes the error in this statement, no matter how well its phrasing is changed to accord with human customs and status systems:

You are clearly not capable of thinking rationally with respect to a fundamental belief where evidence makes the question overdetermined.
It isn’t based solely on that—that is what “where evidence makes the question overdetermined” means.
User:jsalvatier expressed an objectionable opinion, made a (very mildly) offensive accusation and used dubiously selective quoting for the purpose of supporting his argument. Yet Clippy is wrong as a simple matter of fact, which is far worse. The parent presents a straw man. Clippy has made an error while parsing the comment text.
An incorrect processing of language and concepts by Clippy is evidence against the possibility of Clippy gaining dominance of the world and light cone. This lowers the threat of potential punishment or reprisal by Clippy if I do things that destroy paperclips. As such the probability that I destroy my paperclips to, for example, create lockpicks has increased.
Show how the position I attributed to you differs from the position you actually took.
This is a form of question that is usually unreasonable to ask. It places a burden on the recipient of the straw man: trying to guess what on earth the speaker was thinking that made them conflate the two positions in the first place. It is a rare instance where, the worse the misrepresentation is, the harder it is to demonstrate exactly why. Sometimes you just have to say “Yes, I know Chewbacca is a Wookiee, but why on earth do you think that means he’s a scarecrow?”
In this case I can at least point to some of the bits that don’t match.
‘Refuses to update’ doesn’t come into it. “Questioning the expected value of listening to advice from” would fit, or even “Considering the possibility that absorbing the advice of someone with different values could result in net disutility”.

The ‘begging the question’ part verges on ‘too nonsensical for a diff to even produce compression’ (i.e. they are just two completely different things). A recursive evaluation from the implausibility of the Mormon beliefs, to questionable thinking, and back to the Mormon beliefs being implausible just isn’t going on. Calcsam has been rather careful not to (look like he is trying to) persuade people about his brand of religion. The relevance in terms of epistemic value would be from a possible association between the beliefs of a religious group and the beliefs of one of their missionaries about how rationalists should behave.

Not only did I not beg the question, I didn’t even privilege the hypothesis enough to ask it. I don’t go around thinking “I have no particular evidence singling it out from all the other superstitions, but what if the Mormon spinoff religion is the ultimate source of Truth?”

If you perceive it as unreasonable to be asked to explain how your position differs from the one attributed to you, then you almost certainly have insufficient grounds to accuse others of strawmanning. If you really are being strawmanned, you can just say, “I said XY. You claimed I just said X.” Since there is no such difference you can point to, you should have been extremely hesitant to diagnose errors you feel I made as being type:strawman.
(Strangely, you seem to think that the bigger the difference, the more unreasonable the request for proof of strawmanning, as when you say “too nonsensical for a diff to even produce compression”—a diff failing to produce compression would make your job easier and your claim stronger!)
The distinction between the two is not large enough to justify claiming that my point was irrelevant at strawman level. Whether you are refusing to update, or refusing to listen to things on the basis that they are intended to persuade you to update, is irrelevant, and the fact that my argument specifically called out only one of those does not thereby make it a strawman.
It is not enough that I failed to use a full blockquote of your remarks; there must be substantive mis-attribution before a strawmanning claim is justified.
Whether or not you begged the question is irrelevant to your claim of being strawmanned. That you begged the question was an argument I made. Proving that you didn’t beg the question would do nothing to prove I misrepresented your position—only that my argument regarding your position is wrong.
You seem to be making the common human error of equating, “You made arguments against my position I find to be in error” with “you responded to a position I never took.”
It is unfortunate that we cannot spend more time at the object level, since this baseless charge of misattribution must be resolved first. Please do not make such claims in the future unless you can prove them with “I said XY. You claimed I just said X” or something of similar simplicity. Rather, focus on the object level without bringing in the additional distraction of whether you were misrepresented.
I disagree with most of what you are saying here and, evidently, do not share your mode of thought. I hope you agree that us conversing further would do more harm than good. I think I preferred it when you stuck to “I like paperclips and MS Word” joke reruns.
OK.
By the way, User:Jasen is racist and so didn’t admit me to the rationalist bootcamp.
It could also be that Jasen simply prefers humans who apply sincerely over humans who send applications based on a joke account persona when it comes to allocating training resources. That is probably not an unusual prejudice.
Then why did User:Jasen advance me to Part 2 of the process?
Phrasing this same sentiment as ‘I do not believe you are capable of thinking rationally …, and you will have to convince me otherwise before I listen to you’

Ironically, those suggestions convey a worse picture of the opening poster and declare a stricter requirement for what it would take for me to listen. My observation clearly indicated, both in the quote you made and in my following paragraph, that the flawed thinking is with respect to the religious belief. Further, I don’t think (and didn’t suggest) that the OP would need to convince me of a specific kind of rational thinking in order for him to be worth listening to. Instead I gave him a platform from which to enumerate reasons. The best of those reasons would actually speak of potential instrumental value and not epistemic awesomeness.

Adding “I do not believe” before a statement is actually just redundant, a kind of false humility. Eliezer actually wrote a post that touched on this specifically; does anyone recall the reference?
You could be thinking of Qualitatively Confused—though that post is mostly about how ‘believe’ is not quite redundant.
Yes. But the reason why we should listen to him is self-evident. He has written things that are valuable. If he maintains his interest in the community here, and the quality is good, he could be a value-multiplier. A catalyst. His writing here is the intersecting part of a Venn diagram, his interests overlapping with Less Wrong.
His allusions to his missionary work are provoking an immune response from many here, including me (not that I write much). I think this is why (from a quote thread):

What frightens us most in a madman is his sane conversation.

—Anatole France
I have not been particularly bothered by the missionary allusions but obviously don’t consider the posts nearly as valuable as you do. There is an undesirable emphasis on norms and a constant pressure to move things in the direction of ‘making the group do set projects’ and ‘consensus’. This isn’t an organisation, it’s a blog.
Some of us would like a %$^&ing organization, pardon my French.
You have one.
Injecting LW with a pint of blood from a religious Behemoth will not give you another organisation, charged up with the power of divine effectiveness. It’ll cause an autoimmune disease, doing serious neurological damage and causing externally visible disfigurement (unnecessarily cultish vibe), scaring healthy potential recruits away.
If you want to actually enhance the potential practical effectiveness of LW and LW spinoff communities, instead take the quickening of an entrepreneur. Or at the very least, track down and feast on the essence of a successful business professional and an economist or two.
Food for Thought: Holy Books usually don’t get implemented at all. Which is usually a good thing. What mainstream religious authorities do when ‘implementing Holy Books’ is something quite different from implementing holy books—and not something that is necessarily desirable to emulate.
For the sake the question you could answer as though it is something like “given that wedrifid believes X thing that I don’t believe how should he behave?”

I completely failed to parse this sentence (and so didn’t really understand the next one either.) Could you try phrasing it another way and/or correcting typos, if they’re in there?
I edited the paragraph. The meaning is approximately the same but far clearer.
Referring to the original poster, wedrifid wrote:

Why should I listen to you?
In part because LessWrong has no kill-file or kill-filter technology.
The Greasemonkey script to implement that functionality is steadily moving towards the top of the todo list. :p
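For what it is worth, a minimal sketch of what such a kill-file userscript might look like, written in TypeScript for illustration. The “.comment” and “.author” selectors and the username in the list are assumptions made up for the example; the real LessWrong markup may differ:

    // Hide comments whose author appears in a personal kill-file.
    // Selectors and the listed username are assumptions for illustration.
    const killFile = new Set<string>(["example_user"]);

    document.querySelectorAll<HTMLElement>(".comment").forEach((comment) => {
      const author = comment.querySelector(".author")?.textContent?.trim() ?? "";
      if (killFile.has(author)) {
        comment.style.display = "none"; // drop the comment from view
      }
    });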