Alright! A few points that I sort of disagree with, or feel were omitted from the essay. I'm being skeptical, not a cultist at all!
My fears aren’t really that you’re trying to foster a cult, or that it’s cultish to agree with you. I got worried when you said that you wanted more people to vocalize their agreement with you and actually work towards having a unified rationalist front. For some reason, I had this mental picture of you as a supervillain declaring your intention to take over the world.
So I reflected that I was doing things, somewhat unconventional things (which I focus on more), because of your advice, but hey, it's good advice and I should probably take it (by the way, it's good to hear that cryonics is less expensive than I thought it was; sorry for making your life difficult by propagating false information). I mean, I followed similar patterns when I decided to learn Lisp as a first programming language.
I think I'm worried because you're charismatic, which makes you much more persuasive than an ineloquent and unimpressive philosopher/AI hacker would be. Combine that with the fact that I get really happy and a little self-righteous when an eloquent speaker makes a really persuasive argument for something I agree with, and reading you, and other charismatic people in the atheist/revolutionary/technophile cluster, becomes a rather deep experience with uncomfortable parallels to religion.
I've thought it over, though, and this particular pattern probably won't cause too many problems, because Eliezer Yudkowsky isn't the only eloquent speaker in the world. I'm betting on something similar to "Three Stooges syndrome," where I get shaped by too many intelligent and charismatic people to be influenced into making large mistakes, because they'll probably call each other out on their more contentious claims and my bullshit detector will be reactivated.
I'm not even sure that I agree with you more than average, but it does feel better than usual to agree with you, so that might be the source of the worry. On your trip to the library of convenient rhetorical metaphors, it might be that the reason people are so anxious to say they're not copying you is your deep, piercing stare and badass coffee metaphors.
So, other than that, you've totally persuaded my fear of your ability to totally persuade me away. Well, it'll probably be gone in a week, after my subconscious stops thinking you're a supervillain.
Maybe you’ve been primed? (see the end of the post)