I pointed out to Roko by PM that his comment couldn’t be doing his cause any favors, but did not ask him to delete it, and would have discouraged him from doing so.
I can’t be sure, but it sounded from “I’ve been asked to remove it as it could potentially be damaging.” like he’d gotten a stronger message from someone high up in SIAI—though of course, I probably like that theory because of the Bayesian Conspiracy aspects.
Would you mind PM’ing me (or just posting) the message you sent?
Also, does the above fit with your experiences at SIAI? I find it hard, but not impossible, to believe that Roko just described something akin to standard hiring procedure, and would very much like to hear an inside (and presumably saner) account.
Most people who actually work full-time for SIAI are too busy to read every comments thread on LW. In some cases, they barely read it at all. The wacky speculation here about SIAI is very odd—a simple visit in most cases would eliminate the need for it. Surely more than a hundred people have visited our facilities in the last few years, so plenty of people know what we’re really like in person. Not very insane or fanatical or controlling or whatever generates a good comic book narrative.
PMed the message I sent.
Certainly not anything like standard hiring procedure.
Thanks Nick.
Please pardon my prying, but as you’ve spent more time with SIAI, have you seen tendencies toward this sort of thing? Public declarations, competitions/pressure to prove devotion to reducing existential risks, scolding for not toeing the party line, etc.
I’ve seen evidence of fanaticism, but have always been confused about what the source is (did they start that way, or were they molded?).
Basically, I would very much like to know what your experience has been as you’ve gotten closer to SIAI.
I’m sure I’m not the only (past, perhaps future) donor who would appreciate the air being cleared about this.
No problem, and I welcome more such questions.
No; if anything, I see explicit advocacy, as Carl describes, against natural emergent fanaticism (see below), and people becoming less fanatical to the extent that they’re influenced by group norms. I don’t see emergent individual fanaticism generating significant unhealthy group dynamics like these. I do see understanding and advocacy of indirect utilitarianism as the proper way to ‘shut up and multiply’. I would be surprised if I saw any of the specific things you mention clearly going on, unless non-manipulatively advising people on how to live up to ideals they’ve already endorsed counts. I and others have at times felt uncomfortable pressure to be more altruistic, but this is mostly pressure on oneself — having more to do with personal fanaticism and guilt than group dynamics, let alone deliberate manipulation — and creating a sense of pressure is generally recognized as harmful.
I think the major source is that self-selection for taking the Singularity seriously, and for trying to do something about it, selects for bullet-biting dispositions that predispose towards fanaticism, which is then enabled by having a cause and a group to identify with. I don’t think this is qualitatively different from things that happen in other altruistic causes, just more common in SIAI due to much stronger selective pressure for bullet-biting.
I also have the impression that Singularitarian fanaticism in online discussions is more common among non-affiliated well-wishers than people who have spent time with SIAI (but there are more of the former category, so it’s not easy to tell).
I was there for a summer and don’t think I was ever even asked to donate money.
Ahh. I was trying to ask about Cialdini-style influence techniques.
Very little, if any.
What exactly is Roko’s cause, in your estimation? I wasn’t aware he had one, at least not in the secretive sense.
I meant SIAI.