What would happen if I got some friends together and we all decided to be really dedicatedly rational?
This is an important scenario to reason about if I want to be a rationalist, and I think my predictions about that scenario are more calibrated than they would be in a world where I didn’t read this post. Specifically, my predictions in light of this post have way, way fatter tails.
If your takeaway is only that you should have fatter tails on the outcomes of an aspiring rationality community, then I don’t object.
If “I got some friends together and we all decided to be really dedicatedly rational” is intended as a description of Ziz and co, I think it is at least missing many crucial elements, and generally not a very good characterization.
It is intended as a description of Ziz and co, but with a couple caveats:
1) It was meant as a description that I could hypothetically pattern match to while getting sucked into one of these, which meant no negative value judgements in the conditions, only in the observed outcomes.
2) It was meant to cast a wide net—hence the tails. When checking if my own activities could be spiraling into yet another rationalist cult, false positives of the form “2% yes, let’s look into that” are very cheap. It wasn’t meant as a way for me to police the activities of others, since that’s a setting where false positives are expensive.
What would happen if I got some friends together and we all decided to be really dedicatedly rational?
I think it might depend a lot on whether there is one dominant person in the group who succeeds in convincing the rest of you that “rationality” means accepting whatever that person says, because they are obviously the smartest one in the group (and probably in the entire universe).
If yes, then what happens is probably some combination of “what that person wants to happen” and “what that person needs to do in order to maintain their control over the group”.