Good post, and upvoted, but I would phrase this part differently:
If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it.
The problem with a question like “What jobs make the most money” isn’t so much that it’s hard as that it’s vague (or if you want to be harsh, “meaningless”). The question “How much would you contribute to save an endangered species” is even worse—if I were to actually answer it (by, for example, saying “Exactly two hundred seven dollars!”), you would be terribly confused and have no idea what I meant.
There seems to be a social norm that anyone who asks an interlocutor to clarify a question is nitpicking and annoying, even though the overwhelming majority of questions people debate are meaningless as asked. People get rejected as poor conversational partners if they ask “To save all individuals in the species, or just to ensure at least one breeding pair...and are we talking per year, or pretending we have a 100% chance of saving them forever?”, whereas if they say “We should pay whatever it takes!” they will be considered interesting, even though that answer is clearly insane. It’s no wonder that most people avoid becoming rationalists in such a situation.
Whether biases come up in making decisions, or only in making conversation, seems to be a perennial question around here. Does anybody know of a canonical list of the ones that have been demonstrated in decision-making with actual stakes involved?