Most people here like science and everything around it so eliminating cognitive biases is EXTREMELY important in order to reach their goal.
Most people on the outside, however, are more focused on money or status, and so would probably benefit from some degree of rationality, but anything beyond that probably yields diminishing returns for one reason or another.
I’d say Eliezer held “Human” to a higher standard than what your average clubgoer has in mind.
> Most people here like science and everything around it so eliminating cognitive biases is EXTREMELY important in order to reach their goal.
The potential rewards of epistemic rationality for a society are very high.
However, it doesn’t follow that everyone needs to be an epistemic rationalist, and it also doesn’t necessarily follow that anyone has to remove all their biases individually, since biases can be allowed to cancel out in collective rationality.
> eliminating cognitive biases is EXTREMELY important in order to reach their goal...
I agree that we can define an evolutionary trait as a flaw or as perfect depending on the observer’s definition of success, but my options are between the “Human” and the biological justification. My objective is to work with the truth too!
We’d also need to separate the first from the second.
The first one probably offers little advantage for reproduction (it would make sense to say that rationalists SHOULD be at the top of the social ladder, but in fact that’s often not true).
The second is probably your best bet for reproducing.
> but my options are between the “Human” and the biological justification.
An epistemic rationalist is an instrumental rationalist who values truth. An instrumental rationalist values something else. If they value typical things like wealth and status, then there is some evidence that the winners in society have won by systematic winning. But epistemic rationalists don’t often win in those terms.
There’s a difference between the two in theory, because an idealized agent either has true knowledge as a terminal value or it doesn’t.
The extent to which a given agent can stick to instrumental rationality depends on its nature: how fuzzy or leaky it is. An instrumental rationalist who habitually gathers knowledge of no obvious use might mutate into what is, for all practical purposes, an epistemic rationalist.
I can’t see why experimentation should be more connected to IR than ER.
Depends on how you define success, actually.
> Most people here like science and everything around it so eliminating cognitive biases is EXTREMELY important in order to reach their goal.
> Most people on the outside, however, are more focused on money or status, and so would probably benefit from some degree of rationality, but anything beyond that probably yields diminishing returns for one reason or another.
> I’d say Eliezer held “Human” to a higher standard than what your average clubgoer has in mind.
Again it depends on how you define success.
In other words, epistemic rationality is not instrumental rationality.
> The potential rewards of epistemic rationality for a society are very high.
> However, it doesn’t follow that everyone needs to be an epistemic rationalist, and it also doesn’t necessarily follow that anyone has to remove all their biases individually, since biases can be allowed to cancel out in collective rationality.
Are they for the individual, too? There are only two parents for each child.
> I agree that we can define an evolutionary trait as a flaw or as perfect depending on the observer’s definition of success, but my options are between the “Human” and the biological justification. My objective is to work with the truth too!
> We’d also need to separate the first from the second.
> The first one probably offers little advantage for reproduction (it would make sense to say that rationalists SHOULD be at the top of the social ladder, but in fact that’s often not true).
> The second is probably your best bet for reproducing.
I don’t understand this. Care to elaborate?
Above you mention two types of people:
But I am clarifying that my observation is about the conscious objectives humans may have vs. their biological objectives.
I think all evolved traits respond to biological goals and that we may regard some biases as flawed, but that’s from the human perspective.
If rationalist means instrumental rationalist, they often are winning.
I thought instrumental and epistemic rationality feed each other, no? Can you be one but not the other?
> An epistemic rationalist is an instrumental rationalist who values truth. An instrumental rationalist values something else. If they value typical things like wealth and status, then there is some evidence that the winners in society have won by systematic winning. But epistemic rationalists don’t often win in those terms.
That doesn’t really answer the question though.
How can you make a plan (instrumental rationality) without solid premises (epistemic rationality)?
How can you know what works and what doesn’t (epistemic rationality) if you haven’t tried something (instrumental rationality)?
> There’s a difference between the two in theory, because an idealized agent either has true knowledge as a terminal value or it doesn’t.
> The extent to which a given agent can stick to instrumental rationality depends on its nature: how fuzzy or leaky it is. An instrumental rationalist who habitually gathers knowledge of no obvious use might mutate into what is, for all practical purposes, an epistemic rationalist.
> I can’t see why experimentation should be more connected to IR than ER.