If one had public metrics of success at rationality, the usual status seeking and embarrassment avoidance could encourage people to actually apply their skills.
Shouldn’t a common-sense ‘success at life’ (money, status, free time, whatever) be the real metric of success at rationality? Shouldn’t a rationalist, as a General Intelligence, succeed over a non-rationalist in any chosen orderly environment, by any chosen metric of success, including the common metrics of that environment?
No.
1. If “general intelligence” is a binary classification, almost everyone is one. If it’s continuous, rationalist and non-rationalist humans are indistinguishable next to AIXI.
2. You don’t know what the rationalist is optimizing for. Rationalists may even be less likely to value common-sense success metrics.
3. Even if those are someone’s goals, growth in rationality involves tradeoffs, investment of time if nothing else, in the short term; and “the short term” may still be a long time.
4. Heck, if “rationality” is defined as anything other than “winning”, it might just not win at common-sense goals in some realistic environments.
5. People with the disposition to become rationalists may also tend to be less naturally good at some things, like gaining status.
Point-by-point:
1. Agreed. Let’s drop the phrase about General Intelligence; it’s not needed there.
2. Obviously, if we’re measuring someone’s reality-steering performance, we must know the target region (and perhaps some other parameters, such as planned time expenditure) in advance.
3. The measurement should capture a rationalist’s performance at their current level, without counting the time and resources they spent to level up. Measuring ‘the speed or efficiency of leveling up in rationality’ is a different measurement.
4. The definitions at the beginning of the original post will do.
5. On one hand, the reality-mapping and reality-steering abilities should work for any activity, whether or not the performer is hardware-accelerated for it. On the other hand, we should somehow take this into account; after all, excelling at things one is not hardware-accelerated for is a good indicator. (If only we could reliably determine who is hardware-accelerated for what.)
(Edit: cool, it does numeric lists automatically!)
Public metrics aren’t enough—society must also care about them. Without that, there’s no status attached and no embarrassment risked.
To get this going, you’d also need a way to keep society’s standards on track; otherwise even a small amount of noise could start a positive feedback loop that distorts its conception of rationality.
Everyone has at least a little bit of rationality. Why not simply apply yourself to increasing it, and finding ways to make yourself implement its conclusions?
Just sit under the bodhi tree and decide not to move away until you’re better at implementing.