Caring about things seems to make you interact with the world in more diverse ways (because you do this in addition to the things other people do, not instead of them); some of that translates into more experience and better models. But it also brings tribal identity, mindkilling, often a refusal to see the reasons why your straightforward solution would not work, and uncritical contrarianism.
Now I think about a group of people I know who care strongly about improving the world in the one or two aspects they focus on. They have done a few amazing things and gained lots of skills: they publish books, organize big conferences, and have created a network of like-minded people in other countries; some of their activities are profitable, and for others they apply for various grants and often get them, so some of them improve the world as a full-time job. They also believe that covid is a hoax, and hold lots of less fringe but still quite irrational beliefs. However… this depends on how you calculate the “total rationality”, but it seems to me that their gains in near mode outweigh the losses in far mode, and in some sense I would call them more rational than the average population.
Of course I dream about a group that would have all the advantages and none of the disadvantages.
It seems like the more people you know, the less likely this is.
Of both? (This sentence didn’t have a clear object.)
Ah. I meant, I would like to see a group that has the sanity level of a typical rationalist, and the productivity level of these super-agenty irrationalists. (Instead of having to choose between “sane with lots of akrasia” and “awesome but insane”.)
Hm. Maybe there’s something to be gained from navigating ‘trade-offs’ differently? I thought perpetual motion machines were impossible (because thermodynamics), aside from ‘launch something into space, pointed away from stuff it would crash into’. I’d read that ‘trying to do so is a good way to learn about physics’, but I didn’t really try because I thought it’d be pointless. And then this happened.