I guess you don’t need to know anything from the HP canon. It could perhaps be even more interesting that way. I don’t think you would learn new information. It might have a better emotional impact, but that is difficult to predict.
wouldn’t this just mean that rational minds usually pursue other goals than writing fiction? Not saying that there shouldn’t be rationalist fiction, but this doesn’t sound like such a bad state of affairs to me.
I would consider the world better if there were more rational people sharing the same values as me. We could cooperate on mutual goals, and learn from each other.
Problem is, rational people don’t just appear randomly in the world. Okay, sometimes they do, but the process is far from optimal. If there is a chance to make rationality spread more reliably, we should try.
But we don’t exactly know how. We have tried many things, with partial success. For example, the school system: it is great at taking an illiterate peasant population and producing an educated population within a century. But it has limits: students learn to guess their teachers’ passwords, there are not enough sufficiently skilled teachers, pressure from the outside world can bring religion into schools and prevent the teaching of evolution, etc. And the system seems difficult to improve from the inside (been there, tried that).
Spreading rationality through fiction is another thing worth trying. There is a chance to attract a lot of people, make some of them more rational, and create a lot of utility. Or maybe, despite there being dozens of rationalist fiction stories, they would all be read by the same people, unable to attract anyone outside the chosen set. I don’t know.
The point is, if you are rational and you think the world would be better with more rational people… this is one problem you can try to solve. So before Eliezer we had something like the Drake equation: the fraction of people who are rational × the fraction of those who think making more people rational is the best action × the fraction of those who think fiction is the best tool for it = almost zero. I am curious about the specific numbers; especially whether one of them is very close to zero, or whether it’s merely a few small numbers that give an almost-zero result when multiplied together.
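The Drake-style product above can be sketched numerically. To be clear, every number below is a made-up placeholder just to illustrate the multiplication, not a real estimate:

```python
# Drake-style estimate of potential rationalist-fiction authors.
# All fractions are hypothetical placeholders, purely illustrative.
population = 7e9       # world population, roughly
p_rational = 1e-4      # fraction of people who are rational (guess)
p_spread_goal = 1e-2   # of those, fraction who prioritize spreading rationality (guess)
p_fiction = 1e-2       # of those, fraction who think fiction is the best tool (guess)

expected_authors = population * p_rational * p_spread_goal * p_fiction
print(expected_authors)  # roughly 70 with these made-up inputs
```

The interesting question in the comment above maps directly onto these factors: a near-zero result could come from one factor being tiny, or from several merely small factors multiplied together.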
I’d probably want more people who share my values rather than more rational people. Rational people who share my values would be even better. Rational people who don’t share my values would be the worst outcome.
I don’t think the school system was built by rationalists, so I’m not sure where you were going with that example.
How effective has fiction been in spreading other ideas compared to other methods?