The most immediate reason I want people around me to become more rational is that irrationality (in some specific forms) repels me. I admit this is how being a member of a tribe can feel from the inside. (In a parallel branch of the multiverse I could be a theist, repelled by atheists. I mean, how could you not dislike people who throw away infinities of utilons, only because they are so overconfident about their human reasoning abilities, which the outside view suggests are pretty pathetic?)
But I also believe that having a higher sanity waterline is a good thing. With a specific person, sabotaging their rationality to exploit them may sometimes bring me more utilons than cooperating with them. But what about the population as a whole? I enjoy a higher standard of living, I enjoy having the internet, I enjoy being able to hear different opinions and not having to follow religious leaders. I would enjoy it even more if driverless cars became commonplace, if medicine could make us live better and longer, and if psychology could help significantly beyond the placebo effect. All these things require some general standard of rationality.

We often complain about how low that level is, so for the sake of fairness I would like to note that it could be much lower. Imagine a society where every problem is solved by asking a local shaman, the typical answer is that the problem was caused by a witch, and you must kill the witch to fix it. And if you somehow step out of line, you become the best candidate for a witch. Some humans live like this, too.

If, during the last century, all the money and energy spent on horoscopes had been spent on medicine instead, maybe 100 years would now be the average lifespan, and 150 years rather likely for those who take care to exercise and avoid sugar. Think of all the other improvements we could get if only people became more rational. (We would get some new harmful things, too.)
I agree that even if I feel people should become more rational, trying to correct them is probably not the best way, and quite often it does more harm than good. (I mean harm to the person who wastes their time trying to correct others: wasted time, and frustration.) I used to spend a lot of time correcting people online. Finding LessWrong helped me a lot; now that I know there is one website where people can discuss rationally, the existence of the others feels less painful. It also helped to realize that inferential distances are too big to be overcome by a comment in a discussion. In many situations I feel certain that I know better than other people, but I have updated my estimate of the chance of fixing their reasoning to near epsilon. (Unless the other person specifically asks to be fixed, which almost never happens.) Writing a blog or starting a local rationalist group would be better. (I just need to overcome my akrasia.)
So, instead of doing stupid stuff that feels good: if we agree that having more rationalists on this planet is a good idea, what next? I know that CFAR runs workshops for a few dozen participants. The LessWrong blog is here, available to everyone. That is already pretty awesome, but it is unlikely to be the best thing that could be done. What else could have a higher impact?
My ideas, in five minutes: write a book about rationality (books have higher status than blogs, and can be read by people who don’t procrastinate online); create a “LessWrong for dummies” website (obviously with a different name) explaining the uncontroversial LW/CFAR topics to the public in a simplified form. Actually, we could start with the website and then publish it as a book. But this needs a lot of time and talent. Alternative idea: do something to impress the general population and make rationality more fashionable (moderate use of Dark Arts allowed); for example, organize a discussion about rationality at a university with rationalists who also happen to be millionaires (or otherwise high status), with minicamp-style exercises for participants as a follow-up. This requires rationalist celebrities and someone to run the exercises.
If, during the last century, all the money and energy spent on horoscopes had been spent on medicine instead, maybe 100 years would now be the average lifespan, and 150 years rather likely for those who take care to exercise and avoid sugar.
I don’t think enough has been spent on horoscopes to do that much good. On the other hand, if people gave up on lotteries, that might have some impact.
I agree that figuring out how to teach rationality to people with average intelligence is an important goal, even if “Thinking Clearly for Dummies” is an amusing title.
One idea is to produce entertainment with rationalist themes (books, movies, TV shows). Methods is a good start, but much more could be done here. Not sure if anyone’s working on stuff like this though. Hopefully a past or future workshop participant will get on this.
“More rationalists” seems like slightly the wrong way to think about the goal here, which to me should really be to increase the power of the rationalist community. This doesn’t imply going after large gains in numbers so much as going after powerful people. For example, in a conversation I had at the workshop it was pointed out to me that some governmental agency that determines funding for medical research doesn’t classify aging as a disease, so researchers can’t get certain kinds of funds for aging research (I may be misremembering the details here but that was the gist of it). The easiest way I can think of to fix this problem is for the rationalist community to have friends in politics. I don’t know if that’s currently the case.
So we should focus on increasing our social skills, with the specific goal of befriending influential people and influencing politics. Without officially becoming politicians ourselves, because that messes with one’s brain. Unless we consciously decide to sacrifice a few of us.
Can we agree on which political goals would be desirable? Funding for aging research seems like a good candidate. (Even a libertarian opposed to any kind of taxation and government spending could agree that, assuming the government already takes the money and spends it anyway, it is better if the money is spent on aging research than on research into rare diseases of cute puppies.) Opposing obvious stupidities could be another. Unfortunately, no politician can become popular by doing nothing but opposing obvious stupidities, although I personally would love to see more such politicians.
Then we would need a proper protocol for sacrificing rationalists to politics. A rationalist who becomes a politician could fix a lot of things, but inevitably they would stop being a rationalist. I guess it is impossible to keep a functioning rationalist mind there… and even if by some miracle one could, they could not discuss their opinions and goals openly on LW anyway.
Actually, the LW community could easily ruin a rational politician’s career by asking them questions where an honest answer means political suicide, but a less-than-honest answer is easily disproved by the community. Imagine a Prisoner’s Dilemma among politicians, where two politicians agree to support each other’s ideas for mutual benefit. Each of them dislikes the other’s idea, but considers the world with both ideas implemented better than the world with neither. But for the plan to work, both politicians must pretend to support both ideas wholeheartedly. And now the LW community would openly ask the former-rationalist politician about the other idea and present their own speculations about the motives; saying “please, for greater utility, let’s not discuss this” would probably have the opposite effect.
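The logrolling situation above really does have the Prisoner’s Dilemma structure, which a tiny payoff sketch can make explicit. The numbers below are illustrative assumptions, not from the source: each politician values passing their own bill at +3 and the other’s bill at −1, and a bill passes only if the other politician supports it.

```python
# Illustrative payoff sketch for the logrolling dilemma described above.
# Assumption (not from the source): own bill passing is worth +3,
# the other's bill passing costs -1, and each bill passes only if the
# OTHER politician votes for it.

SUPPORT, REFUSE = "support", "refuse"

def payoff(a_action, b_action):
    """Return (payoff_A, payoff_B) given each politician's vote on the other's bill."""
    own_bill, other_bill = 3, -1
    a = (own_bill if b_action == SUPPORT else 0) + (other_bill if a_action == SUPPORT else 0)
    b = (own_bill if a_action == SUPPORT else 0) + (other_bill if b_action == SUPPORT else 0)
    return a, b

# Mutual support beats mutual refusal for both sides...
assert payoff(SUPPORT, SUPPORT) == (2, 2)
assert payoff(REFUSE, REFUSE) == (0, 0)
# ...but each politician is privately tempted to refuse while the other
# supports, which is why both must publicly pretend wholehearted support.
assert payoff(REFUSE, SUPPORT) == (3, -1)
```

Refusing strictly dominates (3 > 2 and 0 > −1), so the deal only holds as long as both defections would be visible, which is exactly why public probing by the community is so dangerous to it.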
So there would need to be some firewall between the politician and the community. For example, the politician might discuss things with the community only in specific articles, where only explicitly allowed topics may be discussed. (You could send the politician a PM suggesting a new topic, but you would be forbidden to say publicly that you did so.)
So we should focus on increasing our social skills, with the specific goal of befriending influential people and influencing politics. Without officially becoming politicians ourselves, because that messes with one’s brain. Unless we consciously decide to sacrifice a few of us.
I cannot determine whether this is presented ironically.
Completely seriously.
Politics is the mindkiller. But if rational people refuse to participate in politics, then all policy will be decided by irrational people, which is not good.
As the linked article says, Bayesians should not lose against Barbarians. Rationalists should win, not invent clever rationalizations for losing. We should one-box in Newcomb’s Problem instead of complaining that the choice is unfair to our preconceptions of rationality.
I don’t ever want to hear this: “Eliezer told me that politics is the mindkiller, so I refused to participate in politics, and now my children learn mandatory religion and creationism at school, cryonics and polyamory are illegal, AI research is focused on creating a supermachine that believes in god and democracy… and it all sucks, but my duty as a rationalist was to avoid politics, and I followed my duty.”
So what is the solution?
Learn to influence politics while protecting yourself from most of the mindkilling. If that turns out to be impossible or very ineffective, then select a group of people who will use their rationality to become skilled politicians and shape society towards greater utility, even if they lose their rationality in the process… and be prepared to deal with this loss. Be prepared for the moment when you have to tell the given person “we don’t consider you rational anymore” or even “supporting you now would make the world worse”. The idea is that the person should make the world better (compared with someone else getting the office) before this happens. We should evaluate carefully how likely this is for a specific person, and perhaps make some preparations to increase the likelihood.
It is also perhaps useful to distinguish between “talk about politics in unfocused gatherings with large undifferentiated groups of people,” “talk about politics in focused gatherings with selected groups of people,” and “take steps to affect policy.” It might turn out that there are good reasons to avoid politics in the first case while not avoiding it at all in the latter two.
It’s probably not so much the mandatory tribalism that makes people apathetic about working in politics, but more like the thing in this Moldbug quote via patrissimo:
You’re trying to replace Windows with Linux. Great.
Your way of replacing Windows with Linux: install Linux as a set of Word macros, one macro at a time. (You’d need something like Emscripten for Word macro.) Oh, also—Linux doesn’t exist. So you’re actually building Linux as a set of Word macros, one macro at a time. Oh, and you have no distribution mechanism. Your users need to type in the macros themselves.
Are the Word users fed up with Word? Oh, man. They’ve had it up to here with Word. So what?
Tech-minded people want to solve problems. They look at politics and see a lifetime of staring at countless problems while stuck in a system that will let them solve almost none of them and being barraged with an endless stream of trivial annoyances.
Wouldn’t it be easier to use your rationality to amass huge amounts of wealth, then simply buy whatever politicians you need, just like other rich people do?
I don’t know how much control rich people really have over politicians.
When someone becomes a successful politician, they have the means to get money. The more money they have, the more it costs to buy them. And they probably get offers from different rich people, sometimes wanting them to do contradictory things, so they can choose to accept the bribes compatible with their own opinions.
Also, I suspect that when you have enough money, more money becomes meaningless, and the real currency is the power to influence one another. For example, if you already have 10 billion dollars, instead of another 50 billion dollars you would prefer a “friend” who can get you out of jail if that ever becomes necessary. So maybe above some level you have to hold an office yourself to be able to provide something valuable to others who hold one.
But if having enough money really is enough to improve the world, then certainly, we should do that.
Well, firstly, you don’t need to buy a whole politician (though it doesn’t hurt); you only need to buy the legislation you need. Thus you don’t care how your politician votes on gay marriage or veterans’ benefits or whatever, as long as he votes for Bill #1234567.8, which you sponsored, and which deals with protecting squirrel habitats (because you really like squirrels, just for example). This is good, because it’s not enough to have just one politician, you need a bunch of them, and it’s cheaper to buy their votes piecemeal.
Secondly, you are of course correct about politicians getting money from different sources, but hey, that’s the free market for you. On the other hand, politicians aren’t really all that rich. Sure, they may be millionaires, and a few might be billionaires, but the $50 billion figure you mentioned would be unimaginable to any of them. If you really had that much money (and were smart about using it), you would be able to buy not merely a single politician, but entire committees, wholesale.