I believe the effect you describe exists, but I think there are two effects which make it unclear that implementing your suggestions is an overall benefit to the average reader. Firstly, to summarize your position:
Each extra weird belief you have detracts from your ability to spread other, perhaps more important, weird memes. Therefore normal beliefs should be preferred to some extent, even when you expect them to be less correct or less locally useful on an issue, in order to improve your overall effectiveness at spreading your most highly valued memes.
If you have a cluster of beliefs which seem odd in general, then you are more likely to share a “bridge” belief with someone. When you meet someone who shares at least one strange belief with you, you are much more likely to seriously consider their other beliefs, because you share some common ground and are aware of their ability to find truth against social pressure. For example, an EA vegan may be vastly more able to introduce the other EA memes to a non-EA vegan than an EA non-vegan could. Since almost all people have at least some weird beliefs, and those whose weird beliefs have literally no overlap with yours are unlikely to be good targets for you to spread positive memes to, increasing your collection of useful and justifiable weird memes may well give you more opportunities to usefully spread the memes you consider most important.
Losing the absolute focus on forming an accurate map by making concessions to popularity/not standing out in too many ways seems epistemologically risky and borderline dark arts. I do agree that in some situations not advertising all your weirdness at once may be a useful strategic choice, but I am very wary of the effect that putting too much focus on this could have on your actual beliefs. You don’t want to strengthen your own absurdity heuristic by accident and miss out on more weird but correct and important things.
While I can imagine situations where the advice given is correct (especially for interacting with domain-limited policymakers, or with people whose likely reactions to extra weirdness you have a good read on), recommending it in general seems insufficiently justified, and I believe it would have significant drawbacks.
Regarding point 2: while that would indeed be epistemologically risky and borderline dark arts, I think the idea is more about what to emphasize and openly signal, not what to actually believe.
True, perhaps I should have been clearer in distinguishing the two, and explained how I think they can blur into each other unintentionally. I do think being selective with signals can be instrumentally effective, but it’s important to be intentionally aware when you’re doing that and not allow your current mask to bleed over and influence your true beliefs unduly.
Essentially I’d like this post to come with a warning: “Do this sometimes, but be careful and mindful of the possible changes to your beliefs caused by signaling as if you have different beliefs.”
There is a real likelihood that acting out a belief will cause you to come to believe it, because your brain poorly distinguishes signalling from genuine belief.
That can be advantageous at times. Some beliefs may be less important to you, and worthy of being sacrificed for the greater good. If you, say, believe that forcing people to wear suits is immoral and that eating animals is immoral, then it may be worth sacrificing your belief in the unethical nature of suits so you can better stop people eating animals.
A willingness to do this is beneficial for most people who want to join organizations. Organizations normally have a set of arbitrary rules on social conduct, dress, who to respect and who to respect less, how to deal with sickness and weakness, what media to watch, and who to escalate issues to in the event of a conflict. If you don’t genuinely adopt these rules you’ll find it tricky to gain much power, because people can spot those who merely fake them.
Some beliefs may be less important to you, and worthy of being sacrificed for the greater good. If you, say, believe that forcing people to wear suits is immoral and that eating animals is immoral, then it may be worth sacrificing your belief in the unethical nature of suits so you can better stop people eating animals.
No. I will make concessions about which beliefs to act on in order to optimize for “Goodness”, but I’m highly concerned about sacrificing beliefs about the world themselves. Doing this may be beneficial in specific situations, but at a cost to your overall effectiveness in other situations across domains. Since the range of possible situations you might find yourself in is infinite, there is no way to know whether you’ve made a change to your model with catastrophic consequences down the line. Furthermore, we evaluate the effectiveness of strategies on the basis of the model we have, so every time your model becomes less accurate, your estimate of the best option in a given situation becomes less accurate. (Note that your confidence in your estimate may rise, fall, or stay the same, but I doubt that having a less accurate model will lead to better credence calibration.)
Allowing your beliefs to change for any reason other than to better reflect the world only serves to make you worse at knowing how best to deal with the world.
Now, changing your values: that’s another story.
You can easily model beliefs and work out whether they’re likely to have good or bad results. They could theoretically have an infinite variety of impacts, but most probably have a fairly small and limited effect. Humans have lots of beliefs; they can’t all have a major impact.
For the catastrophic consequences issue, have you read this?
http://lesswrong.com/lw/ase/schelling_fences_on_slippery_slopes/
The slippery-slope issue of potentially catastrophic consequences can be limited by establishing arbitrary lines beforehand that you refuse to cross. Whether you should sacrifice your beliefs, as in the Gandhi example, depends on what value the sacrifice buys, how valuable the sacrificed belief is to your models, and what the likelihood of catastrophic failure is. You can swear an oath not to cross those lines, or give valuable possessions to people to destroy if you do cross them, so as to heavily limit the chance of catastrophic failure.
Allowing your beliefs to change for any reason other than to better reflect the world only serves to make you worse at knowing how best to deal with the world.
Yeah, your success rate drops, but your ability to socialize can rise, since irrational beliefs are how many people think. If your irrational beliefs are of low importance, not likely to cause major issues, and unlikely to cause catastrophic failure, they could be helpful.