It took me a long time to respond to this because I found the question resistant to analysis. My immediate impulse is to shout, “But, dammit, my rational argument is really truly actually valid and your bible quotation isn’t!” This is obviously irrelevant since, by hypothesis, my goal is to be convincing rather than correct.
After thinking about it, I’ve decided that the reason the question was hard to analyze is that the hypothesis is so false for me. You haven’t placed any constraints at all; in particular, you haven’t said that my goal is
to convince others to be more rational via a correct argument, or
to convince others to be more rational, provided that this is true, or
to convince others to be more rational, as long as I don’t have to harm anyone to do it, or
to convince others to be more rational, as long as I can maintain my integrity in the process.
If I take “convince others to be more rational” as a supergoal, then of course I should lie and brain-hack and do whatever is in my power to turn others into rationalists. But in reality, many of my highest values have less to do with the brain-states of others than with what comes out of my own mouth. Turning others into rationalists at the price of becoming a charlatan myself would not be such a great tradeoff.
I regularly “lose” debates because I’m not willing to use rhetoric I personally find unconvincing. (Though I’m probably flattering myself to suppose that I would “win” otherwise.) To give a specific example, I am deeply opposed to drug prohibition, while openly predicting that more people will become addicted to drugs if they are legally available. This is a very difficult position to convey quickly to someone who doesn’t already agree with me, but any simplification would be basically dishonest. I could invent instrumental reasons why I shouldn’t use a basically dishonest argument in this case, but the truth is that I just hate lying to people, even more than I hate letting them walk around with false, deadly ideas.
I imagine Eliezer and Robin run into this themselves, when they say that a certain unusual medical procedure only has a small probability of success, but should be undergone anyway because the payoff is so high. Many people will hear “low probability of success” and stop listening, and many of those people will therefore die unnecessarily. Does this mean Eliezer and Robin should start saying that there is a high probability of success after all, in order to better save lives?
Now maybe your point here is that yes, we all should be lying for the sake of our goals—that we should throw out our rules of engagement and wage a total war on epistemic wrongness. I have considered this myself, and honestly I don’t have a good rebuttal. I can only say that I’m not ready to be that kind of warrior.