We’d better start pushing people’s emotional buttons and twisting their mental knobs if we want to get something done. Starting with our own.
This sounds awfully like endorsing the use of Dark Arts to counter the same. Not that I’d dismiss it out of hand, but wouldn’t it be better to find a way to reduce the effectiveness of said arts to begin with? It seems to me that’s the primary purpose of most of the Sequences, in fact.
I think Tegmark’s claim is unequivocally that we should endorse Dark Artsy subsets of scientific knowledge to promote science and whatever needs promotion (rationality perhaps).
So yes, the thing being claimed is the thing you are emotionally inclined to fear/dislike. By him and by me.
Though just to be 100% sure, I’d like a brief description of what you mean by “dark arts”, to avoid a double illusion of transparency.
The post is endorsing the use of the Dark Arts. From a purely deontological perspective, that’s objectionable. From a virtue ethics perspective, it could be seen as stooping (close) to the level of our enemies. From a consequentialist perspective, we need to compare the harm done by using them against the benefits.
To make that comparison, we need to determine what harm the Dark Arts, in and of themselves, cause. It seems to me (though I could certainly be convinced otherwise) that essentially all the harm they cause comes from their use in convincing people to believe falsehoods and to do stupid things. Does anyone have any significant examples of the Dark Arts being harmful independent of what they’re being used to convince people of?
Does anyone have any significant examples of the Dark Arts being harmful independent of what they’re being used to convince people of?
Dark Arts have externalities. Once you become known as a skilled manipulator, fewer people will trust you, and you will be able to influence fewer people in the long run. Using Dark Arts is a Prisoner’s Dilemma defection with all the associated problems: a world full of Dark Artists is worse than a world full of honest truth-tellers, ceteris paribus. Heavy use of the Dark Arts may also be risky for the performer himself and compromise his own rationality, since it is much easier to use a manipulative technique persuasively if one believes no deception is happening.
These aren’t actually examples, but it’s hard to come up with a specific example under the “independent of what they’re being used to convince people of” clause.
Once you become known as a skilled manipulator, fewer people will trust you, and you will be able to influence fewer people in the long run.
The very long run, perhaps.
In the shorter run of, say, 10-100 years, it isn’t in the least clear to me that the advantage of being considered (accurately or not) a skilled manipulator, in terms of the willingness of powerful agents to ally with me, is fully offset (let alone outweighed) by the disadvantage of it, in terms of people being less influenceable by me. Add to that the advantages of actually being a skilled manipulator, and that’s even less clear.
Admittedly, if I anticipate having a significantly longer effective lifespan than that, I may prefer not to risk it.
Once you become known as a skilled manipulator, fewer people will trust you, and you will be able to influence fewer people in the long run.
But it seems that people who use the Dark Arts profit from them. If the Dark Arts were self-defeating as you suggest, we wouldn’t be having this discussion.
Using Dark Arts is a Prisoner’s Dilemma defection with all the associated problems: a world full of Dark Artists is worse than a world full of honest truth-tellers, ceteris paribus.
Continuing to cooperate in a world where most players defect is a poor strategy. I also doubt that it strongly influences the defectors to stop defecting.
Once you become known as a skilled manipulator, fewer people will trust you, and you will be able to influence fewer people in the long run.
This is not what I have observed in practice.