Richard is quite right to point out that philosophers of mind are well aware of the counterarguments that Constant and Eliezer offer. And he is right to insist this is a subtle question to which a few quick comments do not do justice. There are, however, many philosophers who agree with Constant and Eliezer. See, for example, the October Philosophical Quarterly article on Anti-Zombies.
TGGP, your question nicely illustrates my explanation of why there is more history than futurism.
This book review claims that the majority position in philosophy rejects the dualism Constant and Eliezer object to—this is most certainly not a dispute between philosophers and scientists.
People have been talking about assuming that states in which many people are hurt have a low prior probability. It might be more promising to assume instead that such states have a low correlation with what any random person claims to be able to effect.
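A minimal sketch of the contrast, assuming a claimant who says they can affect N people; the notation here is my own illustration, not anything stated in the comment. The first approach puts the discount in the prior over states; the alternative leaves that prior alone and instead discounts the link between any one person's claim and such states, which bounds the expected impact of acting on the claim however large N gets:

$$P(\text{state with } N \text{ people hurt}) \propto \frac{1}{N} \quad \text{(prior-penalty view)}$$
$$P(\text{claim controls the state} \mid N \text{ people hurt}) \lesssim \frac{1}{N} \quad \text{(low-correlation view)}$$
$$\mathbb{E}[\text{impact of acting on the claim}] \approx N \cdot P(\text{claim controls the state}) = O(1)$$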
Michael, your pig example threw me into a great fit of belly-laughing. I guess that’s what my mind looks like when it explodes. And I recall that was Marvin Minsky’s prediction in The Society of Mind.
Well, as long as we’ve gone to all the trouble to collect 85 comments on this topic, this seems like a great chance for a disagreement case study. It would be interesting to collect stats on who takes which side, and to relate that to their various kinds of relevant expertise. For the moment I am disturbed by the fact that Eliezer and I seem to be in a minority here, but comforted a bit by the fact that we seem to know decision theory better than most. But I’m open to new data on the balance of opinion and the balance of relevant expertise.
In the weeks after 9/11, my colleague Roger Congleton, who had some expertise on terrorism, did a number of radio and other interviews in which he argued that 9/11 was an unlucky aberration, and warned against overreacting. It wasn’t a message people wanted to hear then, and his being right early wins him nothing in today’s media game.
So can we learn to recognize the sound of a “cult cooler”, cooling down the cultishness, and distinguish it from a fake recording of one? Or at least invent a cultometer, so we can check our cultemperature?
Unfortunately, it is only in a few rare technical areas that one can find anything like “full technical cases, with references” given to a substantial group “knowledgeable enough to understand all the technical arguments”, and it is even rarer that they actually bother to do so. Even when people appear to be giving such technical arguments to such knowledgeable audiences, the truth is more often otherwise. For example, the arguments presented are often only a small fraction of what convinced someone to support a position.
Sometimes people attend too much to authority, and sometimes too little. I’m not sure I can discern an overall bias either way.
I fear the word “cult” obscures many difficult issues. I’m no fan of Rand-fandom, but I think it is important to identify as clearly as possible just what signs people within such a group could use to see they have a problem. For example, “ostracizing anyone who dared contradict her” would seemingly apply to a great many, perhaps the majority, of ordinary human organizations.
Are people biased on average to follow someone else, rather than to make their own path? It is not obvious to me. Yes, many great failings have come from groups dedicatedly following a leader. But surely many other failings have come from groups not dedicatedly following a leader.
I think I’ll side with the novices against Ougi here. The novices deserve a clearer answer than “think of ways to resolve your doubts” and “all will be clear when you try to use this stuff.” Cults usually tell people many things that are actually useful, and confronting leaders seems a reasonable way to resolve doubts. As I said before, the word “cult” is a bit too easy to throw around; I’d prefer a clearer description of exactly what it means and how to recognize one.
Chris, for a while I was a member of what most people would consider a “cult”, and from my experience “cults” usually teach people useful things. So it is a mistake to assure yourself you are not in a cult because you see that you are learning useful things.
Would jokes where Dilbert’s pointy-headed boss says idiotic things be less funny if the boss were replaced by a co-worker? If so, does that suggest bosses are Hated Enemies, and Dilbert jokes bring false laughter?
The crazier a thing you believe as a result of trusting your community, the stronger a tie to your community that shows. So when we signal loyalty via beliefs, those beliefs can get pretty crazy.
I wonder: by introducing dissenters can one get people to disagree too much with the majority?
In addition to suffering social disapproval when they first make their contrary claims, lonely dissenters should realize that even if they are eventually proven right, they will likely still lose socially compared to having never dissented.
I think it would be well worth it for Eliezer, me, or other contributors here to write a post explaining why they still embrace some particular “weird” belief in the face of wide disagreement. Such a post should not just review the positive reasons for the belief, but also the reasons for thinking the others mistaken, even though those others have heard and rejected those positive reasons.
Unknown, good summary.
Carl, yes, but in addition you probably infer that someone on the other side is a bit less rational than someone on your side.
Caledonian, “those wishing to overcome their own biases … can distinguish between … wishful thought, and genuinely strong arguments” sounds to me more like wishful thought.
Overcoming, I do hope we at least rise above the DeLong standard.
I agree with Richard that we should respect the fact that philosophers have spilled a lot of ink on the consciousness question; we should read them and respond to their arguments. We should have at least one post devoted to this topic. But after doing so, I’m betting I’ll still mainly agree with Eliezer.
Richard, I don’t think Eliezer conflated reasoning with observing your own brain—he just suggested that simple Bayesian reasoning based on observing your own brain gets you pretty much all the conclusions you need from most other “reasoning.”