Upvoted. I think this is a useful way to think about things like this. Compartmentalizing and decompartmentalizing aren't wrong in themselves, but each gets misapplied in different contexts. So part of the challenge is to convince the person you're talking to that it's safe to decompartmentalize in the realm needed to see what you are talking about.
For example, it took me quite some time to decompartmentalize on evolution versus biology because I had a distrust of evolution. It looked like toxic waste to me, and indeed has arguably generated some (e.g. Social Darwinism). People who mocked creationists actually contributed to my sense of distrust in the early stages, given that my subjective experience with (young-earth) creationists was not of particularly unintelligent or gullible people. However, this got easier as I learned more biology and could see the reference points, and the vacuum of solid evidence (as opposed to reasonable-sounding speculation) for creationism. Later the creationist speculation started sounding less reasonable and the advocates a bit more gullible—but until I started making the connections from evolution to the rest of science, there wasn't reason for these things to be on my map yet.
I’m starting to think arguments for cryonics should be presented in the form of “what are the rational reasons to decompartmentalize (or not) on this?” instead of “just shut up and decompartmentalize!” It takes time to build trust, and folks are generally justifiably skeptical when someone says “just trust me”. Also it is a quite valid point that topics like death and immortality (not to mention futurism, etc.) are notorious for toxic waste to begin with.
ciphergoth and I talked about cryonics a fair bit a couple of nights ago. He posits that I will not sign up for cryonics until it is socially normal. I checked my internal readout and it came back “survey says you’re right” and nodded my head. I surmise this is what it will take in general.
(The above is the sort of result my general memetic defence gives. Possibly-excessive conservatism in actually buying an idea.)
So that’s your whole goal. How do you make cryonics normal without employing the dark arts?
I think some additional training in DADA would do me a lot of good here. That is, I don't want to be using the dark arts, but I don't want to be vulnerable to them either. And the dark arts are extremely common, especially when people are looking for excuses to keep on compartmentalizing something.
A contest for bored advertising people springs to mind: “How would you sell cryonics to the public?” Then filter out the entries that use dark arts. This will produce better ideas than you ever dreamed.
The hard part of this plan is making it sound like fun for the copywriters. Ad magazine competition? That’s the sort of thing that gets them working on stuff for fun and kudos.
(My psychic powers predict approximately 0 LessWrong regulars in the advertising industry. I hope I’m wrong.)
(And no, I don’t think b3ta is quite what we’re after here.)
Hang out with cryonicists all the time!
Mike Darwin had a funny idea for that. :)