I think the real reason people are reluctant to sign up for cryonics is that it raises all the classic red flags for a scam. People are asked to invest a significant amount of money in a non-mainstream plan that, even if it works exactly as claimed, won’t pay off until well after the mark is no longer able to sue.
Consider the situation from the perspective of someone just hearing about cryonics for the first time: if cryonics is a sham, then they should expect to be presented with a lot of generally plausible-sounding (but falsified) evidence, most of which will probably be too technical for them to follow (so that they can’t figure out that it’s bogus).
I would agree—but then I read the comment after yours, by TobyBartels. And, of course, as lsparrish mentioned, there seems to be cultural opposition to it in much literature. Pragmatic concerns like “is this organization legitimate” would not make people deny wanting to live.
Not in itself, no, but once they had already generated that metabelief for unrelated reasons, it would be very easy to reuse the meme for deflecting scammers.
Something deeper is involved. Otherwise they would just do the research required to figure out the answer (that it’s not a scam) and sign up.
Also, I’ve seen relatively high status people who have relevant background dismissing this in ridiculous ways (e.g. claiming it is no better than mummification, or “turning hamburger into a cow”). I don’t think they would risk their reputation without doing the research if there wasn’t some kind of serious curiosity-stopping bias going on.
Most people don’t have the scientific literacy to distinguish good science from bad with any amount of research, and know this. Even if you had some reason to think that this probably-a-scam thing had enough of a chance to be worth expending nontrivial effort to check out, it’s hard to distinguish a good scam from a weird truth.
Once someone’s decided (unconsciously, most likely) that they want to avoid cryonics because it seems sketchy, they will generate excuses to avoid it without having to actually say outright that they think it’s a scam, because accusing people without hard (scientific, not Bayesian) evidence of guilt is socially inappropriate.
I’m not sure about the high status people with relevant background information, though I do note that that demographic seems more likely per capita to actually sign up. However, I would speculate that academics may be leery of being seen to take seriously anything “fringy” that might damage their reputation, and will endorse any line of argument that appears to oppose it.
It’s not just scientific literacy though: the “hamburger into a cow” line comes from Society for Cryobiology fellow Arthur W. Rowe.
I think this doesn’t explain nearly as much as you think it does. There are only a couple hundred people on the planet actually signed up for cryonics. There are plenty of “sciency” scams which have attracted more people and more money (per person). Why doesn’t cryonics attract those sorts of fools?
Actual scams are designed to make the scammer money, and the scammer therefore has the funds to make the scam more effective. Cryonics, having been optimized for a different goal, resembles a scam only accidentally and therefore only somewhat; it is most like an underfunded, incompetently run scam.
Should we therefore encourage cryonics companies to mimic the tactics of scammers, making cryonics more popular and thus less expensive for individual people to sign up?
Or in other words: maybe instead of donating to SIAI, we should donate to a fund to hire a good PR agency for a cryo corporation.
While that may outperform the current approach, it is almost surely not the best of all possible approaches.
You won’t be revived if everyone is turned into paperclips first.
Perhaps because it isn’t built to? Indeed, “sciency” scams often rely on a lot of woo, whereas cryonics appears to require getting past the “soul” idea...
More like a couple thousand, actually: about 200 preserved to date (http://en.wikipedia.org/wiki/Cryonics), plus 424 funded members with contracts at CI alone.
Still very few, though.
Here’s a list of classic red flags for a scam—the first one I found by Googling “scam red flags”. Not sure there are that many matches with cryonics.
Yeah, that’s a good point.
And yet… reading the list, I don’t so much get the impression that cryonics is unscamlike as that it belongs to a different genre of scam.
But I notice that I am rationalizing, and I need to go update.
ETA: No, I see now. Cryonics resembles-in-genre a religion. If you follow a certain burial rite, you will have eternal life in a better world. People generate religious objections: they say that it is morally wrong, that it destroys the immortal soul. People treat cryonics as though accepting it as valid would require them to give up their religious beliefs, even if those beliefs are actually compatible with cryonics.
Furthermore, cryonics doesn’t sell itself as a religion: it doesn’t claim to have answers to the great terrible questions that unsettle the mind. So people looking for a new religion tend not to choose it.
This leaves open the question of why cryonics is uncommon among self-professed atheists. Do so few “unbelievers” truly disbelieve?
I strongly suspect that it is more common per-capita among atheists than theists. If that is so, it suggests that maybe cryonics is fooling some atheists by setting off their religion-alarms, and/or the like-a-religion objection is only one of a suite of reasons why cryonics is unpopular.
Cryonics may be less uncommon among atheists than among theists, but that’s not what interests me.
Being cryopreserved is much more uncommon among atheists than not being cryopreserved is among atheists. That requires explanation.
The absurdity heuristic is a good enough explanation to first order.
The fact that cryonics is becoming more, not less, common is (weak) evidence that there’s good reasoning behind it; this evidence is strengthened by noting that most irrational fast-growing fringe movements (e.g., Jehovah’s Witnesses) achieve their growth by making members afraid that they will lose out if they don’t evangelize. Cryonics doesn’t have that dynamic†.
† Even though cryonics would be cheaper if it were more popular, that’s more of a group coordination problem than an urgent personal incentive. I don’t see a lot of cryonics advocates feeling pressured to evangelize for it, just a lot of people who happen to think that they’re obviously right on the issue.
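The “weak evidence” claim can be made concrete with a toy Bayes-factor calculation. All probabilities below are made-up illustrative numbers, not estimates; the point is only that when growth is common under both hypotheses, the likelihood ratio stays near 1 and the posterior barely moves:

```python
# Toy Bayes-factor sketch: how much should observing that cryonics is
# growing shift belief that there is good reasoning behind it?
# All numbers are hypothetical, chosen only to show the structure.

prior_good = 0.10          # hypothetical prior P(good reasoning)

# Growth is fairly likely either way, so the likelihood ratio is modest.
p_growth_given_good = 0.8  # sound ideas tend to spread, if slowly
p_growth_given_bad = 0.6   # plenty of unsound ideas spread too

bayes_factor = p_growth_given_good / p_growth_given_bad  # ≈ 1.33

# Update in odds form, then convert back to a probability.
odds_prior = prior_good / (1 - prior_good)
odds_post = odds_prior * bayes_factor
posterior_good = odds_post / (1 + odds_post)

print(f"Bayes factor: {bayes_factor:.2f}")      # 1.33
print(f"Posterior:    {posterior_good:.3f}")    # ≈ 0.129, up from 0.100
```

A likelihood ratio near 1 is exactly what “weak evidence” means; the footnoted observation (no fear-driven evangelism dynamic) matters because it raises `p_growth_given_good` relative to `p_growth_given_bad`, strengthening the ratio somewhat.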
As was pointed out elsewhere in this thread, the absurdity heuristic alone doesn’t explain why cryonics is significantly less common than, say, Raëlism.
I don’t know the cause or cure, but I think geeks tend to be lousy at publicity.
Tentative theory—they’re independent-minded enough that they can’t really model people who want a little pixie dust (aka status, supernormal stimuli, or fantasies of value) sprinkled on things. Alternate theory: geeks like pixie dust, too, but it’s a different sort of pixie dust.
Nitpick: not all geeks are aspiring rationalists.
Cryonics is pretty much the opposite of all of those, in fact.
Except, usually, 10. “Something Doesn’t Feel Right” is a pretty good description of most people’s reaction to cryonics.
Yes, because those are the red flags associated with successful scams. Cryonics retains the creep-out factors, but misses the niche of effective marketing.
If you’d like a little science fiction on the subject, try Simak’s Why Call Them Back from Heaven?.