I suppose the article does a good job answering some of the common objections, but I still think the most important thing that’s stopping people from signing up is the fact that they just don’t care: after all, life sucks, but at least then you die.
That said, there is one argument I find fairly powerful that articles like this don’t usually touch on (for somewhat obvious reasons): the point made, for example, in the preface to the finale of the Ultimate Meta Mega Crossover, that if we actually live in an infinite multiverse/many-worlds/nested simulverse/etc., we may be bound to find ourselves resurrected by someone eventually anyway, and cryonics could be a way to try to make sure that someone is friendly.
I’m not really sure what to make of that argument though. I wonder if there’s anybody who’s signed up because of reasons like that, despite not having any interest in cryonics in general?
“could be a way to try to make sure that someone is friendly.”
I don’t believe in nested simulverses etc., but I feel I should point out that even if some of those things were true, waking up one way does not preclude also waking up in one or more of the other ways.
“but I feel I should point out that even if some of those things were true waking up one way does not preclude waking up one or more of the other ways in addition to that.”
You’re right. I should have said “make it more likely”, not “make sure”.
“You mean none of what I mentioned? Why not?”
Same reason I don’t believe in god. As yet we have ~zero evidence for being in a simulation.
“You’re right. I should have said ‘make it more likely’, not ‘make sure’.”
Your odds of waking up in the hands of someone extremely unfriendly are unchanged. You’re just making it more likely that one fork of yourself might wake up in friendly hands.
“As yet we have ~zero evidence for being in a simulation.”
We have evidence (albeit no smoking gun) for eternal inflation; we have evidence for a flat, and thus infinite, universe; and string theory is currently our best guess at what the theory of everything looks like. These all predict a multiverse in which everything possible happens, and in which somebody should therefore be expected to simulate you.
“Your odds of waking up in the hands of someone extremely unfriendly are unchanged. You’re just making it more likely that one fork of yourself might wake up in friendly hands.”
Well, I think that qualifies. Our language is a bit inadequate for discussing situations with multiple future selves.
I find that about as convincing as “if you see a watch there must be a watchmaker” style arguments.
I don’t see the similarity here.
There are a number of theorized ways to test whether we’re in various kinds of simulation, and so far they’ve all turned up negative.
Oh?
String theory is famously bad at predicting even mundane things, however elegant it may be, and “flat” is not the same as “infinite”.
It basically makes no new testable predictions right now. Doesn’t mean that it won’t do so in the future. (I have no opinion about string theory myself, but a lot of physicists do see it as promising. Some don’t. As far as I know, we currently know of no good alternative that’s less weird.)
By “the preface” do you mean the “memetic hazard warnings”?
Concepts contained in this story may cause SAN Checking in any mind not inherently stable at the third level of stress. Story may cause extreme existential confusion. Story is insane. The author recommends that anyone reading this story sign up with Alcor or the Cryonics Institute to have their brain preserved after death for later revival under controlled conditions. Readers not already familiar with this author should be warned that he is not bluffing.
I don’t think that is claiming that it is a rational response to claims about the world.
“we may be bound to find ourselves resurrected by someone eventually anyway, and cryonics could be a way to try to make sure that someone is friendly.”
This is a quantum immortality argument. If you actually believe in quantum immortality, you have bigger problems. Here is Eliezer offering cryonics as a solution to those, too.
“By ‘the preface’ do you mean the ‘memetic hazard warnings’?”
Yes.
“I don’t think that is claiming that it is a rational response to claims about the world.”
I don’t get this. I see a very straightforward claim that cryonics is a rational response. What do you mean?
“This is a quantum immortality argument. If you actually believe in quantum immortality, you have bigger problems. Here is Eliezer offering cryonics as a solution to those, too.”
I’ve read that as well. It’s the same argument, essentially (quantum immortality doesn’t actually have much to do with MWI in particular). Basically, Eliezer is saying that quantum immortality is probably true, it could be very bad, and we should sign up for cryonics as a precaution.
Why would someone make major decisions based on metaphysical interpretations of quantum physics that lack experimental verifiability? That seems like a poor life choice.
Tegmark 4 is not related to quantum physics. Quantum physics does not give an avenue for rescue simulations; in fact, it makes them harder.
As a simulationist, you can somewhat salvage traditional notions of fear if you retreat into a full-on absurdist framework where the point of your existence is to give a good showing to the simulating universes; alternately, risk avoidance is a good Schelling point for a high score. Furthermore, no matter how much utility you will be able to attain in Simulationist Heaven, this is your single shot to attain utility on Earth, and you shouldn’t waste it.
It does take the sting off death though, and may well be maladaptive in that sense. That said—it seems plausible a lot of simulating universes would end up with a “don’t rescue suicides” policy, purely out of a TDT desire to avoid the infinite-suicidal-regress loop.
I am continuously amused by how Catholic this cosmology ends up being by sheer logic.
you can somewhat salvage traditional notions of fear … Simulationist Heaven … It does take the sting off death though
I find the often prevalent optimism on LW regarding this a bit strange. Frankly, I find this resurrection stuff quite terrifying myself.
“I am continuously amused by how Catholic this cosmology ends up being by sheer logic.”
Yeah. It does make me wonder whether we should take a much more critical stance toward the premises that lead us here. Granted, the universe is under no obligation to make sense to us; but isn’t it still a bit suspicious that it’s turning out to be kind of bat-shit insane?
Perhaps you shouldn’t. That said, it is recommended by Eliezer Yudkowsky, and his words often weigh quite heavily here.
I don’t necessarily agree that a lack of experimental verifiability means we shouldn’t take something into account when making decisions, if we have enough reason to think it’s true nevertheless.
“I don’t believe in nested simulverse etc but I feel I should point out…”
You mean none of what I mentioned? Why not?
“Why would someone make major decisions based on metaphysical interpretations of quantum physics that are lacking experimental verifiability?”
As opposed to the usual “I’ve had a few beers and it seemed like a good idea at the time”..? X-)
“it is recommended by Eliezer Yudkowsky, and his words often weigh quite heavily here.”
Arguments from authority are equally ill-advised :)
By the way, you will find that Mr. Yudkowsky’s positions are not held by everyone here.
Of course not. But whether people here agree with him or not, they usually at least think that his arguments need to be considered seriously.