Infinite torture means tweaking someone beyond recognition. For a threat to come close to what infinite torture literally means, the victim has to retain the means to care about it; otherwise it’s an empty threat. You also have to account for acclimatization, making sure one doesn’t get used to it, or it wouldn’t be effective torture anymore.
I doubt this can be done while retaining one’s personality or personhood to such an extent that the threat of infinite torture would be directed at you, or a copy of yourself, and not at something vaguely resembling you. In that case we might as well talk about torturing a pain-experience-optimization-device.
Another thought is that what we really fear isn’t torture but bodily injury and/or dying. If someone threatened to cut off your arms, what would really be bad about it are the consequences of living without arms, not the pain associated with it. If I can’t die anyway, as in an infinite torture scenario, I don’t even need to remain functional; all that’s left is the pain experience. Now, experiencing pain isn’t fun by definition, but neither is it a very complex issue. It’s not like you’d be contemplating the consequences much while being tortured for the past infinite years.
There are other forms of torture. But all the other forms, the ones we do not directly associate with the term pain, are even harder to achieve. If you know yourself, no AI will be able to cause you psychological problems and retain your personality at the same time. That is, it could surely make you suffer from depression, but since you are not the type of person who’d naturally suffer from depression, it’s not you. It might resemble you, but basically it’s someone else. And that’s as good as no torture at all, really.
Just some quick thoughts that came to my mind today while taking a shower :-)
Infinite torture is too big to think about. Try this:
“Imagine yourself naked and shut up in an iron coffin, that is put into the furnace and heated to a fiery red heat. You know what it is like to burn a finger—how much more awful will it be to find yourself in that coffin, burning all over, and yet God in His power will not let you be consumed. You suffocate but cannot expire. You cannot endure it—but you will, for a thousand years, and a thousand thousand, and that is less than a moment compared with the eternity of suffering that lies before you.”
That is a reconstruction of a recollection of something I once read that a Christian preacher of former times said. For more along these lines, just try this Google search. “Terror of hell” was a recognised psychological malady back in the day.
Will you blithely say, as you fall into the AI’s clutches, “it won’t be me experiencing it”?
I don’t know.
Who’s who? Conjoined Twins Share a Brain:
The twins’ happiness is apparent to all around them—they laughed and played the entire day the “Nightline” crew was with them. “They … have this connection between their, what’s called the thalamus, between the thalami, one in each to the other,” said Dr. Doug Cochrane, the twins’ pediatric neurologist. “So there’s actually a bridge of neural tissues in these twins, which makes them quite unique.” It also makes them impossible to separate. Mom Felicia Hogan and others believe the connection has given the twins unique powers. “They share a lot of things normal conjoined twins don’t,” she said. “They have special abilities to see each other, see what each other’s seeing through each other’s eyes.”
Why choose the future over the present? I just cannot identify enough with that future being who is tortured to feel anything about it now. I have a very good idea of what I want at every moment. And either I win like this, or I don’t care at all what’s going to happen otherwise. That is, I’m not going to accept any tradeoff, any compromise big enough to be extorted from me by means of infinite torture.
Anyway, as I wrote before, given the Mathematical Universe/Many-Worlds/Simulation Argument, everything good and bad is happening to you, i.e. it is timeless.
I’ve had several blood vessel cauterizations in my nose over my lifetime. They just use a kind of soldering iron to burn your mucous membrane, all without any anesthesia. I always cried when I was younger. Now I’m so used to it that I don’t care much anymore. So yeah, an AI might do that to my eyes, but why care? It’ll have to restore them again anyway. What a waste of resources :-)
Oh and that preacher has no imagination. I can imagine scenarios MUCH worse than that.
What would your answer be to this conundrum?
I’d choose (b) without the amnesia, because I do not cooperate with such scum, and it would be an interesting experience and a test to figure out what to choose in similar cases.
Ming would not ask this, but—Are you sure you want to be tortured? Finding third options in bad scenarios is cool, but not if they’re strictly worse than the options already presented. This is like choosing suicide in Newcomb’s problem.
That is, it could surely make you suffer from depression, but since you are not the type of person who’d naturally suffer from depression, it’s not you.
Either you really don’t understand depression, or your definition of identity revolves around some very transient chemical conditions in the body.
Good point. I should have picked up on that.
I’m a manic depressive. Does this mean I’m a different person at each level along the scale between mania and depression?
It wasn’t meant that literally. What I meant, rather, is that the AI could make you fear pink balloons and then expose you to a world full of pink balloons. But if you extrapolate this reasoning, then the AI might as well just torture a torture-optimization-device.
Here is where I’m coming from. All my life since abandoning religion, I have feared that something could happen to my brain that would make me fall for such delusions again. But I think that fear is unreasonable. If I came to want to be religious again, I wouldn’t care, because that would be my preference then. In other words, I’m suffering from my imagination of an impossible being that is me but not really me, one that is dumb enough to strive to be religious yet fears being religious.
That means some AI could torture me, but not infinitely so while retaining anything that the me from before the torture would care about.
P.S. I’m just having some fun trying to figure out why some people here are so horrified by such scenarios. I can’t help but feel nothing about it.
I assign high negative utility to the torture of any entity. The scenario might be more salient if the entity in question is me (for whatever definition of identity you care to use), but I don’t care much more about myself than I do other intelligences.
The only reasons we care about other people are either to survive, i.e. to get what we want, or because it is part of our preferences to see other people being happy. Accordingly, trying to maximize happiness for everybody can be seen as purely selfish: either as an effort to survive, by making everybody want to make everybody else happy in case somebody else wins instead of you, or simply because it makes oneself happy.
You can reduce every possible motivation to selfishness if you like, but that makes the term kind of useless; if all choices are selfish, describing a particular choice as selfish has zero information content.
You should be more cautious about telling other people what their motivations are. I would die to save the world, and I don’t seem to be alone in this preference. And this neither helps me survive nor makes me momentarily happy enough to offset the whole dying thing.
That terminology is indeed useless. All it does is obfuscate matters.
What’s your point anyway?
You should be careful not to conflate “preference” and “things that make oneself happy”. Or make that a more clearly falsifiable component of your hypothesis.
Why would anyone have a preference detached from their personal happiness? I do what I do because it makes me feel good, and it makes me feel good because I think it is the right thing to do. Doing the wrong thing deliberately makes me unhappy.
I don’t care much more about myself than I care about other intelligences.
I care about other intelligences and myself to an almost equal extent.
I care about myself and other intelligences.
I care about myself. I care about other intelligences.
I care about my preferences.
What does it mean to care more about others? Who’s caring here? If you want other people to be happy, why do you want it if not for your own comfort?
I’m vegetarian because I don’t like unnecessary suffering. That is, I care about myself not feeling bad, because if others are unhappy, I’m also unhappy. If you’d rather die than cause a lot of suffering in others, that is not to say that you care more about others than about yourself; that is nonsense.
Severe chronic pain fouls up people’s lives, even if the pain isn’t related to damage. Pain hurts, and it’s distracting.
I really wish you’d think more before you post.
I’m not sure how this is related to what I was talking about. Are you suggesting that you’d be tortured by being free to do what you want, except that you suffer from chronic pain along the way without being able to stop it?
I’m sorry, this is just an open thread comment, not a top-level post. Aren’t we allowed to just chat and get feedback without thoroughly contemplating a subject?
Another thought is that what we really fear isn’t torture but bodily injury and/or dying.
That’s what you said.
I’m not sure what you mean by your second sentence immediately above, but, while I’m not convinced by the infinite torture scenario that’s worrying some people here (I probably don’t have a full understanding of the arguments), I do think extreme pain (possibly with enough respite to retain personality) would be part of an infinite torture scenario.
As it happens, I have a friend with trigeminal neuralgia—a disorder of serious pain without other damage. It can drive people to suicide.
I wouldn’t take that sort of problem lightly in any case, but knowing someone who has one does add something of an edge.
As for what people are allowed here, I don’t have the power to enforce anything. I am applying social pressure because I don’t come here to read ill-thought-out material, and even though I’m answering you, I don’t enjoy addressing the details of what’s wrong with it.
Awww, I’ll leave you alone now. I’ve maybe posted 50 comments since the beginning of LW, so don’t worry. The recent surge was just triggered by this fascinating incident where some posts and lots of comments have been deleted to protect us from dangerous knowledge.
I wish you good luck building your space empires and superintelligences.
Sorry again for stealing your precious time :-)
There’s nobody compelling you to reply to the comments you feel haven’t thoroughly contemplated a subject.
A more direct problem with 3^^^^^^3 torture to me:
Duplicate moments have no value As Far As I Am Concerned.
Unless someone is being tortured in 3^^^^^^3 different ways, or is carrying on with an amazingly extended (and non-repetitious) life while the infinite torture is occurring, most of the moments are going to be precise repeats of previous moments: a mind can occupy only finitely many distinguishable states, so by the pigeonhole principle the experiences must eventually repeat exactly.
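For readers unfamiliar with the notation: 3^^^^^^3 is Knuth’s up-arrow notation, where a single arrow is exponentiation and each extra arrow iterates the level below it. Here is a minimal sketch of the standard recursion; the helper name up_arrow is just my illustrative choice:

```python
def up_arrow(a: int, n: int, b: int) -> int:
    """Knuth's up-arrow with n arrows: one arrow is plain exponentiation,
    and each additional arrow applies the level below it b - 1 times."""
    if n == 1:
        return a ** b
    result = a
    for _ in range(b - 1):
        result = up_arrow(a, n - 1, result)
    return result

# Small cases are computable: 2^^3 = 2^(2^2) = 16.
assert up_arrow(2, 2, 3) == 16

# 3^^^^^^3 is up_arrow(3, 6, 3). Already 3^^^3 is a power tower of 3s
# about 7.6 trillion levels high, so the six-arrow version dwarfs the
# number of distinguishable states any mind could ever pass through --
# which is why most of the torture-moments would be exact repeats.
```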