When I think of poor hands dealt to humans these days, I think first of death and old age. Everyone’s got to have some intelligence level or other, and the important thing from a fun-theoretic perspective is that it ought to increase over time, not decrease as it does now.
This is a really important point, and I want to make certain that I get it right—especially to you personally, Mr. Yudkowsky, since you seem like someone with a higher-than-epsilon chance of actually doing something about all of this.
Solve people’s lack of motivation and expertise for self-improvement before you handle our old age and death, please. Please.
Because, speaking as someone caught deep in the throes of a flawed optimization loop, the prospect of being caught in such a loop for centuries is terrifying.
Just as initial conditions are hideously unfair, life-paths are also hideously unfair, and the universe does not owe anyone the capacity, let alone the opportunity, to achieve meaning and purpose and happiness in their life.
And I don’t know about others, but being condemned to an eternity as myself, damned to struggle futilely to achieve some understanding or purpose that will always be one level higher than I can reach, seems far, far worse than simply recycling my constituent hardware, freeing up my clock cycles, and letting something else take my place. Given the utterly unfair, stochastic nature of the universe, maybe it will be something better. But if it isn’t, at least it won’t have to suffer with its inadequacy forever, either.
Don’t get my doom-and-gloom wrong, though—I would love to be immortal, and free, and capable of pursuing happiness. But I am terrified that, in my current mental configuration, immortality would simply mean an eternity of self-inflicted suffering. And I am most certainly not alone in this fear.
Keep working on your Series here—they’re insightful and important, and they’re the first thing I’ve heard in almost 20 years that doesn’t sound like utter bullshit—and keep your fire to save the world, because that fire is the one thing in the universe worth protecting—but remember that some parts of the world may not be better off saved, if you can’t heal them first.
I disagree with the fundamental premise here. I would much rather be immortal and stuck in an akratic loop for a few centuries—because a few centuries is very finite and I’ll still be alive at the end.
Whereas even if I become absurdly productive and self-controlled, I will still die like a dog of disease and decay in the likely event that there is no Singularity and SENS fails.
Remember Steve Jobs: he used all the cutting-edge treatments and even used his billions to buy his way to the head of the transplant line—and died anyway.
Akrasia doesn’t begin to describe the problem. I’m going to quote a line from HPMoR that resonated strongly with me:
“You could call it heroic responsibility, maybe,” Harry Potter said. “Not like the usual sort. It means that whatever happens, no matter what, it’s always your fault. Even if you tell Professor McGonagall, she’s not responsible for what happens, you are. Following the school rules isn’t an excuse, someone else being in charge isn’t an excuse, even trying your best isn’t an excuse. There just aren’t any excuses, you’ve got to get the job done no matter what.”
I get heroic responsibility. I’ve felt it in my gut since I was five. When I was 13, and it finally dawned on me that everyone around me was miserable and terrified and angry because the God they were praying to wasn’t listening, my immediate resolution was to abandon worshipping him, and attempt to become a better God myself.
But, some of us aren’t as smart as others, or as charismatic, or as willful, or as physically or mentally strong or resilient. We hear the call, but we don’t have what it takes to answer it properly.
And that’s our fault, too.
And we can’t just stop listening. Not knowing that people need saving isn’t any more of an excuse than not being strong enough to save them. Re-wiring your mind to not feel the crushing need to save them is ALSO a cop-out.
So… yeah. And lest anyone think I’m trying to be self-congratulatory here about my “superior morality”, please understand that I am most assuredly not doing it right—this is a bug, not a feature.
I disagree with the fundamental premise here. I would much rather be immortal and stuck in an akratic loop for a few centuries—because a few centuries is very finite and I’ll still be alive at the end.
Meanwhile, all the immortals with a greater sense of urgency about things will have outstripped you beyond any hope of ever catching up.
I don’t think a longer life is a good reason for taking things easier.
I think you misread. The choice is between fixing akrasia now, and getting immortality now. If you go for curing akrasia, you’ll probably die before immortality gets done, even if you do get to enjoy the benefits of being insanely effective in the meantime. Whereas if you first make sure to not die, you can fix your akrasia later, and then be insanely effective for the rest of however long your new lifespan is.
I’d certainly take an unlimited lifespan plus akrasia-cure-300-years-later over normal human lifespan + akrasia-cure-now.
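To put toy numbers on that trade-off, here is a minimal sketch; every figure below is invented purely for illustration (a 50-year remaining lifespan, 10,000 years standing in for "unlimited", and akratic years credited at 20% effectiveness):

```python
# Toy comparison of the two options, measured in "fully effective years" of output.
# All numbers are made up for illustration; only the qualitative conclusion matters.

def effective_years(total_years, akratic_years, akratic_productivity=0.2):
    """Total output, counting akratic years at a fraction of full effectiveness."""
    productive_years = max(total_years - akratic_years, 0)
    return akratic_years * akratic_productivity + productive_years

# Option A: akrasia cured now, but an ordinary remaining lifespan (~50 years).
option_a = effective_years(total_years=50, akratic_years=0)

# Option B: secure not-dying first; akrasia cured only after 300 years.
# (10,000 stands in for "unlimited" just to keep the number finite.)
option_b = effective_years(total_years=10_000, akratic_years=300)

print(option_a)  # 50.0
print(option_b)  # 9760.0 -- and it keeps growing as the lifespan does
```

However the assumed numbers are tweaked, the second option wins as soon as the lifespan gets long enough, which is the point being made above.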
The choice is between fixing akrasia now, and getting immortality now.
By the law of comparative advantage, on a planet of billions of people, both can be worked on at once. Since Eliezer—the person originally addressed—is not a biologist, there’s nothing he’s likely to be able to do about senescence, beyond convincing other people that curing death would be great and hoping they come up with something. Fixing akrasia, though, is something about which there is as yet no specialised knowledge, so he has about as much chance of cracking it as anyone of similar smarts.
Whereas if you first make sure to not die
Ok, that’s my New Year Resolution: don’t die. Sorted!
I’d certainly take an unlimited lifespan plus akrasia-cure-300-years-later over normal human lifespan + akrasia-cure-now.
I’d take all of it right now. And a pony. (Yes, I’m rejecting the hypothetical. I do that.)
Well...
Well done, you missed the point.