I’d say that’s a good summary. To complete the grim picture: the vast swarms of uploads toiling for the absolute minimum subsistence would be annihilated en masse whenever they became even slightly obsolete, or otherwise a suboptimal use of the hardware they run on. A recession in an upload economy would have an effect similar to a bad harvest causing a cataclysmic famine among Malthusian farmers. As Hanson put it, “When life is cheap, death is cheap as well.”
On top of all that, to make things even more ghastly from the perspective of LW ideals, Hanson has made the shrewd observation that uploads may well end up having their minds indoctrinated with religion and ideology, not trained in LW-style epistemic rationality (beyond what’s necessary for their main task, of course). Such indoctrination would make their subsistence more bearable, their behavior more productive and cooperative, and the acceptance of their eventual demise easier.
So the universe gets tiled with a sea of barely surviving people similar to humans, who are optimized towards whatever makes them most likely to remain productive in their dreary existence.
So like, the opposite of everyone becoming happy and rich.
Well, in Hanson’s words, from “Poor Folks Do Smile”:
Our ancestors were designed with pleasure and pain to motivate them in a near subsistence world. … Our descendants will be similarly adapted to find joy and meaning in their near subsistence lives.
I guess it was a stretch to say that it’s not like everyone becoming happy.
I don’t think that uploads would require nearly as much matter to lead a happy life. Basically right now if I want to have a nice, warm, comfortable place to sleep and a stomach full of nutritious food, I need to rearrange lots of stuff to physically construct those.
Contrast that with an upload, who can simply have a computer stimulate his emulated neurons in such a way as to make him believe he has those things.
I think it’s likely that uploads will need a simulated environment anyway, and I doubt that a pleasant one is harder to simulate than an unpleasant one.
For that reason, I personally think an upload is more likely to end up in a state of sensory darkness and deprivation (which I would find pretty terrifying) from receiving no stimulation at all than in an unpleasant simulation from being unable to afford a nicer one.
uploads may well end up having their minds indoctrinated with religion and ideology, not trained in LW-style epistemic rationality
IIRC, he says that religion and ideology are symptoms of modern-day wealth/excess, and future folk won’t be able to afford non-adaptive/incorrect beliefs. He calls our current position in history “The Dreamtime.”
we live in the brief but important “dreamtime” when delusions drove history. Our descendants will remember our era as the one where the human capacity to sincerely believe crazy non-adaptive things, and act on those beliefs, was dialed to the max.
IIRC, he says that religion and ideology are symptoms of modern-day wealth/excess, and future folk won’t be able to afford non-adaptive/incorrect beliefs. He calls our current position in history “The Dreamtime.”
Well, it is possible that he has said inconsistent things at different times, but in the posts I linked in my above comment, he argues (in my opinion plausibly) that the social mechanisms of control and coordination for ems may well end up being based on (epistemically) irrational beliefs similar to those in historical human societies, i.e. religion, ideology, strict custom, etc. (“Onward Christian robots!,” as he put it.)
[Edit—forgot to add: ] And of course, adaptive and correct beliefs are not always one and the same, and it’s a huge fallacy to argue as if they were.