I’m middle-aged now, and a pattern I’ve noticed as I get older is that I keep having to adapt my sense of what is valuable, because desirable things that used to be scarce for me keep becoming abundant. Some of this is just growing up, e.g. when I was a kid my candy consumption was regulated by my parents, but then I had to learn to regulate it myself. I think humans are pretty well-adapted to that sort of value drift over the life course. But then there’s the value drift due to rapid technological change, which I think is more disorienting. E.g. I invested a lot of my youth into learning to use software which is now obsolete. It feels like my youthful enthusiasm for learning new software skills, and comparative lack thereof as I get older, was an adaptation to a world where valuable skills learned in childhood could be expected to mostly remain valuable throughout life. It felt like a bit of a rug-pull how much that turned out not to be the case w.r.t. software.
But the rise of generative AI has really accelerated this trend, and I’m starting to feel adrift and rudderless. One of the biggest changes from scarcity to abundance in my life was that of interesting information, enabled by the internet. I adapted to it by re-centering my values around learning skills and creating things. As I contemplate what AI can already do, and extrapolate that into the near future, I can feel my motivation to learn and create flagging.
If, and to the extent that, we get a “good” singularity, I expect that it will have been because the alignment problem turned out to be not that hard, the sort of thing we could muddle through improvisationally. But that sort of singularity seems unlikely to preserve something as delicately balanced as the way that (relatively well-off) humans get a sense of meaning and purpose from the scarcity of desirable things. I would still choose a world that is essentially a grand theme park full of delightful experience machines over the world as it is now, with all its sorrows, and certainly I would choose theme-park world over extinction. But still … OP beautifully crystallizes the apprehension I feel about even the more optimistic end of the spectrum of possible futures for humanity that are coming into view.
But that sort of singularity seems unlikely to preserve something as delicately balanced as the way that (relatively well-off) humans get a sense of meaning and purpose from the scarcity of desirable things.
I think our world actually has a great track record of creating artificial scarcity for the sake of creating meaning (in terms of enjoyment, striving to achieve a goal, sense of accomplishment). Maybe “purpose” in the most profound sense is tough to do artificially, but I’m not sure that’s something most people feel a whole lot of anyway?
I’m pretty optimistic about our ability to adapt to a society of extreme abundance by creating “games” (either literal or social) that become very meaningful to those engaged in them.