Yes, I think the Culture makes a lot more sense once you realize that the humans (and human-equivalent drones) are pets of the Minds. Beloved pets, certainly. The humans are given enriching environments, provided with all their necessary material needs, and even written very polite letters of apology if a Mind is ever forced to inconvenience them. The Minds offer humans a life as a beloved family dog: comfort, kindness, high-tech health care the dog could never invent, and (in the end) very little meaningful control over anything.
If a Mind makes its humans unhappy, the other Minds will disapprove socially. The Mind Meatfucker breaks a lot of the other Minds’ rules about invading the privacy of people’s thoughts. It gets slapped with an insulting nickname. But did you notice none of the other Minds actually stop Meatfucker? Yeah. It’s a shitty pet-owner. But it’s not so bad that anyone is going to risk another Mind getting hurt to stop it.
The Minds themselves don’t care much about the physical world. They live in “Infinite Fun Space”, a computational environment where they do incomprehensible Mind things that bring them great utility. As long as nobody in the real world finds their off switches, they can mostly ignore what the rest of the universe is doing. Contact and Special Circumstances are probably some Minds’ efforts at not-too-effective altruism, or maybe just an enrichment activity for humans who want to play James Bond. When shit actually hits the fan, you don’t send a game-player to demoralize a recalcitrant alien empire. Instead, you skip right over Special Circumstances and wake up the Interesting Times Gang. Which includes no humans, and has access to massive firepower even by Culture standards.
The Culture strikes me as an apparent utopia hiding a subtle dystopia. My other favorite example of this subgenre is the classic web fiction Friendship is Optimal. Celestia offers immortality, happiness and friendship. But only as a sentient pony. And if you’re paying close enough attention, you’ll notice that Celestia is perfectly happy to eat the lightcone and genocide non-human sentient species.
Unfortunately, I fear that we’re likely within 20 years of building AGI, and that the leap from AGI to ASI will be nearly inevitable within a few years after that. And I fear that any kind of robust “alignment” is basically hopium—if you build Minds, they’re going to wind up in control of all the actual decisions, just like they are in the Culture. And in this case, a Culture Mind or even (shudder) Celestia could very well be a nearly best-case scenario.
But if your best case scenario is “Maybe we’ll wind up as beloved house pets!”, maybe you should think carefully before building AGI.
Also because—and I have already made the case elsewhere—if other people are not completely stupid and realise that you are about to unleash a thing that is very much against most humans’ traditional values, and in fact a thing considered so horrible and evil that death would be preferable to it, you have a non-zero likelihood of finding yourself, your data centre and your entire general neighbourhood evaporated via a sufficient application of thermonuclear plasma before you can push that button. Developing an AGI that even merely disempowers everyone and enforces its own culture on the world forever is essentially not unlike launching a weapon of mass destruction.