To be clear, I’m expecting scenarios much more clearly bad than that. Something like: “the universe is almost entirely populated by worker drone AIs, and there are maybe 5 humans who are high all the time, and not even in a way they would have signed up for. Then there is one human who is being copied repeatedly and is starkly superintelligent thanks to boosts from their AI assistants, but who replaced almost all of their preferences with an obsession with growth in order to become the one with command of the first AI, didn’t manage to break out of that obsession using that AI, and then got progressively weirder in rapid jumps thanks to the intense things they asked it for help with.”
The general pattern here is that the crucible of competition tends to beat out of you whatever it was you wanted to compete for, and suddenly getting a huge windfall of a kind you have little experience with, one that puts you in a new realm of possibility, tends to be massively underused and doesn’t end up solving the subtle problems.
Nothing like “oh yeah, humanity generally survived and will be kept around indefinitely without significant suffering”.
My main crux here is that strong AI rights likely won’t be granted before near-full alignment to one person is achieved, and maybe not even then. A lot of the failure modes of giving AIs power in the gradual disempowerment scenario fundamentally route through giving AIs very strong rights, but thankfully this is disincentivized by default, because AIs with such rights would be more expensive to use.
The main way this changes the scenario is that the 6 humans remain broadly in control and aren’t just high all the time, and the first one probably doesn’t replace their preferences with pure growth, because at the billionaire level status dominates, so they are likely living very rich lives with their own servants.
No guarantees about anyone else surviving though:
No strong AI rights before full alignment: There won’t be a powerful society that gives extremely productive AIs “human-like rights” (and in particular strong property rights) prior to being relatively confident that AIs are aligned to human values.
I think it’s plausible that fully AI-run entities are given the same status as companies—but I expect that the surplus they generate will remain owned by some humans throughout the relevant transition period.
I also think it’s plausible that some weak entities will give AIs these rights, but that this won’t matter because most “AI power” will be controlled by humans that care about it remaining the case as long as we don’t have full alignment.