Thanks!
> Do you think that, absent AI power-seeking, this dynamic is highly likely to lead to human disempowerment? (If so, then I disagree.)
As a sort-of answer, I would just say that I am concerned that people might knowingly and deliberately build power-seeking AIs and hand over power to them, even if we have the means to build AIs that are not power-seeking.
> I said “absent misalignment”, and I think your story involves misalignment?
It does not. The point of my story is: “reality can also just be unfriendly to you”. There are trade-offs, and so people optimize for selfish, short-term objectives. You could argue people already do that, but cranking up the optimization power without fixing that seems likely to be bad.
My true objection is more that I think we will see extreme safety/performance trade-offs due to technical inadequacies, i.e. (roughly) that the alignment tax is large (although I don’t like that framing). In that case, you end up with misalignment despite also having a solution to alignment: competitive pressures prevent people from adopting the solution.