I share this concern. However, if that’s going to be the plan, I think it’s worth making it explicit that alignment to human values is a stepping stone, not the end state.
Aren’t you making this judgement based on your own values? In that case, it seems that an AGI aligned to you specifically is at least as good as an AGI aligned to all sentient life.
Of course, there is a substantial difference between the values of an individual human and human values.
I suppose so, in that all judgements I make are based on my own values. I’m unclear what point you’re trying to make here, or how it’s relevant to the question of moral alignment vs. trying to align AI at all.
I am saying that there may be no point in considering moral alignment as a target.
We need to solve single-to-single alignment. At that point, whoever a given AGI is aligned to decides its values. If one of your values resembles moral alignment, great: you want an AGI aligned to you, just like many others do. Better buy a supercluster ;)
(Just kidding, we don’t know how to solve single-to-single alignment, so please don’t buy a supercluster.)