So do you think that instead we should just be trying to not make an AGI at all?
Not really, I do want to make an AGI, primarily because I very much want a singularity to happen, as it represents hope to me, and I have very different priors than Eliezer or MIRI about how doomed we are.
So you think that, since morals are subjective, there is no reason to try to control what happens after the singularity? I really don’t see how that follows.