That is pretty much my view as well. The only substantial disagreements I have with Eliezer are the imminence of AGI (I think it’s not imminent at all) and the concept of a “Bayesian” superintelligence (Bayesian reasoning being nothing more than the latest model of thinking to be taken as being the key to the whole thing, the latest in a long line of fizzles).
I think criticism of the OP on the grounds of conjunction improbability is unfounded. The components are not independent, and no-one, including the OP, is claiming it is all correct in every detail.
ETA: And I’m not any sort of utilitarian. I don’t know what I am, but I know I’m not that, and I don’t feel a pressing need to have a completely thought out position.