That doesn’t suggest humans are very aligned; rather the opposite. It means that if we end up with between 2 and 20 AGIs (and those numbers don’t seem unreasonable), between 1 and 10 of them would choose to destroy humanity.
I think I might have started from a more pessimistic standpoint? It’s more like: I could also imagine living in a world where humans cooperate, not because they actually care about each other, but because they just pretend to? Introspection tells me that does not apply to myself, though maybe I evolved not to be conscious of my own selfishness? I am even less sure how altruistic other people are, because I haven’t asked lots of people: “Would you press a button that annihilates everyone after your death, if in return you get an awesome life?” On the other hand, cooperation would probably be hard for us in such a world, so this is not that surprising?