Don’t we feel gratitude and warmth and empathy and care-for-the-monkey’s-values such that we’re willing to make small sacrifices on their behalf?
People do make small sacrifices on behalf of monkeys? Like >1/billion of human resources is spent on doing things for monkeys (that's just >$100k per year). And, in the case of AI takeover, 1/billion could easily suffice to avoid literal human extinction (with some chance of avoiding mass fatalities due to AI takeover). This isn’t to say that after AI takeover humans would have much control over the future, or that the situation wouldn’t be very bad on my views (or on the views of most people, at least on reflection). Like, even if some (or most/all) humans survive, it’s still an x-risk if we lose control over the longer-run future.
Like, I agree with the claim that people care very little about the interests of monkeys and don’t let those interests slow them down in the slightest. But the exact amount of caring humans exhibit probably would suffice for avoiding literal extinction in the case of AIs.
I think your response is “sure, but AIs won’t care at all”:
You have to be careful with the metaphor, because it can lead people to erroneously assuming that an AI would be at least that nice, which is not at all obvious or likely for various reasons (that you can read about in the book when it comes out in September!).
Agree that it’s not obvious, and I tentatively expect AIs that take over are less “nice” in this way than humans are. But I think it’s pretty likely (40%?) they are “nice” enough to care about humans some tiny amount that suffices for avoiding extinction (while also not having specific desires about what to do with humans that interfere with this), and there is also the possibility of (acausal) trade resulting in human survival. In aggregate, I think these make extinction less likely than not. (But these don’t mean that the value of the future isn’t (mostly) lost.)