It occurs to me:
You are more optimistic than I that our current AIs will care enough to spend 1/billionth of their resources on keeping us alive.
You are separately more optimistic than I that one could expect the high bidders trading for “we saved the humans” to care about not merely our well-being but our agency.
It seems like those maybe share a crux in how natural niceness is (which is… not exactly double-counting, but if you were to change your mind about that, both of those numbers would probably drop. Is that right?)
Yes, there is an underlying correlation. E.g., if I thought that humans on reflection wouldn’t care at all about bailing out other humans and satisfying their preferences to remain physically alive, that would be evidence bearing both on trade and on AIs.