you don’t think there are humans whom i can expect to reliably reward-me-as-per-LDT after-the-fact? it doesn’t have to be a certainty, i can merely have some confidence that some person will give me that share, and weigh the action based on that confidence.
That might happen, but they wouldn’t be doing it because they’re maximizing their utility via acausal trade, they’d be doing it because they value reciprocity.
why wouldn’t it be because they’re maximizing their utility via acausal trade?
do you also think people who don’t-intrinsically-value-reciprocity are doomed to never get picked up by rational agents in parfit’s hitchhiker? or doomed to two-box in newcomb?
to take an example: i would expect that even if he didn’t value reciprocity at all, yudkowsky would reliably cooperate as the hitchhiker in parfit’s hitchhiker, or one-box in newcomb, or retroactively-give-utility-function-shares-to-people-who-helped-if-he-grabbed-the-lightcone. he seems like the-kind-of-person-who-tries-to-reliably-implement-LDT.
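the "weigh the action based on that confidence" step above is just an expected-value comparison. a minimal sketch (all numbers are illustrative assumptions, not from the dialogue):

```python
# Hedged sketch: deciding whether to help, given only partial confidence
# that the helped party will honor the deal after-the-fact (LDT-style).
# The payoff numbers below are made up for illustration.

def expected_value_of_helping(p_reward: float, reward: float, cost: float) -> float:
    """Expected utility of helping, where p_reward is the confidence that
    the helped party later gives the promised share (reward), and cost is
    what helping costs up front."""
    return p_reward * reward - cost

# Helping is worthwhile whenever p_reward * reward > cost,
# so certainty is not required -- only enough confidence:
ev = expected_value_of_helping(p_reward=0.5, reward=100.0, cost=10.0)
print(ev)  # 40.0: positive, so helping wins in expectation
```

the point being that a 50% credence in after-the-fact repayment already suffices here; the reliability of the counterparty only needs to clear the cost/reward ratio, not reach 1.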