(Edit: It looks like the downvotes have been reversed, and the post may even be somewhat over-voted now. Thanks to whoever reversed the downvotes. I’m still curious why the karma fluctuated so violently.)
I’d appreciate knowing why this post has been downvoted. If you’ve downvoted this question post, I would be grateful if you could explain why. Please don’t downvote it to below 0 unless you have an explanation. I say this partly because it is a question, so it’s important to me that it is actually seen! Further, I struggle to understand how a question like this could be objectionable.
While I didn’t downvote it, I have a potential explanation. I think that the ability to acausally communicate with other universes is either absent[1] or contradicts most humans’ intuitions. As far as I understand acausal trade (e.g. coordination in The True One-Shot Prisoner’s Dilemma)[2], it is based on the assumption that the other participant will think like us once it actually encounters the dilemma.
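To make the "will think like us" assumption concrete, here is a minimal toy sketch (my own illustration, not from either post): if two agents run the *same* deterministic decision procedure on the same sense-data, then each agent's choice logically fixes the other's, which is the core of the coordination argument. The function name and input keys are hypothetical.

```python
# Hypothetical sketch: two agents that share one deterministic decision
# procedure. Neither can causally signal the other, but because the code
# and the inputs are identical, the outputs must coincide.
def decide(sense_data):
    # If the counterpart provably runs this same procedure on the same
    # inputs, choosing "cooperate" here fixes its choice as well.
    return "cooperate" if sense_data["counterpart_runs_same_code"] else "defect"

agent_a = decide({"counterpart_runs_same_code": True})
agent_b = decide({"counterpart_runs_same_code": True})  # same code, same input
print(agent_a, agent_b)  # the two outputs necessarily agree
```

The whole weight of the argument rests on the premise in the comment above: that the other participant really does instantiate the same procedure, which is exactly what is in doubt across universes.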
Additionally, the line about “theorems which say that the more complex minds will always output the same information as the simpler ones, all else (including their inputs, which is to say there sense-data) being equal” reminds me of Yudkowsky’s case against Universally Compelling Arguments.
However, @Wei Dai’s updateless decision theory could end up prescribing various hard-to-endorse acausal deals. See, e.g., his case for the possibility of superastronomical waste.
Unlike the one-shot dilemma, the iterated dilemma likely lets agents coordinate through evolution alone, with no intrinsic reasoning. I have prepared a draft on the issue.
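A toy replicator-dynamics sketch of that claim (my own illustration, not from the draft; payoff values, round count, and strategy names are all assumed): in the iterated dilemma, a conditional cooperator like tit-for-tat can take over the population purely through differential reproduction, with no agent modeling the other.

```python
# Toy sketch: selection alone drives cooperation in the *iterated* dilemma.
R, T, P, S = 3, 5, 1, 0   # standard PD payoffs (assumed values, T > R > P > S)
ROUNDS = 10               # rounds per iterated match

def match_payoff(strategy, opponent):
    """Total payoff to `strategy` over one iterated match vs `opponent`."""
    if strategy == "tit_for_tat" and opponent == "tit_for_tat":
        return ROUNDS * R                # mutual cooperation throughout
    if strategy == "tit_for_tat":        # vs always_defect
        return S + (ROUNDS - 1) * P      # exploited once, then retaliates
    if opponent == "tit_for_tat":        # always_defect vs tit_for_tat
        return T + (ROUNDS - 1) * P      # exploits once, then mutual defection
    return ROUNDS * P                    # mutual defection throughout

x = 0.5  # fraction of the population playing tit_for_tat
for _ in range(50):
    f_tft = (x * match_payoff("tit_for_tat", "tit_for_tat")
             + (1 - x) * match_payoff("tit_for_tat", "always_defect"))
    f_alld = (x * match_payoff("always_defect", "tit_for_tat")
              + (1 - x) * match_payoff("always_defect", "always_defect"))
    x = x * f_tft / (x * f_tft + (1 - x) * f_alld)  # discrete replicator update

print(x)  # approaches 1: cooperation fixates without any agent "reasoning"
```

The point of the sketch is the contrast with the one-shot case: here coordination emerges from repeated interaction and selection pressure, so nothing like acausal reasoning is required.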
Hello again Stanislav, thanks for your comment. “I think that the ability to acausally communicate with other universes is either absent …” On this point, that’s exactly why I made this a question post; I was hoping people would explain why they agreed/disagreed with the notion that acausal communication is possible. I have the same understanding as you of acausal trade. Can you say more about the hypothetical theorems? Why does this remind you of No Universally Compelling Arguments? I have a guess, but I would prefer to know exactly what you mean. (Comment edited for brevity.)