Yeah, but what does “purposefully minimize someone else’s utility function” mean? The source code just does stuff. What does it mean for it to be “on purpose”?
I believe “on purpose” in this case means doing something conditional on the other actor’s utility function disvaluing it.
So if you build an interstellar highway through someone’s planet because that is the fastest route, you are not “purposefully minimizing their utility function”, even if they strongly disvalue it. If you build it through their planet only if they disvalue it, and would have built it around the planet if they had disvalued that instead, then you are “purposefully minimizing their utility function”.
If you do it to prevent them from having a planet, or to make them react in some way that is useful to you, and would have done it even if they hadn’t disvalued their planet being destroyed, then you are not “purposefully minimizing their utility function”, I think?
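A minimal sketch of that counterfactual test, assuming we can ask what an agent would do under flipped preferences (the names and the policy interface here are hypothetical, purely for illustration):

```python
from typing import Callable

# A builder policy maps the victim's preference ("disvalues_highway" or
# "disvalues_detour") to an action ("through" or "around").
Policy = Callable[[str], str]

def shortest_route_builder(victim_preference: str) -> str:
    # Ignores the victim entirely: always takes the fastest route.
    return "through"

def spiteful_builder(victim_preference: str) -> str:
    # Conditions on the victim's utility function: builds through the
    # planet exactly because the victim disvalues that.
    return "through" if victim_preference == "disvalues_highway" else "around"

def purposefully_minimizes(policy: Policy) -> bool:
    # The proposed test: does the action change when we counterfactually
    # flip what the other actor disvalues?
    return policy("disvalues_highway") != policy("disvalues_detour")

print(purposefully_minimizes(shortest_route_builder))  # False: not conditional on them
print(purposefully_minimizes(spiteful_builder))        # True: conditional on their disvalue
```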
Let’s talk about a specific example: the Ultimatum Game. According to EY the rational strategy for the responder in the Ultimatum Game is to accept if the split is “fair” and otherwise reject in proportion to how unfair he thinks the split is. But the only reason to reject is to penalize the proposer for proposing an unfair split—which certainly seems to be “doing something conditional on the other actor’s utility function disvaluing it”. So why is the Ultimatum Game considered an “offer” and not a “threat”?
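Concretely, I read the responder strategy as something like the following sketch (the exact rejection schedule is my guess at what “in proportion to how unfair” means; the source doesn’t pin down the numbers):

```python
import random

def responder_accepts(offer_to_me: float, total: float, fair_share: float = 0.5) -> bool:
    """Accept fair splits outright; accept unfair splits only with a
    probability low enough that proposing them doesn't pay. The exact
    schedule below is a guess, not a quoted rule."""
    fair_amount = total * fair_share
    if offer_to_me >= fair_amount:
        return True
    proposer_take = total - offer_to_me
    # Accept with probability fair_amount / proposer_take, so the proposer's
    # expected take from an unfair split never exceeds the fair amount.
    return random.random() < fair_amount / proposer_take

# Offering me $4 of $10: accepted ~5/6 of the time, so the proposer
# expects (5/6) * $6 = $5, no better than offering the fair split.
```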
Good question.
I can’t tell whether saying that you will reject unfair splits would be a threat by the definition in my above comment. For it to be a threat, you would have to do it only if the other person cares about the thing being split. But in the Ultimatum Game both players by definition care about it, so I have a hard time thinking about what you would do if someone offered you an unfair split of something they don’t care about (how can a split even be unfair if only one person values the thing being split?).
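To make the confusion concrete, here is the counterfactual test from my sketch above applied to the responder (hypothetical code; the `proposer_cares` flag is exactly the part I can’t make sense of):

```python
def responder_policy(offer_to_me: float, proposer_cares: bool) -> str:
    # The rejection rule depends only on the numbers of the split; it
    # never inspects whether the proposer cares about the outcome.
    fair_amount = 5.0
    return "accept" if offer_to_me >= fair_amount else "reject_probabilistically"

def is_threat(policy) -> bool:
    # Counterfactual test from earlier: a threat acts differently depending
    # on whether the other party disvalues the action.
    return policy(4.0, proposer_cares=True) != policy(4.0, proposer_cares=False)

print(is_threat(responder_policy))  # False by the letter of the definition,
# but the counterfactual branch is vacuous: an Ultimatum Game where the
# proposer doesn't care about the split isn't really an Ultimatum Game.
```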