Let me second Psycho’s question. While I personally, with my current decision-making algorithm, would just offer $225,005 to another wedrifid-like entity and reject anything worse, I am not sure what the formal literature concludes on the subject. I’m sure plenty of intelligent theorists have tackled the question right down to the ground.
EDIT: Assuming linear utility for $, obviously.