"Maximal overall utility is better than minimal overall utility." Not sure what that means. The NPCs in this simulation don't have "utility". The real humans in the secret prison do.
This should have been clearer. We meant this in Bentham's classic sense: minimal pain and maximal pleasure. Intuitively, a world with a lot of pleasure (in the long run) is better than a world with a lot of pain. You don't need to agree with the claim itself; you just need to agree that it is worth considering. On our interpretation, however, the orthogonality thesis says that one cannot consider this.
"but on our interpretation the orthogonality thesis says that one cannot consider this"
The orthogonality thesis doesn't claim that agents are unable to consider various propositions. Agents can consider whatever propositions they like; that doesn't mean they'll be moved by them.