Pretty sure the unpacking goes like: “I think it is 20% likely that the ‘true’ moral theory (interpreting ‘true’ as ‘what I would agree on after perfect information and time to grow and reflect’) is one in which hurting cows is as morally bad as hurting humans.”
Right, sure. But does it not follow that, if you average over all possible worlds, 5 cows have the same moral worth as 1 human?
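(Taking expected moral worth, and assuming the remaining 80% of probability mass goes to theories where cows have roughly zero worth, which isn’t stated above: a cow is worth 0.2 × 1 + 0.8 × 0 = 0.2 humans in expectation, so 1 / 0.2 = 5 cows per human.)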