In situations with multiple copies of an agent, or under some sort of anthropic uncertainty, asking for the probability of being in a certain situation is misleading. The morally real thing is the probability of those situations/worlds themselves (which acts as a degree of caring), not the probability of being in them. And even that probability depends on your decisions, to the point where your decisions in some situations can make those situations impossible.
I think the best argument that we are in a simulation is mine: https://link.springer.com/article/10.1007/s00146-015-0620-9