You can, of course, define probability in a way that doesn’t refer to any specific decision theory, making it “independent” of decision theories. But probability is useful precisely as half of a decision theory: you add the “utility” ingredient to get correct decisions out. This breaks down where indexical uncertainty or mind copying are involved, because the “probabilities” you get in those situations (defined so that the resulting decisions are the ones you’d prefer, as in the justification of probability by bets) depend on your preferences more than they normally do. In simpler situations, maximum entropy at least takes care of outcomes you don’t terminally distinguish in your values, in a way that is independent of further details of your values.
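A minimal sketch of that last point (my own illustration, with hypothetical outcomes and utilities, not anything from the comment): among outcomes that your utility function does not distinguish, the maximum-entropy distribution is uniform, and expected utility, hence the decision, comes out the same no matter how probability is split among them.

```python
# Toy model: outcomes a1, a2, a3 are not terminally distinguished
# by the utility function; outcome b is distinct.
utility = {"a1": 5.0, "a2": 5.0, "a3": 5.0, "b": 2.0}

def expected_utility(probs):
    """Expected utility of a probability distribution over outcomes."""
    return sum(probs[o] * utility[o] for o in probs)

# Maximum entropy given P(b) = 0.4: the remaining 0.6 is spread
# uniformly over the indistinguishable outcomes.
maxent = {"a1": 0.2, "a2": 0.2, "a3": 0.2, "b": 0.4}

# Any other split among a1..a3 yields the same expected utility,
# because utility does not distinguish them.
skewed = {"a1": 0.5, "a2": 0.1, "a3": 0.0, "b": 0.4}

assert abs(expected_utility(maxent) - expected_utility(skewed)) < 1e-12
```

So within the class of outcomes your values treat as equivalent, the uniform (maximum-entropy) assignment is a safe default: nothing about the resulting decision hinges on the finer details of how you value them.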