the quantum amplitude argument against ethics deduplication

in experience/moral patient deduplication and ethics, i explore whether running the same computation of a moral patient twice counts double, ethically. in all claw, no world, i lay out a view of the cosmos based on time steps in the universal machine, which suggests that duplicated computations do count double, because they occupy twice as many time steps in the universal program.

in this post i make another argument, based on preferring one view of the (probably correct) many-worlds interpretation of quantum mechanics over another.

when coming across the concept of many-worlds, i think people generally assume the view on the left, where new timelines are being created. i think the view on the right, where a constant amount of “reality fluid” or “reality juice” is split into different timelines, is more correct and makes more sense: we wouldn’t expect the amount of “stuff existing” to keep growing exponentially over time. i believe it also maps onto the notion of quantum amplitude.

(where, at a given time, A is the amplitude of a particular timeline and ΣA is the sum of amplitudes across all timelines)

i think the way to view this that makes sense, if one is thinking in terms of discrete computation, is that the universe starts out “computing” the same thing in each of many “threads”, and then, as timelines branch, fractions of these threads start diverging.
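
here’s a minimal toy sketch of that picture (the thread count, labels, and branching fractions are all made up for illustration):

```python
# toy model of the "constant reality fluid" picture: a fixed pool of
# computation threads, which branching events split into diverging
# timelines. all numbers and labels here are made up for illustration.
from collections import Counter

THREADS = 1_000_000  # total "reality fluid", held constant forever

def branch(timelines, label, fraction):
    """at a branching event, `fraction` of each timeline's threads
    diverge into a new branch tagged `label`; the rest keep computing
    the same thing they were computing before."""
    new = Counter()
    for history, threads in timelines.items():
        diverged = int(threads * fraction)
        new[history + (label,)] = diverged
        new[history + ("same",)] = threads - diverged
    return new

timelines = Counter({(): THREADS})  # one timeline, all threads in sync
for step in range(3):
    timelines = branch(timelines, f"split{step}", 0.5)

print(len(timelines), "timelines,", sum(timelines.values()), "threads")
# -> 8 timelines, 1000000 threads: timelines multiply, the fluid doesn't.
```

the point of the toy model is just that the number of distinct timelines grows while the underlying pool of threads never does.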

this also explains what goes on inside a quantum computer: rather than saying that, in the quantum circuit, a bunch of “new” universes are temporarily created and then re-merged, it’s merely that different computation threads are temporarily computing something different instead of the same thing.
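
to make that concrete, here’s the standard textbook two-hadamard example (ordinary quantum mechanics, included only to illustrate the “split then re-merge” reading): the first gate splits the threads into two diverging branches, the second interferes them back into one.

```python
# the textbook two-hadamard example: threads diverge into two branches,
# then interfere back into one. standard quantum mechanics, included
# here only to illustrate the "split then re-merge" reading.
import numpy as np

H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)  # hadamard gate

psi = np.array([1.0, 0.0])  # every thread computing |0>
split = H @ psi             # threads diverge: amplitude 1/sqrt(2) on each branch
merged = H @ split          # the branches interfere and re-merge

print(split)   # [0.7071 0.7071]  two diverged branches
print(merged)  # [1. 0.]          back to one branch; nothing new was created
print(split @ split, merged @ merged)  # total squared amplitude stays ~1.0
```

throughout, the total squared amplitude stays ~1: the fluid is only ever split and re-merged, never created.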

(this entails that “entropy control” cannot work, at least not unless some weird “solomonoff deism” simulation hypothesis that optimizes away redundant computation is true.)

if P≠BQP and the universal program is classical, then it’s weird that we inhabit a quantum world: simulating quantum mechanics classically takes exponentially many time steps, so we should be too far inside the universal computation.
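
(to give a sense of the scale involved, here’s the standard counting argument, with illustrative numbers only: brute-force classical simulation of n qubits tracks 2^n amplitudes at every time step.)

```python
# the standard counting argument: brute-force classical simulation of
# n qubits tracks 2**n complex amplitudes at every time step, so a
# classical universal program reaches quantum worlds exponentially late.
for n in (10, 50, 300):
    print(f"{n} qubits -> {2.0**n:.3e} amplitudes per step")
# 10 qubits  -> ~1.0e+03: trivial
# 50 qubits  -> ~1.1e+15: already a supercomputer-scale state vector
# 300 qubits -> ~2.0e+90: more amplitudes than atoms in the observable universe
```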

if P=BQP or the universal program is quantum, then it makes sense to live in a quantum universe, but:

  • if we adopt the left-side view (more total fluid), then we should observe being at the “end of time”, where there are maximally many timelines: exponentially much of our anthropic juice should sit at maximum quantum entropy, perhaps as boltzmann brains observing anomalously chaotic worlds. and we don’t observe that!

  • if we adopt the right-side view (fluid gets split), then we get back “regular” anthropics, and everything is normal again: our anthropic juice remains roughly the same as we pass branching events/macro-scale decoherence. (a toy calculation contrasting the two views follows after this list.)
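
here’s that toy calculation (one branching event per time step, made-up numbers):

```python
# toy comparison of the two views' anthropic weights (one branching
# event per step, made-up numbers). left view: branching creates new
# fluid, so the total doubles each step. right view: constant fluid.
STEPS = 50

left_total_per_step = [2.0**t for t in range(STEPS)]   # grows exponentially
right_total_per_step = [1.0] * STEPS                   # stays constant

left_share_of_last = left_total_per_step[-1] / sum(left_total_per_step)
right_share_of_last = right_total_per_step[-1] / sum(right_total_per_step)

print(f"left view:  {left_share_of_last:.3f} of all juice at the final step")
print(f"right view: {right_share_of_last:.3f} of all juice at the final step")
# left  -> ~0.500: weight piles up at the "end of time" (which we don't observe)
# right -> 0.020: weight is flat across steps, i.e. "regular" anthropics
```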

(one view that completely circumvents all of this is if P≠BQP and the cosmos is, ultimately, implemented classically, but we still only inhabit quantum worlds: perhaps classical worlds simply don’t exist, or the cosmos really is just our big bang and nothing else. in that case, it could be that the classical program taking exponentially long to compute us, and thus placing us exponentially far into the computation, approximately compensates for the time-step distribution favoring earlier us’s, possibly exponentially much. that’d be really strange, and it feels like we’d be too far in, but i guess it’s possible.)

anyways, what this suggests is that, in the simplest model, the universe runs many computation threads which originally compute the same thing, and some fraction of them sometimes diverge: either to re-merge in local situations like quantum computers or the double-slit experiment, or to decohere the rest of the world and split it more “permanently”.

but more importantly, this suggests that:

  • with regard to intrinsic value (rather than, eg, caring about diversity), duplicating the computation of moral-patient-experience does count as more moral-patient-experience. in deduplication ethics terms, I≈M≈Q≈V.

  • if we could do it, resimulating the earth in order to bring back everyone would come at an ethical cost: we’d be rerunning all of history’s suffering.

  • predictablizing ethics deduplication would be a significant change.

  • with regard to quantum immortality: we mustn’t count on it. the fact that we’re strongly duplicated now makes us-now count a lot more, so losing 99% of our quantum amplitude to AI doom would be very bad: we would actually lose existence juice. on the upside, this also applies to S-risks: it actually helps that they’re small.