Does a purely rational mind value life less or more?
Specifying that a mind is rational does not specify how much it values life.
That is correct, but it is also probably the case that a rational mind would propagate better from its other values to the value of its own life. For instance, if your arm is trapped under a boulder, a human as-is would either be unable to cut off their own arm, or would do it at a suboptimal time (too late), compared to an agent that can propagate everything it values in the world into the value of its own life, and have that large value win against the pain. Furthermore, such an agent would correctly propagate the later pain (assuming it knows it will eventually have to cut off its own arm) into the decision now. So it would act as if it values life more and pain less.
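The propagation argument can be put as a toy expected-utility calculation. A minimal sketch, with purely illustrative numbers and a hypothetical utility function (none of this comes from the discussion above, it just makes the "have that large value win against the pain" step explicit):

```python
# Toy model: an idealized agent weighing "cut the arm off now" against
# "wait and cut it off later", propagating future pain and survival odds
# back into the present decision.

def expected_utility(survival_prob, value_of_life, pain_cost):
    # Utility = probability of surviving times what survival is worth,
    # minus the pain incurred along the way. All quantities are
    # illustrative assumptions, not anything from the original argument.
    return survival_prob * value_of_life - pain_cost

# Waiting lowers survival odds and adds the pain of the trapped arm
# on top of the eventual amputation pain.
cut_now   = expected_utility(survival_prob=0.95, value_of_life=1000.0,
                             pain_cost=50.0)
cut_later = expected_utility(survival_prob=0.60, value_of_life=1000.0,
                             pain_cost=50.0 + 30.0)

best = "cut now" if cut_now > cut_later else "cut later"
```

Under these assumed numbers the huge value of everything downstream of staying alive swamps the amputation pain, so the idealized agent cuts immediately, which is exactly the "acts as if it values life more and pain less" behavior.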
Well, one has to distinguish between a purely rational agent that has the full set of possible propositions with correct probabilities assigned to them, and a bounded agent that has only a partial set of possible propositions, generated gradually by exploring outward from some of the most probable ones. The latter can't even do Bayesian inference properly, due to the complex non-linear feedback through the proposition-generation process, and is not omniscient enough to foresee things as well as your argument requires.