That doesn’t answer the question “Is a fœtus a person?”; it just supplies a definition of “person”, which may or may not be relevant to any given query.
Suppose my real query is “Can a fœtus talk?” Now, just because I choose to define “person” in such a way that most “persons” can talk, and in such a way that a fœtus counts as a “person”, that doesn’t make the probability that a fœtus can talk any different from what it would be had I defined “person” differently.
The whole point of these examples of disguised queries is that if you find yourself trying to answer them, you’re doing it wrong.
Suppose we call the horse’s tail a leg.
For a while this confused me, because I had misidentified which part of Eliezer’s argument I thought was wrong.
Suppose I were to make all those observations suggesting that combining two objects with two objects produced three objects. I would not conclude that 2+2=3; rather, I would conclude that objects were not modelled by Peano Arithmetic. (This much has been said by other commenters.) But then I only ‘know’ Peano Arithmetic through the (physical) operation of my own brain.
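To make the distinction concrete: a physical “combining” operation is under no obligation to behave like Peano addition. Set union over overlapping collections, for instance, is a perfectly lawful operation under which two things combined with two things yield three things. A minimal sketch (the earplug names are my own illustration, borrowed from Eliezer’s example):

```python
# A "combining" operation that a Peano-addition model would get wrong:
# union of overlapping collections makes 2 and 2 come out to 3.
a = {"earplug_1", "earplug_2"}
b = {"earplug_2", "earplug_3"}   # shares one member with a

combined = a | b                  # set union, not numeral addition

assert len(a) == 2 and len(b) == 2
assert len(combined) == 3         # the model (union) is at fault, not arithmetic
```

Observing this would tell me something about how these objects combine; it would tell me nothing about the theorems of Peano Arithmetic.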
Here’s how to convince me that 2+2=3. Suppose I look at the proof from (Peano axioms) to (2+2=4), and suddenly notice that an inference has been made that doesn’t follow from the inference rules (say, I notice that the proof says a + (b⁺) = (a+b)⁺ and I know full well that the correct rule is (a⁺)+(b⁺)=(a+b)⁺). I correct this ‘error’, follow through to the end of the proof, and conclude that 2+2=3. What do I do? I consider that this observation is more likely if 2+2=3 than if 2+2=4, and so I update on it. It’s still more likely that 2+2=4, because it’s more likely that I’ve made an error this time than that everyone who has analysed the proof before me made an error (or rather, that I have not heard of anyone else spotting it). But clearly there is something to update on, so my prior probability that 2+2=3 is not zero.

However, I also maintain that if in fact the proof of 2+2=4 is correct, then it remains correct whether or not I am convinced of it, whether or not I exist, and even whether or not physical reality exists. So it is a priori true, but my knowledge of it is not a priori knowledge (because the latter does not exist).
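For concreteness, the genuine recursion rule a + (b⁺) = (a+b)⁺ (the very step the hypothetical proof-reader ‘corrects’) is what drives the standard derivation of 2+2=4. A minimal sketch, with numerals represented as nested successor applications (the encoding is my own choice, not part of any particular proof):

```python
# Peano numerals as nested tuples: ZERO = (), S(n) = (n,)
def S(n):
    """Successor: S(n) stands for n + 1."""
    return (n,)

ZERO = ()
TWO = S(S(ZERO))
FOUR = S(S(S(S(ZERO))))

def add(a, b):
    # Recursion on the second argument, per the Peano definition:
    #   a + 0    = a
    #   a + S(b) = S(a + b)
    if b == ZERO:
        return a
    (b_pred,) = b
    return S(add(a, b_pred))

assert add(TWO, TWO) == FOUR   # 2 + 2 = 4 follows from just these two rules
```

Finding a genuine flaw in a derivation this short is exactly the sort of observation that would be strong evidence; its absence, after so many eyes, is why the prior on 2+2=3, while nonzero, stays tiny.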
I think this is what Eliezer was trying to say with “Unconditional facts are not the same as unconditional beliefs”, but it seems to be glossed over and almost lost amid the confusion about earplugs. The article’s failure to distinguish between a mathematical theory and a mathematical model (map and territory, possibly?) came very close to obscuring the actual point. The article does not explain how to convince Eliezer that 2+2=3; it explains how to convince Eliezer that PA does not model earplugs, and since the latter is not an a priori truth, it is much less interesting that knowledge of it is not a priori either.