Ben, my FAI-coding instincts at the time were pretty lousy. The concept does not appeal to my modern instinct; and calling the instinct I had back then an “FAI-coding” one is praising it too highly.
Tyrrell, the distinction to which I refer is the role that “Because I like walnuts over almonds” plays in my justification for choosing rocky road, and presumably in your motive for convincing me thereof if you’re an altruist. We can see the presence of this implicit justification, whether or not it is mentioned, by asking the following moral question: “If-counterfactual I came to personally prefer almonds over walnuts, would it be right for me to choose praline over rocky road?” The answer, “Yes”, reveals that there is an explicit, quoted, justificational dependency, in the moral computation, on my state of mind and preference.
This is not to be confused with a physical causal dependency of my output on my brain, which always exists, even for the calculator that asks only “What is 2 + 3?” The calculator’s output depends on its transistors, but it has not asked a personal-preference-dependent question.