Anyone who makes moral judgements has a Real Moral Something.
But suppose there’s no human-manageable way of predicting your judgements; nothing any simpler or more efficient than presenting them to your brain and seeing what it does. You might not want to call that a system.
And suppose that for some questions, you don’t have an immediate answer, and what answer you end up with depends on irrelevant-seeming details: if we were somehow able to rerun your experience from now to when we ask you the question and you decide on an answer, we would get different answers on different reruns. (This might be difficult to discover, of course.) In that case, you might not want to say that you have a real opinion on those questions, even though it’s possible to induce you to state one.
I’m not sure what it would even mean to not have a Real Moral System. The actual moral judgements must come from somewhere.
A high-Kolmogorov-complexity system is still a system.