Are these things under-defined, or “just” outside of our heuristic/intuition domain? I’d argue that reality is perfectly defined—any obtained configuration of the universe is a very precise thing. And this extends to obtainable/potential configurations (where there’s uncertainty over which will actually occur, but we have no modeled data that contradicts the possibility).
So the situation is defined; what part isn't? I think it's our moral intuition about whether a given situation is "better" than another. I don't have a solution to this, but I want to be clear that the root problem is trying to "align" to non-universal, often unknown, preferences.
It is indeed our moral intuitions that are underdefined, not the states of the universe.