I’ve been trying to think of goals that could be aligned to that don’t collapse into replicating the same thing over and over, à la smiley faces.
One thing I found is that certain goals are self-limiting in their implications. Take robustness. It aims at withstanding shocks and outside influence, but one of the core ways to achieve that is to have many different things that react differently to shocks and support each other. So a desire for diversity is built into robustness. Things designed by a single entity for robustness might be less robust than many things designed by multiple entities with different goals, so a form of goal pluralism might be built in.
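The diversity-implies-robustness intuition can be made concrete with a toy Monte Carlo sketch (all names and numbers here are my own assumptions, not anything rigorous): compare a "monoculture" where every unit shares the same vulnerability against a diverse population where each unit is vulnerable to a different kind of shock. Both survive the same fraction of units on average, but the monoculture's worst case is total collapse.

```python
import random

random.seed(0)

def surviving_fraction(vulnerabilities, shock):
    # A unit survives a shock unless the shock matches its vulnerability.
    return sum(1 for v in vulnerabilities if v != shock) / len(vulnerabilities)

N_UNITS = 10
N_TRIALS = 10_000
SHOCK_TYPES = list(range(10))

# Monoculture: every unit shares the same vulnerability (type 0).
mono = [0] * N_UNITS
# Diverse: each unit is vulnerable to a different shock type.
diverse = list(range(N_UNITS))

def simulate(population):
    # Average surviving fraction across random shocks, plus the worst trial.
    fractions = [surviving_fraction(population, random.choice(SHOCK_TYPES))
                 for _ in range(N_TRIALS)]
    return sum(fractions) / N_TRIALS, min(fractions)

mono_avg, mono_worst = simulate(mono)
div_avg, div_worst = simulate(diverse)

print(f"monoculture: avg survival {mono_avg:.2f}, worst trial {mono_worst:.2f}")
print(f"diverse:     avg survival {div_avg:.2f}, worst trial {div_worst:.2f}")
```

In this toy model both populations average about 90% survival, but whenever the shock happens to hit the monoculture's shared weakness, everything fails at once, while the diverse population never loses more than one unit per shock. That's the sense in which optimizing hard for robustness seems to push toward variety rather than toward copies of one design.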
Does anyone know of any other interesting goals that have this kind of structure?