We’d want to pick something that would:

- have badness per unit of resources (or opportunity cost) only moderately higher than any actually bad thing according to the surrogate,
- scale like actually bad things according to the surrogate, and
- be extraordinarily unlikely to occur otherwise.

Maybe something like doing some very specific computations, or building very specific objects.