I think they’re a little different—ontological crises can (I think) be resolved naturally if an agent keeps a bunch of labeled data (or a labeled-data equivalent) around to define things by. But out-of-environment behavior can reflect fundamental limits on extrapolation, and the only solution to those is more data, not better agents.
Which is to say, in the case of an ontological crisis I don’t agree that the regular features are missing—they’re just computed differently than before.