My nearby reply is most of my answer here. I know how to tell when reality is off the rails with respect to my model, because my model is made of falsifiable parts. I can even tell you what those parts are, and about the rails I’m expecting reality to stay on.
When I try to cash out your example, “maybe the whole way I’m thinking about bootstrapping isn’t meaningful/useful”, it doesn’t seem like it’s outside my meta-model? I don’t think I have to do anything differently to handle it?
Specifically, my “bootstrapping” concept comes with some concrete pictures of how things go. I currently find the concept “meaningful/useful” because I expect these concrete pictures to be instantiated in reality. (Mostly because I expect reality to admit the “bootstrapping” I’m picturing, and I expect advanced AI to be able to find it.) If reality goes off-my-rails about my concept mattering, it will be because things don’t apply in the way I’m thinking, and there were some other pathways I should have been attending to instead.