There are limits even with omniscience: instabilities resolved by quantum mechanics will go both ways, and predictably so, but you aren't going to observe both ways. I wouldn't be surprised if our brains had sufficient instability that quantum mechanics forbids a strong assertion of macro-state after an interval as short as seconds. I'd be surprised if the prediction horizon were as short as milliseconds or as long as hours.
So… we can’t demand that much rent, anyway.
And of course, making beliefs pay rent doesn't necessarily make sense in this context. How much use can I get out of knowing that when I was 11 I doodled a pulley system for a platformer game? I can't remember much else about it, and even if I still had the doodle there's no chance I'd build the game or even offer it to someone who was building one. Yet I believe it to be true, because I remember it. I take it you're not telling me I shouldn't believe it just because it isn't useful, yet that is exactly what making beliefs pay rent is about (edited to clarify: the question is whether you should believe it, not whether it's useful).
Being useful is a different issue: it feeds into the question of which beliefs you make an effort to discriminate the truth of. If I came to doubt whether I had in fact designed that pulley system, I would spend approximately 0.00 J of effort attempting to determine whether it's true, even though it's possible I have the notebook around somewhere.
You’re right about omniscience. Maybe I shouldn’t have mentioned that?
Your pulley system belief isn’t really paying rent by any definition. I don’t see how demanding algorithms from your beliefs changes that.
The point wasn’t that we should not believe things that don’t pay rent (that’s quite hard). It was that we should not stop at simply having a correct formulation of the behavior of something. We should take it as far into the realm of useful algorithms as possible before we say “ok, this belief is paying rent”.
The pulley system belief does pay its rent quite easily: it explains my having a memory of doing so. Any other theory requires additional machinery that contains a lot more information than the theory that I simply did it and happened to remember doing it.
That’s what rent is about. It’s not a rent of utility, but of credence.