What’s the prior probability? Probably somewhere above 1/3^^^^3.
Then you’d have to give money to a random person who didn’t even Pascal-mug you, too.
I don’t see why it would be somewhere above 1/3^^^^3. The f(N) has to sum to 1 (or less than 1). In any case, IMO the issue with Pascal’s mugging is that when we are told something, we automatically assign it a much, much higher probability than we assigned before, as a sort of reflex.
For someone who has never been told of Pascal’s mugging before, f(N) is probably really small: something along the lines of the whole of it summing to zero, as evidenced by them not shouting ‘take my money, please, and don’t torture 3^^^^3 people’ at random strangers. Then, when we are told something, we have to assign some nonzero prior probability, and there we fail, because there may not be enough information for a sensible prior.
I don’t see why it would be somewhere above 1/3^^^^3.
Because it’s a round number. The universe is essentially a program. Just as a much shorter program could be written to output 3^^^^3 than to output a similarly large arbitrary number, a much simpler universe could kill 3^^^^3 people than a similarly large number.
If I typed a computer program at random, and it was long enough that it could output 3^^^^3, and I calculated the expected output given that it actually stops, then it would be dominated by such large numbers.
If he instead threatens you with a random number of similar magnitude, it works out differently. In that case the prior is crazy low, but he also has a similarly low chance of naming that exact number, so the probability goes way up.
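The “short program” point can be made concrete. Here is a minimal sketch of Knuth’s up-arrow recursion (under which 3^^^^3 is `up(3, 4, 3)`): the whole definition fits in a few lines, while a “typical” number of that magnitude would take astronomically many bits just to write down.

```python
def up(a, n, b):
    """Knuth's up-arrow: up(a, n, b) = a followed by n arrows, then b."""
    if n == 1:
        return a ** b          # one arrow is ordinary exponentiation
    if b == 0:
        return 1               # base case of the recursion
    return up(a, n - 1, up(a, n, b - 1))

# These few lines already *define* 3^^^^3 as up(3, 4, 3); actually
# evaluating it is hopeless, but small cases show the recursion works:
print(up(3, 1, 3))   # 3^3 = 27
print(up(3, 2, 3))   # 3^^3 = 3^27 = 7625597484987
```

The Kolmogorov-complexity point is just that this program is tiny, so the universal prior assigns numbers like 3^^^^3 far more probability than a random number of comparable size.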
Then you’d have to give money to a random person who didn’t even Pascal-mug you, too.
Before it gets multiplied, it’s about as likely that he would kill the people whether or not I give him money.
It’s not exactly as likely, though, so my choices would still be dominated by things like this. Except that, as I already stated, I have the even bigger problem of not even being able to calculate the expected value.
Eliezer didn’t tell us this with the intention of convincing us to submit to muggers. We all know it gives insane results. The problem is coming up with and justifying a decision method that doesn’t give insane results for this reason.
For someone who has never been told of Pascal’s mugging before, f(N) is probably really small: something along the lines of the whole of it summing to zero, as evidenced by them not shouting ‘take my money, please, and don’t torture 3^^^^3 people’ at random strangers.
Perhaps they’re not acting rationally. For one thing, they’re not logically omniscient. Maybe they didn’t think of it.
Because it’s a round number. The universe is essentially a program. Just as a much shorter program could be written to output 3^^^^3 than to output a similarly large arbitrary number, a much simpler universe could kill 3^^^^3 people than a similarly large number.
I doubt it. Let’s suppose our universe is close to being the simplest universe. How likely is it that our universe tortures a ‘round number’ of beings? The number of beings that our universe (or quantum many-worlds universe) tortures can probably only be encoded in a short way by simulating the universe and counting; that’s probably the roundest way to express it. You would need some very bizarre and complex laws of physics to make the universe produce a number of tortured beings that is round in more than one way.
Meanwhile, a made-up number (or a number that is the product of faulty reasoning) is much more likely to be round.
You would need some very bizarre and complex laws of physics
Very bizarre compared to what we have, but not very bizarre compared to how big a number 3^^^^3 is. Once you have those basic laws, you can make the number as big as you want.
The mugger may have the ability to kill an arbitrary number of people. If he does, he can kill 3^^^^3 people as easily as he can express that number. Him killing that many people and him merely making that number up will be similar in likelihood.
The point is that our universe already encodes some specific number of beings that are tortured, without stacking on any extra laws. That specific number would look utterly random to anyone who isn’t simulating the universe. Furthermore, here is the issue with informal use of complexities: consider a simple, short program that iterates over every program and runs each for 3^^^^3 steps. Somewhere along the road, this includes stuff that you would deem to have high complexity.
The point is that our universe already encodes some specific number of beings that are tortured, without stacking on any extra laws.
It doesn’t matter what our universe does. What matters is what some universe does that has a probability well over 1/3^^^^3. Stacking on extra laws can get you a universe like that without lowering its probability very much.
I’m getting sick of the informal use of complexities. Indexing 3^^^^3 beings in the universe (I mean, somehow listing their addresses) can have complexity greater than 3^^^^3. If you don’t care about the complexity of indexing, then all Kolmogorov complexities greater than that of a program which iterates through all programs and runs each for an infinite number of steps (ha, you can do that if you choose the right language) are equal. That short program produces all the universes, and all the beings, and everything.
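That “iterate over every program” idea is the classic dovetailer. Here is a toy sketch, with a hand-picked list of Python generators standing in for the enumeration of all programs (an assumption purely for illustration; a real dovetailer would enumerate program codes):

```python
def dovetail(programs, rounds):
    """Run each 'program' (a generator) one step per round, round-robin,
    so a non-halting program cannot block the others."""
    outputs = {i: [] for i in range(len(programs))}
    active = dict(enumerate(programs))
    for _ in range(rounds):
        for i in list(active):
            try:
                outputs[i].append(next(active[i]))
            except StopIteration:
                del active[i]   # this program halted
    return outputs

def halts_quickly():
    yield from range(3)          # a program that halts after 3 steps

def runs_forever():
    n = 0
    while True:                  # a program that never halts
        yield n
        n += 1

out = dovetail([halts_quickly(), runs_forever()], rounds=5)
print(out)  # {0: [0, 1, 2], 1: [0, 1, 2, 3, 4]}
```

The complexity point stands: this driver is short no matter how complex the individual programs it eventually reaches are, which is why “it appears somewhere inside a short program” doesn’t make a specific output simple to single out.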
Suppose the universe allows for hypercomputers (presumably this has a finite likelihood, and one that won’t be proportional to whatever number someone puts into the hypercomputer later on), but building one is hard enough that it doesn’t happen naturally. At some point a sapient species evolves, and a member builds a hypercomputer. He simulates a universe on it, in a program called the Matrix. At some point, just for kicks, he contacts someone inside the Matrix and threatens to use his powers from outside the Matrix to kill 3^^^^3 (a number easy to make up) people if they don’t give him five dollars. If they don’t, he writes a program that can create people and sets it to randomly create and kill 3^^^^3 of them.
Each step of this is unlikely. The unlikelihood multiplies with each successive step. At no point does it even vaguely begin to approach 1/3^^^^3.
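The multiplication behind that claim can be sketched numerically. The step probabilities below are made up purely for illustration; the point is that even a long chain of independent, very unlikely steps yields a probability astronomically larger than 1/3^^^^3.

```python
from fractions import Fraction

# Ten independent steps, each (generously) given a one-in-a-million
# chance. These values are invented solely to illustrate the arithmetic.
steps = [Fraction(1, 10**6)] * 10

prior = Fraction(1, 1)
for p in steps:
    prior *= p   # independent steps multiply

print(prior)  # 1/10^60 -- tiny, yet 1/3^^^^3 is unimaginably smaller still
```

Exact rationals are used so the product doesn’t underflow to zero; 10^-60 is already far below anything observable, and it still towers over 1/3^^^^3.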