I’m getting sick of the informal use of complexities. Indexing 3^^^^3 beings in a universe (I mean, somehow listing their addresses) can have complexity greater than 3^^^^3. And if you don’t care about the complexity of indexing, then all Kolmogorov complexities greater than that of a program which iterates through all programs and, by dovetailing, runs each of them for an unbounded number of steps (ha, you can do that if you choose the right language) are effectively equal. That short program produces all the universes, and all the beings, and everything.
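To be concrete about the “run everything” trick: here is a minimal dovetailing sketch in Python. The run_steps function is a toy stand-in for “run program number i on some universal machine for n steps” (the argument doesn’t depend on which machine you pick), so the “programs” here are just counters:

```python
from itertools import count

def run_steps(program_index, n_steps):
    # Toy stand-in for "run program number `program_index` on a universal
    # machine for `n_steps` steps". Here each "program" just counts,
    # so the dovetailer below is actually runnable as a demo.
    return list(range(program_index, program_index + n_steps))

def dovetail():
    # Classic dovetailing: at stage n, run programs 0..n-1 for n steps each.
    # Every program eventually gets an arbitrarily large step budget, so this
    # one short loop "runs everything" in the limit, which is why the
    # enumerating program itself has a tiny Kolmogorov complexity.
    for n in count(1):
        for i in range(n):
            yield i, run_steps(i, n)

demo = dovetail()
for _ in range(10):
    print(next(demo))
```

The point of the sketch is only that the loop structure is short; all the apparent complexity of what gets produced lives in the indexing, not in the program.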
Suppose the universe allows for hypercomputers (presumably this has a finite likelihood, and one that isn’t proportional to whatever number someone plugs into the threat later on), but building one is hard enough that it doesn’t happen naturally. At some point, a sapient species evolves, and a member builds a hypercomputer. He simulates a universe on it, in a program called the Matrix. At some point, just for kicks, he contacts someone inside the Matrix and threatens to use his powers from outside the Matrix to kill 3^^^^3 (an easy number to make up) people if they don’t give him five dollars. If they refuse, he writes a program that can create people, and sets it to randomly create and then kill 3^^^^3 of them.
Each step of this is unlikely, and the improbability compounds with each successive step. But at no point does the combined probability even vaguely begin to approach 1/3^^^^3.
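To put rough numbers on that, with made-up placeholder probabilities for the steps (chosen only for scale), the product stays unimaginably far above 1/3^^^^3:

```python
# Made-up placeholder probabilities for the steps above, chosen only for scale.
# Work in log10 space, since the real magnitudes dwarf any float.
step_log10_probs = [-10, -30, -100, -100]   # i.e. 10^-10, 10^-30, 10^-100, 10^-100

total = sum(step_log10_probs)               # log10 of the product of the steps
print(f"combined probability ~ 10^{total}") # ~ 10^-240

# 1/3^^^^3 is roughly 10^-N where N is itself a tower of exponents of height
# about 3^^^3. An exponent like -240, or even -240 trillion, is nowhere near
# that, so the product never even vaguely approaches 1/3^^^^3.
```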