I continue to have difficulty accepting that the millionth bit of pi is just as good a random bit source as a coin flip. I am picturing a mathematically inexperienced programmer writing a (pseudo)random bit-generating routine that calculated the millionth digit of pi and returned it. Could they justify their code by pointing out that they don’t know what the millionth digit of pi is, and so they can treat it as a random bit?
I am picturing a mathematically inexperienced programmer writing a (pseudo)random bit-generating routine that calculated the millionth digit of pi and returned it.
I’d be extremely impressed if a mathematically inexperienced programmer could pull off a program that calculated the millionth digit of pi!
Could they justify their code by pointing out that they don’t know what the millionth digit of pi is, and so they can treat it as a random bit?
I say yes (assuming they only plan on treating it as a random bit once!).
Not seriously: http://www.xkcd.com/221/
Seriously: You have no reason to believe that the millionth bit of pi goes one way or the other, so you should assign equal probability to each.
However, just like the xkcd example would work better if the computer actually rolled the die for you every time rather than just returning ‘4’, the ‘millionth bit of pi’ algorithm doesn’t work well because it only generates a random bit once (amongst other practical problems).
In most pseudorandom generators, you can specify a ‘seed’ that determines the entire output sequence; thus you could restart the generator every time with a seed that outputs ‘4’ and get ‘4’ out of it deterministically. This does not undermine its ability to be a random number generator. One common way to seed a random number generator is simply to feed it the current time, since that’s as good as random.
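As a minimal sketch of both seeding behaviors, using Python’s random module (the seed value 42 and the die-roll range are arbitrary choices for illustration):

```python
import random
import time

# A fixed seed pins down the whole output sequence: reseeding with
# the same value reproduces the same 'rolls', like the comic's '4'.
rng = random.Random(42)
first_roll = rng.randint(1, 6)

rng_replay = random.Random(42)
assert rng_replay.randint(1, 6) == first_roll  # same seed, same roll

# Seeding from the current time makes each run's sequence
# practically unpredictable, which is usually what you want.
rng_timed = random.Random(time.time_ns())
print(rng_timed.randint(1, 6))  # a fresh 'roll' on every run
```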
Looking back, I’m not certain if I’ve answered the question.
I think so: I’m inferring from your comment that the principle of indifference is a rationale for treating a deterministic-but-unknown quantity as a random variable. Which I can’t argue with, but it still clashes with my intuition that any casino using the millionth bit of pi as its PRNG should expect to lose a lot of money.
I agree with your point on arbitrary seeding, for whatever it’s worth. Selecting an arbitrary bit of pi at random to use as a random bit amounts to a coin flip.
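For what it’s worth, that coin flip is even implementable: the Bailey–Borwein–Plouffe formula computes an individual hexadecimal digit of pi without computing the earlier ones. Here is a rough sketch (the function names are mine, and float precision limits it to modest digit positions):

```python
import random

def pi_hex_digit(n):
    """n-th hexadecimal digit of pi after the point (n = 0 gives 2,
    since pi = 3.243F6A88... in hex), via the BBP formula."""
    def partial_sum(j):
        # Fractional part of 16^n * sum over k of 16^-k / (8k + j).
        s = 0.0
        for k in range(n + 1):
            s += pow(16, n - k, 8 * k + j) / (8 * k + j)
            s -= int(s)  # keep only the fractional part
        k = n + 1
        while True:  # tail terms, until they are negligible
            term = 16.0 ** (n - k) / (8 * k + j)
            if term < 1e-17:
                break
            s += term
            k += 1
        return s

    x = (4 * partial_sum(1) - 2 * partial_sum(4)
         - partial_sum(5) - partial_sum(6))
    # Reduce to [0, 1), then scale to a single hex digit.
    return int(16 * (x - int(x) + 1) % 16)

# 'Selecting an arbitrary bit of pi at random' as a coin flip:
# pick a random digit position, then a random bit of that digit.
digit = pi_hex_digit(random.randrange(1000))
flip = (digit >> random.randrange(4)) & 1
```

The catch, of course, is the same one discussed above: the randomness comes entirely from choosing the position, not from pi itself.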