Unbounded Intelligence Lottery

Suppose you’re offered a free ticket for the following lottery: a probability *p* of being uploaded onto a perfect platonic Turing machine (with the understanding that you’ll have full control over the course of computation and the ability to self-modify) and a probability 1 − *p* of dying immediately. Assume that if you do not participate in the lottery, you will never again have a chance to be uploaded onto a perfect platonic Turing machine. What is the smallest value of *p*, if any, at which you’ll participate in the lottery?
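Under a simple expected-utility framing (a sketch of my own, with made-up utility numbers, not anything from the post), you should participate exactly when *p*·U(upload) + (1 − *p*)·U(death) exceeds U(status quo), which gives a threshold probability. The sketch below computes that threshold; note that as U(upload) grows without bound, the threshold tends to zero:

```python
def min_acceptable_p(u_upload: float, u_death: float, u_status_quo: float) -> float:
    """Smallest p at which taking the lottery weakly beats declining.

    Derived from: p * u_upload + (1 - p) * u_death >= u_status_quo.
    All utility values here are hypothetical placeholders.
    """
    return (u_status_quo - u_death) / (u_upload - u_death)

# Finite example: upload worth 1e6 utils, death 0, status quo 100.
print(min_acceptable_p(1e6, 0.0, 100.0))  # 0.0001

# As the upload's utility grows without bound, the threshold shrinks toward 0,
# suggesting an unbounded-utility agent accepts the lottery at any p > 0.
for u in (1e6, 1e9, 1e12):
    print(min_acceptable_p(u, 0.0, 100.0))
```

The interesting case is precisely when U(upload) is treated as unbounded: then no positive *p* is too small, which is what makes the question bite.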

Why is this interesting?

An intelligence embedded in a perfect platonic Turing machine would be able to expand and improve itself indefinitely (and arbitrarily quickly, subjectively), without ever running into physical limitations. It could think any computable thought in a subjective instant. It could spend all the steps and memory it wants on simulating fun experiences. It could simulate our universe (?) in order to upload all other humans who have ever lived or will ever live. It could do the same for any sentient aliens. Would this be infinitely better than living for a billion years?