I had a hard time understanding the metaphor, and still do, so I think it's a valid complaint. Additionally, I think the term "dark side" causes more confusion than it resolves.
When I was 10 years old, I estimated P(my birth religion's god) at 1/10k. That number has since gone down (to 1/billion), and it touched 0 at one point because I started believing it DIDN'T make sense to assign probabilities to "impossible" things. P(god) should be higher, since there is either a god or laws of physics. I don't believe a god has much incentive to simulate, or to do anything (intuitively, a god just does not make sense). So I'd say 1/1000
Psychic: 1/1k
Global warming: I don't understand the subject AT ALL, so I'm going to go with the average here (80%)
Pandemic: There is a chance the population will grow (shrinking the relative meaning of 1 billion), and an engineered pandemic seems likely, so 40%
Mars: 2050 is a bit soon, so 30%
AI: Seems VERY likely, 90%.
You are assuming a simulation does not want to die, and this is unclear. The fact that $100 is better than $0 is taken as an axiom because it is part of the problem statement. However, "death is worse than life (for a simulation!)" is not trivial. "Rationality should not be done for rationality's sake, or it ends in a loop" — that is why posts use money as the thing the rational agents want. You have to assign a financial value to life before saying it is less valuable than $100
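The point above can be made concrete with a minimal sketch: the comparison "take $100 vs. keep existing" is only well-defined once the simulation's continued existence is priced in the same currency as the offer. The function name and all numbers here are hypothetical, purely for illustration.

```python
# Hypothetical sketch: the $100-vs-death choice depends entirely on an
# assumed dollar value assigned to the simulation's continued existence.

def prefers_money(value_of_life: float, offer: float = 100.0) -> bool:
    """True if taking the offer beats continued existence, given an
    assumed dollar value for the simulation's life."""
    return offer > value_of_life

# If the simulation prices its life at $0, taking $100 trivially wins;
# at any price above $100, the supposed "axiom" flips.
print(prefers_money(0.0))   # True
print(prefers_money(1e6))   # False
```

Until `value_of_life` is specified, the claim "the simulation should pay to avoid death" has no truth value — which is exactly the gap the comment points at.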
Perhaps I do not understand the meaning of "infohazard," but in this scenario it seems like what you are trying to avoid is not the information itself, but rather player B knowing that you have the information. I think this can be fixed if you replace player B with one of the "Omegas" who can predict you; then the information itself may be seen as harmful.
I think "[insert negative trait here] makes us human" is more general.