Wiseman: Yes, that’s a possibility. But even if I only gave MWI a, say, 30% probability of being true, the thought of it being even that likely would continue to bother me. To avoid feeling the anguish through that route, I’d need to make myself believe the chance of MWI being true was far lower than what’s rational. In addition to that being against my principles, I’m not sure it would be ethical, either: if MWI really is true, or even if there’s a chance of it being true, then that should influence my behavior somehow, e.g. by avoiding having offspring so there’d at least be fewer sentients around to experience the horror of MWI (not that I’d probably be having kids pre-Singularity anyway, but that was the first example that came to mind; avoiding situations where I’m in a position to harm somebody else would probably also be good).
Thanks for trying to help, though.