Given your answers to 1-3, you should spend all of your altruistic efforts on mitigating x-risk (unless you’re just trying to feel good, entertain yourself, etc.).
For 4, I shouldn’t have asked you whether you “think” something beats negotiating a positive singularity in terms of x-risk reduction. Better: Is there some other fairly natural class of interventions (or list of potential examples) that, given your credences, has a higher expected value? What might such things be?
For 5-6, perhaps you should think about what such organizations might be. Those interested in convincing XiXiDu might try listing some alternative best x-risk-mitigating groups and providing arguments that they don’t do as well. As for me, my credences are highly unstable in this area, so info is appreciated on my part as well.