Which existential risk do you judge most likely to occur this century?
Nuclear holocaust
Badly programmed superintelligence
Genetically engineered biological agent
Accidental misuse of nanotechnology (“gray goo”)
Environmental catastrophe (e.g. runaway global warming)
etc
Why not ask for probabilities for each, and confidence intervals as well?
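To make the suggestion concrete: if each respondent gave a probability for each risk, the pooled answers could be summarized with a confidence interval across respondents. A minimal sketch in Python, using made-up illustrative numbers and a simple percentile bootstrap (none of this is from the survey itself):

```python
import random
import statistics

# Hypothetical elicited probabilities (0-1 scale) from ten respondents
# for a single risk, e.g. "nuclear holocaust this century".
# Illustrative numbers only, not real survey data.
responses = [0.05, 0.10, 0.02, 0.30, 0.15, 0.08, 0.20, 0.01, 0.12, 0.07]

def bootstrap_ci(data, stat=statistics.median, n_resamples=10_000, alpha=0.05):
    """Percentile-bootstrap confidence interval for a summary statistic."""
    resampled = sorted(
        stat(random.choices(data, k=len(data)))  # resample with replacement
        for _ in range(n_resamples)
    )
    lo = resampled[int((alpha / 2) * n_resamples)]
    hi = resampled[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

point = statistics.median(responses)
low, high = bootstrap_ci(responses)
print(f"Median elicited probability: {point:.2f}; 95% CI: [{low:.2f}, {high:.2f}]")
```

The median is used rather than the mean because elicited probabilities tend to be skewed by a few extreme answers; any summary statistic could be dropped in via the `stat` parameter.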
Related, but different: Which of these world-saving causes should receive the most attention? (Maybe place these in order.)
Avoiding nuclear war
Create a Friendly AI, including preventing the creation of AIs you don’t think are Friendly
Create AI, without requiring it to be Friendly
Prevent creation of AIs until humans are a lot smarter
Improve human cognition (should this include uploading capabilities?)
Defense against biological agents
Delay nanotechnology development until we have sufficiently powerful AIs to set up defenses against gray goo
Creation and deployment of anti-gray-goo nanotechnology
Avoiding environmental hazards
Space colonization
Fighting diseases
Fighting aging
Something else?
“Most attention” is ambiguous, particularly when some of the options are phrased as proactive and others as reactive/preventative. Do you mean funding? Public awareness? Plus, some issues might be incredibly important but require relatively little “attention” to solve, while others might be less important but take far more resources to solve. I wouldn’t know how to answer this question except to say that I don’t think any effort should be spent on creating and deploying anti-gray-goo nanotech.
You must also ask country of residence for this to be valid.
I think we mean here by existential risks something along the lines of, in Bostrom’s words, risks that would “…either annihilate Earth-originating intelligent life or drastically and permanently curtail its potential”, making countries irrelevant.
Oops, I misread “century” as “country”.