Eliezer is very much against the idea of supporting MIRI based on a “low probability of really high impact” argument.
I hate to put words in his mouth, but I think
he means 0.0001% chance, not 10% chance. 10% is well within the range of probabilities humans can reason about (to the extent that humans can reason about any probabilities).
Eliezer thinks the case for MIRI does not depend on very small chances, and furthermore, is sceptical that such arguments are used in practice by x-risk organisations, etc. He wouldn’t necessarily turn away someone’s money who said “I’m donating because of a 10^-10 chance” (though equally he might, for PR/paternalistic reasons).
Where does this 10% probability come from?
Anchoring from my butt-number?
I believe the correct term is “ass-pull number.” :)