My biggest problem is more that he talks about it, sometimes in semiofficial
channels. This doesn’t mean that I wouldn’t be squicked out if I learned about
it, but I wouldn’t see it as a political problem for the SIAI.
The SIAI isn’t some random research think tank: it presents itself as the
charity with the highest utility per marginal dollar. Likewise, Eliezer
Yudkowsky isn’t some random anonymous researcher: he is the public face of the
SIAI. His actions and public behavior reflect on the SIAI whether or not it’s
fair, and everyone involved should have already had that as a strongly held
prior.
If people ignore lesswrong or don’t donate to the SIAI because they’re filtered
out by squickish feelings, then the SIAI’s mission has fewer resources, in
return for inconsequential short-term gains realized mostly by SIAI insiders.
Compound this with the fact that talking about the singularity already triggers
some people’s absurdity bias; there need to be as few other filters as possible
to maximize the usable resources the SIAI has for maximizing the chance of
positive singularity outcomes.
It seems there are two problems: you trust the SIAI less, and you worry that others will trust it less. I understand the reason for the second worry, but not the first. Is it that you worry your investment will become worth less because others won’t want to fund the SIAI?
That talk was very strong evidence that the SI is incompetent at PR and, furthermore, irrational. Edit: or doesn’t actually hold its stated goals and beliefs. If you believe the donations are important for saving your life (along with everyone else’s), then you naturally try to avoid making such statements. Though I do, in some way, admire straight-up, in-your-face honesty.