(As an aside, I agree with Colin Powell that whether or not Obama is a Muslim has no bearing on whether he’s fit to be president.)
Does whether Eliezer is over-confident or not have any bearing on whether he’s fit to work on FAI?
I believe that he’s also had (needless) negative effects on existential risk on account of making strong claims with insufficient evidence. See especially my responses to komponisto’s comment. I may be wrong about this.
From the comment:
My claim is that on average Eliezer’s outlandish claims repel people from thinking about existential risk.
The claim is not credible. I’ve seen a few examples given, but with no way to determine whether the people “repelled” would ever have been open to mitigating existential risk in the first place. I suspect anyone who actually cares about existential risk wouldn’t dismiss an idea out of hand just because a well-known person working to reduce risk thinks his own work is very valuable. It is unlikely to be their true rejection.
In any case, I would again emphasize that my most recent posts should not be interpreted as personal attacks on Eliezer.
The latest post made this clear, and cheers for that. But the previous ones read as attacks on Eliezer. It’s hard to write a diatribe describing someone as a cult leader who is increasing existential risk and would do best to shut up, and then expect it not to be interpreted as a personal attack.
But the fact that you seem interested in promoting Eliezer and SIAI independently of whether doing so benefits broader society has led me to greatly discount your claims and suggestions which relate to Eliezer and SIAI.
Fair enough; I can’t blame you for that. I’m happy with my enthusiasm.
Does whether Eliezer is over-confident or not have any bearing on whether he’s fit to work on FAI?
Oh, I don’t think so; see my response to Eliezer here.
The claim is not credible. I’ve seen a few examples given, but with no way to determine whether the people “repelled” would ever have been open to mitigating existential risk in the first place. I suspect anyone who actually cares about existential risk wouldn’t dismiss an idea out of hand just because a well-known person working to reduce risk thinks his own work is very valuable. It is unlikely to be their true rejection.
Yes, so here it seems there’s enough ambiguity in how the publicly available data should be interpreted that we may have a legitimate difference of opinion on account of having had different experiences. As Scott Aaronson mentioned in the Bloggingheads conversation, humans store their information in a form (largely subconscious) that is not readily exchanged.
All I would add to what I’ve said is that if you haven’t already done so, see the responses to michaelkeenan’s comment here (in particular those by myself, bentarm and wedrifid).
If you remain unconvinced, we can agree to disagree without hard feelings :-)