Negative reactions to Yudkowsky from various people (academics concerned with x-risk), just within the past few weeks:
I also have an extreme distaste for Eliezer Yudkowsky, and so I have a hard time forcing myself to cooperate with any organization that he is included in, but that is a personal matter.
You know, maybe I’m not all that interested in any sort of relationship with SIAI after all if this, and Yudkowsky, are the best you have to offer.
...
There are certainly many reasons to doubt the belief system of a cult based around the haphazard musings of a high school dropout, who has never written a single computer program but professes to be an expert on AI. As you point out, none of the real AI experts are crying Chicken Little, and only a handful of AI researchers, cognitive scientists or philosophers take the FAI idea seriously.
…
Wow, that’s an incredibly arrogant put-down by Eliezer... SIAI won’t win many friends if he puts things like that...
...
...he seems to have lost his mind and written out of strong feelings. I disagree with him on most of these matters.
…
Questions of priority—and the relative intensity of suffering between members of different species—need to be distinguished from the question of whether other sentient beings have moral status at all. I guess that was what shocked me about Eliezer’s bald assertion that frogs have no moral status. After all, humans may be less sentient than frogs compared to our posthuman successors. So it’s unsettling to think that posthumans might give simple-minded humans the same level of moral consideration that Eliezer accords frogs.
I was told that the quotes above state some ad hominem falsehoods regarding Eliezer. I think it is appropriate to edit the message to show that some of these people may indeed not have been honest, or clueful. Otherwise I’ll unnecessarily end up perpetuating possible ad hominem attacks.
Utterly false; I wrote my first one at age 5 or 6, in BASIC on a ZX-81 with 4K of RAM.
The fact that a lot of these reactions are based on false info is worth noting. It doesn’t defeat any arguments directly, but it does show that the naive model, in which every reaction follows from direct perception of actions I directly control, is false.
I don’t like to, but if necessary I can provide the identities of the people who stated the above. They all work directly to reduce x-risks. I won’t do so in public, however.
The identity of these people is not the issue. The percentage of people in a given category who have negative reactions for a given reason, negative reactions for other reasons, or positive reactions would be useful, but not a bunch of soldier-arguments filtered in some unknown way.
I know. However, I just wanted to highlight that there are negative reactions, including some not-so-negative critique. If you look further, you’ll probably find more. I haven’t saved everything I saw over the years; I just wanted to show that it’s not as if nobody has a problem with EY. And on every occasion I actually defended him, by the way.
The context is also difficult to provide, as some of it is from private e-mails. The first one, however, is from here, and after thinking about it I can also provide the name, since he was telling this to Michael Anissimov anyway. It is from Sean Hays:
Sean A Hays PhD
Post Doctoral Fellow, Center for Nanotechnology in Society at ASU
Research Associate, ASU-NAF-Slate Magazine “Future Tense” Initiative
Program Director, IEET Securing the Future Program
That sounds like a pretty rare device! Most ZX81 models had either 1K or 16K of RAM. 32 KB and 64 KB expansion packs were eventually released too.
Sent you a PM on who said that.
Is it likely that someone who’s doing interesting work that’s publicly available wouldn’t attract some hostility?
That N negative reactions to issue S exist only means that issue S is sufficiently popular.
Not if the polling is of folk in a position to have had contact with S, or is representative.
You have a ‘nasty things people say about Eliezer’ quotes file?
The last one was from David Pearce.
Sure, but XiXiDu’s quotes bear no such framing.
This seems a rather minor objection.
But frogs are CUTE!
And existential risks are boring, and only interest Sci-Fi nerds.