I’m reading a series of blogposts called Extropia’s Children about the Extropians mailing list and its effects on the rationalist and EA communities. It seems quite good although a bit negative at times.
In my opinion, it is clickbait, but I didn’t notice any falsehoods (I didn’t check carefully, just skimmed).
For people familiar with the rationalist community, it is a good reminder of bad things that happened. For people unfamiliar with the community… it will probably make them believe that the rationalist community consists mostly of Leverage, neoreaction, and Zizians.
Seems reasonable. Though I will note that bad things that happened are a significant fraction of the community early on, so people who read sections 1-3 with the reminder that it’s a focus on the negative will probably not get the wrong idea.
I did notice a misleading section which might indicate there are several more:
MIRI tends to portray [TDT/UDT/FDT] as technical, practical, immediately applicable work. Others, even EA/rationalists, describe it as “philosophical decision theory” which “has gained significantly fewer advocates among professional philosophers than I’d expect it to if it were very promising.”
Both sentences are wrong. I don’t think MIRI portrays any agent foundations work as “practical, immediately applicable work”; in fact, the linked post by Demski lists some basic open theoretical problems.
The quote by Daniel Dewey is taken out of context: he investigated MIRI’s decision theory work in depth, including by talking to professional philosophers, and found it promising. So his remark that it “has gained significantly fewer advocates among professional philosophers than I’d expect it to if it were very promising” is an observation that professional philosophers won’t automatically advocate for a promising idea, not a claim that the work is unpromising.
bad things that happened are a significant fraction of the community early on
I’d like to read an impartial account, which would specify how large each fraction actually was.
For instance, if I remember correctly, in some survey 2% of Less Wrong readers identified as neoreactionaries. From some perspective, 2% is too much, because the only acceptable number is 0%. From a different perspective, 2% is less than the Lizardman’s Constant. Also, if I remember correctly, a much larger fraction of LessWrong readership identified on the survey as communist, and yet for some reason there are no people writing blogs or Wikipedia articles about how Less Wrong is a communist website. Or a socialist website. Or a Democrat website. Or… whatever else was in the poll.
The section on Zizians is weird, because it correctly starts by saying that the Zizians opposed MIRI and CFAR… and yet concludes that this is evidence that people attracted to rationalism are disproportionately prone to death spirals off the deep end. Notice the sleight of hand: “people attracted to you” technically includes your enemies who can’t stop thinking about you. Using the same rhetorical trick, the Westboro Baptist Church is evidence that people attracted to (the topic of) homosexuality are often crazy. Also, by the same logic, every celebrity is responsible for her stalkers.
There are cases where the rationalist community actually promoted harmful people and groups, such as Vassar or Leverage. I’d like to read a serious analysis of how and why that happened, and how to prevent something like that in the future. But if another Ziz appears in the future and starts recruiting people into another crazy cult opposed to rationalists, I am not sure how exactly to prevent that.