Thought experiment
If the SIAI were a group of self-interested/self-deceiving individuals, similar to New Age groups, who had made up all this stuff about rationality and FAI as a cover for fundraising, what different observations would we expect?
I would expect them to:
1- Never hire anybody or hire only very rarely
2- Not release information about their finances
3- Avoid high-profile individuals or events
4- Laud their accomplishments a lot without producing concrete results
5- Charge large amounts of money for classes/training
6- Censor dissent on official areas, refuse to even think about the possibility of being a cult, etc.
7- Not produce useful results
SIAI does not appear to fit 1 (I’m not sure what the standard is here), certainly does not fit 2 or 3, debatably fits 4, and certainly does not fit 5 or 6. 7 is highly debatable, but I would argue that the Sequences and other rationality material are clearly valuable, if somewhat abstruse.
That goes for self-interested individuals with high rationality, purely material goals, and very low self-deception. The self-deceived case, on the other hand, covers people whose self-interest includes ‘feeling important’, ‘believing oneself to be awesome’, and perhaps even ‘taking a shot at becoming the saviour of mankind’. In that case you should expect them to see awesomeness in anything that might possibly be awesome (various philosophy, various confused texts that might be becoming mainstream for all we know, you get the idea), combined with an absence of anything that is definitely awesome and can’t be trivial (a new algorithmic solution to a long-standing, well-known problem that others have worked on, something of real practical importance, etc.).
I wouldn’t have expected them to hire Luke. If Luke had been a member all along, and everything had been planned just to make them look more convincing, that would imply a level of competence at such things from which I’d expect all-round better execution (which would have helped more than the slight gain in believability from faking a lower level of PR and similar competence).
I would not expect their brand of rationality to work in my own life. Which it does.
What evidence have you? Lots of New Age practitioners claim that New Age practices work for them. Scientology does not allow members to claim levels of advancement until they attest to “wins”.
For my part, the single biggest influence that “their brand of rationality” (i.e. the Sequences) has had on me may very well be that I now know how to effectively disengage from dictionary arguments.
Even if certain rationality techniques are effective, that’s separate from the claims about the rest of the organisation, similar to the way the early-level Scientology classes may be useful social hacks while the overall structure is less so.
They are? Do you have a reference? I thought they were weird nonsense about pointing to things and repeating pairs of words and starting at corners of rooms and so on.
Markedly increased general satisfaction in life, better success at relationships, both intimate and otherwise, noticing systematic errors in thinking, etc.
I haven’t bothered to collect actual data (which wouldn’t do much good, since I don’t have pre-LW data anyway), but I am at least twice as happy with my life as I have been in previous years.
This is the core issue with rationality at present. Until and unless some intrepid self-data-collectors track their personal lives post-Sequences, all we have is a collection of smart people who post nice anecdotes. I admit that, like you, I didn’t have the presence of mind to start collecting data, as I can’t keep a diary current. But without real data we will have continued trouble convincing people that this works.
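For what it’s worth, the barrier to entry for this kind of self-tracking is low. A minimal sketch (the 1–10 scale and the CSV layout are my own illustrative assumptions, not anything prescribed in this thread) is just appending one dated rating per day to a file:

```python
import csv
import datetime
import pathlib


def log_entry(path, score, note=""):
    """Append one dated life-satisfaction rating (1-10) to a CSV log."""
    if not 1 <= score <= 10:
        raise ValueError("score must be between 1 and 10")
    path = pathlib.Path(path)
    write_header = not path.exists()  # write the column names only once
    with path.open("a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["date", "score", "note"])
        writer.writerow([datetime.date.today().isoformat(), score, note])
```

A year of such entries would at least turn “I am at least twice as happy” into something plottable, even without a pre-LW baseline.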
I was thinking the other day that I desperately wished I had written down my cached thoughts (and more importantly, cached feelings) about things like cryonics (in particular), politics, or [insert LW topic of choice here] before reading LW so that I could compare them now. I don’t think I had ever really thought about cryonics, or if I had, I had a node linking it to crazy people.
Actually, now that I think about it, that’s not true. I remember thinking about it once when I first started in research, when we were unfreezing lab samples, and considering whether or not cryonicists have a point. I don’t remember what I felt about it, though.
One of the useful things about the internet is its record-keeping abilities, combined with humans’ natural tendency to comment on things they know nothing about. Are you aware of being on record on a forum or social-media site, pre-LW, on issues that LW has dealt with?
Useful and harmful. ;-)
Yes, to an extent. I’ve had Facebook for about six years (I found HPMOR about 8 months ago, and LW about 7?), but I deleted the majority of easily accessible content and do not post anything particularly introspective on there. I know, generally, how I felt about more culturally popular memes; what I really wish I remembered, though, are my views on things like cryonics or the singularity, to which I never gave serious consideration before LW.
Edit: At one point, I wrote a program to click the “Older posts” button on Facebook so I could go back and read all of my old posts, but it’s been made largely obsolete by the timeline feature.
It’s probably a bit late for many attitudes of mine, but I have made a stab at this by keeping copies of all my YourMorals.org answers and listing other psychometric data at http://www.gwern.net/Links#profile
(And I’ve retrospectively listed in an essay the big shifts that I can remember; hopefully I can keep it up to date and obtain a fairly complete list over my life.)
IIRC, wasn’t a bunch of data-collection done for the Bootcamp attendees, which was aimed at resolving precisely that issue?