This made me chuckle. I suppose that as the intelligence and knowledge of the average member of an intellectual group goes up, that member's lower bound on how much knowledge someone must have before an intellectual conversation is possible goes up as well.
I’m horrible at communicating clearly, so I’ll give an example.
4chan poster: You’re a scientologist!? Idiot.
RationalWiki member: You’re a creationist?! I refuse to speak to you.
Less Wrong member: You insist that there is such a thing as wave-function collapse in quantum mechanics?! I see you cannot be saved.
I’m slowly waking up to the fact that people at the Singularity Institute and on Less Wrong treat existential risk as a Real Problem, not just a theoretical idea to play with academically. I’ve read many essays and watched many videos, but the seriousness never really hit me. For some reason I had never realized that people were actually working on these problems.
I’m an 18-year-old recent high school dropout, about to nab my GED. I could go to community college, or I could go along with my plan of leading a simple life working a simple job, which I would be content doing. I’m something of a tabula rasa here: if I wanted to get into a position where I would be of use to the SIAI, what skills should I develop? Which of the ‘What we’re looking for’ traits would be most useful in a few years? (The only thing I’m good at right now is reading very quickly and retaining large amounts of information about various fields; but I rarely understand the math, which is currently very limiting.)